- Types of ML: Supervised, Unsupervised, Reinforcement
- Pipeline: Data Cleaning → Feature Selection → Modelling → Evaluation
- Bias–Variance Trade-off
- Train/Validation/Test Splits (06:21)
- Cross-Validation
- Linear Regression (with gradient descent math)
- Math Behind the OLS Technique (04:32)
- Assumptions of Linear Regression (23:08)
- Evaluation Metrics: RMSE, MAE, R² Score (15:50)
- Polynomial Regression
- Coding a Linear Regression Model
- Regularisation Techniques (08:56)
- Why Logistic Regression (04:27)
- Math Behind Logistic Regression (17:44)
- Evaluation Metrics for Classification Algorithms (17:10)
- Evaluation: Confusion Matrix, Accuracy, Precision, Recall, F1, AUC-ROC
- Coding a Logistic Regression Model
- Introduction to Decision Trees (06:12)
- Intuition Behind Decision Trees (07:32)
- Math Behind Decision Trees (16:03)
- Math Behind Decision Trees Using Gini (04:30)
- Drawbacks of Decision Trees (05:07)
- Random Forest, Gradient Boosting, XGBoost, LightGBM, CatBoost
- Coding Decision Tree, Random Forest, and Gradient Boosting Models
- SMOTE Technique to Handle Imbalanced Datasets (06:35)
- Feature Selection Techniques: Filter, Wrapper, and Embedded Methods (08:47)
- Coding Feature Selection Techniques
- Math Behind KNN (32:21)
- Math Behind SVM (47:20)
- Plotting SVM
- Introduction to PCA
- Math Behind PCA
- Coding PCA
- AutoML Using PyCaret
- Clustering: K-Means (Euclidean distance), Hierarchical Clustering
- Dimensionality Reduction: PCA (eigendecomposition), t-SNE
- Anomaly Detection
- Hyperparameter Tuning: GridSearchCV, RandomizedSearchCV
- Pipelines with Scikit-learn
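The regression topics above (train/test splits, linear regression, and the RMSE/MAE/R² metrics) can be sketched end to end with scikit-learn. The data below is synthetic and purely illustrative, not part of the course material:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

# Synthetic data: y = 3x + 2 plus Gaussian noise (illustrative assumption)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3 * X.ravel() + 2 + rng.normal(0, 0.5, size=200)

# Hold out a test set, as in the train/validation/test splits lecture
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)

# The three regression metrics listed in the curriculum
rmse = mean_squared_error(y_test, pred) ** 0.5
mae = mean_absolute_error(y_test, pred)
r2 = r2_score(y_test, pred)
print(f"RMSE={rmse:.3f}  MAE={mae:.3f}  R²={r2:.3f}")
```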
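The classification-metrics topics (confusion matrix, accuracy, precision, recall, F1, AUC-ROC) can likewise be sketched with logistic regression; again, the two-class data here is a synthetic stand-in:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import (confusion_matrix, accuracy_score, precision_score,
                             recall_score, f1_score, roc_auc_score)

# Synthetic binary labels from the sign of a noisy linear score (illustrative)
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 2))
y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.3, size=400) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1, stratify=y
)

clf = LogisticRegression().fit(X_train, y_train)
pred = clf.predict(X_test)
proba = clf.predict_proba(X_test)[:, 1]  # scores for AUC-ROC

# The classification metrics listed in the curriculum
print("confusion matrix:\n", confusion_matrix(y_test, pred))
print("accuracy :", accuracy_score(y_test, pred))
print("precision:", precision_score(y_test, pred))
print("recall   :", recall_score(y_test, pred))
print("F1       :", f1_score(y_test, pred))
print("AUC-ROC  :", roc_auc_score(y_test, proba))
```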
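For the tree-based topics, a minimal sketch of a Gini-split decision tree next to a random forest (using the standard Iris dataset, chosen here only for convenience):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=2, stratify=y
)

# criterion="gini" matches the Gini-impurity math lecture;
# max_depth=3 limits the overfitting discussed under "drawbacks"
tree = DecisionTreeClassifier(criterion="gini", max_depth=3,
                              random_state=2).fit(X_train, y_train)

# A random forest averages many such trees to reduce variance
forest = RandomForestClassifier(n_estimators=100,
                                random_state=2).fit(X_train, y_train)

print("tree accuracy  :", tree.score(X_test, y_test))
print("forest accuracy:", forest.score(X_test, y_test))
```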
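And for the PCA topics, a short sketch of fitting PCA to correlated data and reading off the explained variance; the three-feature dataset below is a synthetic assumption, constructed so most variance lies along one direction:

```python
import numpy as np
from sklearn.decomposition import PCA

# Three highly correlated features: one shared signal plus small noise
rng = np.random.default_rng(3)
base = rng.normal(size=(300, 1))
X = np.hstack([base + rng.normal(0, 0.1, size=(300, 1)) for _ in range(3)])

# Project onto the top two principal components
pca = PCA(n_components=2)
X2 = pca.fit_transform(X)

# The first component should capture almost all of the variance
print("explained variance ratios:", pca.explained_variance_ratio_)
print("reduced shape:", X2.shape)
```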