About half a year ago, I organized all my deep learning-related videos in a handy blog post to have everything in one place.

Since many people liked this post, and because I like to use my winter break to get organized, I thought I could free two birds with one key by compiling this list below.

Here you'll find a list of approximately 90 machine learning videos I recorded in 2020 and 2021! Once again, I hope this is useful to you!

PS: Of course, all code examples are in Python :)

Part 1: Introduction

L01 - Course overview, introduction to machine learning

Videos Material
1 ๐ŸŽฅ 1.1 Course overview (30:41) ๐Ÿ“ Slides

๐Ÿ“ Notes
2 ๐ŸŽฅ 1.2 What is Machine Learning (20:13)
3 ๐ŸŽฅ 1.3 Categories of Machine Learning (15:08)
4 ๐ŸŽฅ 1.4 Notation (30:07)
5 🎥 1.5 ML applications (16:25)
6 ๐ŸŽฅ 1.6 ML motivation (33:07)

L02 - Introduction to Supervised Learning and k-Nearest Neighbors Classifiers

Videos Material
1 ๐ŸŽฅ 2.1 Introduction to NN (21:00) ๐Ÿ“ Slides

๐Ÿ“ Notes
2 ๐ŸŽฅ 2.2 Nearest neighbor decision boundary (25:40)
3 ๐ŸŽฅ 2.3 K-nearest neighbors (14:13)
4 ๐ŸŽฅ 2.4 Big O of K-nearest neighbors (38:23)
5 ๐ŸŽฅ 2.5 Improving k-nearest neighbors (26:52)
6 ๐ŸŽฅ 2.6 K-nearest neighbors in Python (50:12) ๐ŸŽฎ 02_knn_demo.ipynb
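If you want a taste of what the k-NN lecture covers before watching, here is a minimal sketch (not the course notebook, just my own illustration) of fitting a k-nearest neighbors classifier with scikit-learn; the dataset choice and hyperparameters are arbitrary:

```python
# Minimal k-NN sketch: a 3-nearest-neighbors classifier on the Iris dataset
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=123, stratify=y)

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)  # k-NN "training" just stores the training set
accuracy = knn.score(X_test, y_test)  # fraction of correctly classified test points
```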

Part 2: Computational foundations

L03 - Using Python

Videos Material
1 ๐ŸŽฅ 3.1 (Optional) Python overview (22:57) ๐Ÿ“ Notes
2 ๐ŸŽฅ 3.2 (Optional) Python setup (19:21)
3 ๐ŸŽฅ 3.3 (Optional) Running Python code (32:00)

L04 - Introduction to Pythonโ€™s scientific computing stack

Videos Material
1 ๐ŸŽฅ 4.1 Intro to NumPy (31:42) ๐ŸŽฎ 04_scipython__code.ipynb
2 ๐ŸŽฅ 4.2 NumPy Array Construction and Indexing (16:09)
3 ๐ŸŽฅ 4.3 NumPy Array Math and Universal Functions (24:55)
4 ๐ŸŽฅ 4.4 NumPy Broadcasting (4:38)
5 ๐ŸŽฅ 4.5 NumPy Advanced Indexing โ€“ Memory Views & Copies (15:15)
6 ๐ŸŽฅ 4.6 NumPy Random Number Generators (12:39)
7 ๐ŸŽฅ 4.7 Reshaping NumPy Arrays (10:45)
8 ๐ŸŽฅ 4.8 NumPy Comparison Operators and Masks (9:13)
9 ๐ŸŽฅ 4.9 NumPy Linear Algebra Basics (11:46)
10 ๐ŸŽฅ 4.10 Matplotlib (19:47)
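As a small preview of the NumPy topics above (this is my own illustrative snippet, not material from the lecture notebooks), broadcasting lets you combine arrays of different shapes without explicit loops, for example, centering the columns of a matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])    # shape (2, 3)

col_means = A.mean(axis=0)      # shape (3,)
centered = A - col_means        # broadcasting stretches (3,) across both rows
```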

L05 - Data preprocessing and machine learning with scikit-learn

Videos Material
1 ๐ŸŽฅ 5.1 Reading a Dataset from a Tabular Text File (24:11) ๐Ÿ“ Slides

๐ŸŽฎ 05-preprocessing-and-sklearn__notes.ipynb
2 ๐ŸŽฅ 5.2 Basic data handling (30:27)
3 ๐ŸŽฅ 5.3 Object Oriented Programming & Python Classes (21:47)
4 ๐ŸŽฅ 5.4 Intro to Scikit-learn (12:19)
5 ๐ŸŽฅ 5.5 Scikit-learn Transformer API (47:01)
6 ๐ŸŽฅ 5.6 Scikit-learn Pipelines (26:16)
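To give a flavor of the pipeline topic (a minimal sketch of my own, with an arbitrary dataset and classifier, not the course notebook): a scikit-learn pipeline chains preprocessing and model fitting so that the preprocessing is learned from the training data only and reapplied consistently at prediction time:

```python
from sklearn.datasets import load_wine
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

# The scaler is fit on the training data only; the pipeline applies it
# automatically before the classifier at both fit and predict time.
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
pipe.fit(X_train, y_train)
test_acc = pipe.score(X_test, y_test)
```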

Part 3: Tree-based methods

L06 - Decision trees

Videos Material
1 ๐ŸŽฅ 6.1 Intro to Decision Trees (25:04) ๐Ÿ“ Slides

๐Ÿ“ Notes
2 ๐ŸŽฅ 6.2 Recursive algorithms & Big-O (38:19)
3 ๐ŸŽฅ 6.3 Types of decision trees (27:34)
4 ๐ŸŽฅ 6.4 Splitting criteria (47:53)
5 ๐ŸŽฅ 6.5 Gini & Entropy versus misclassification error (21:02)
6 ๐ŸŽฅ 6.6 Improvements & dealing with overfitting (33:11)
7 ๐ŸŽฅ 6.7 Code Example (18:44) ๐ŸŽฎ 06-trees_demo.ipynb
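As a quick illustration of the decision tree topics above (my own sketch with arbitrary settings, not the lecture demo notebook), scikit-learn's `DecisionTreeClassifier` exposes the splitting criterion directly, and limiting `max_depth` is one of the simple overfitting guards discussed in 6.6:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# A shallow tree using the Gini impurity as splitting criterion;
# capping max_depth keeps the tree from memorizing the training set
tree = DecisionTreeClassifier(max_depth=3, criterion="gini", random_state=1)
tree.fit(X, y)
train_acc = tree.score(X, y)
depth = tree.get_depth()
```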

L07 - Ensemble methods

Videos Material
1 ๐ŸŽฅ 7.1 Intro to ensemble methods (15:06) ๐Ÿ“ Slides

๐Ÿ“ Notes

๐ŸŽฎ 07_code-from-slides.ipynb
2 ๐ŸŽฅ 7.2 Majority Voting (23:31)
3 ๐ŸŽฅ 7.3 Bagging (37:45)
4 ๐ŸŽฅ 7.4 Boosting and AdaBoost (39:39)
5 ๐ŸŽฅ 7.5 Gradient Boosting (1:04:04)
6 ๐ŸŽฅ 7.6 Random Forests (32:28)
7 ๐ŸŽฅ 7.7 Stacking (34:12)
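To preview the majority voting idea from 7.2 (again, a minimal sketch of my own with arbitrary base classifiers, not the slide code), scikit-learn's `VotingClassifier` combines the hard-label votes of several different models:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Hard majority voting over three different base classifiers
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=5000)),
        ("tree", DecisionTreeClassifier(random_state=1)),
        ("rf", RandomForestClassifier(random_state=1)),
    ],
    voting="hard",
)
scores = cross_val_score(ensemble, X, y, cv=5)  # 5-fold CV accuracy
mean_acc = scores.mean()
```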

Part 4: Model evaluation

L08 - Model evaluation 1 โ€“ overfitting

Videos Material
1 ๐ŸŽฅ 8.1 Intro to overfitting and underfitting (21:16) ๐Ÿ“ Slides

๐Ÿ“ Notes
2 ๐ŸŽฅ 8.2 Intuition behind bias and variance (15:34)
3 ๐ŸŽฅ 8.3 Bias-Variance Decomposition of the Squared Error (30:50)
4 ๐ŸŽฅ 8.4 Bias and Variance vs Overfitting and Underfitting (7:22)
5 ๐ŸŽฅ 8.5 Bias-Variance Decomposition of the 0/1 Loss (23:21)
6 ๐ŸŽฅ 8.6 Different Uses of the Term โ€œBiasโ€ (17:47)

L09 - Model evaluation 2 โ€“ confidence intervals

Videos Material
1 ๐ŸŽฅ 9.1 Introduction (21:22) ๐Ÿ“ Slides

๐Ÿ“ Notes
2 ๐ŸŽฅ 9.2 Holdout Evaluation (28:59) ๐ŸŽฎ 09-eval2-ci__1_distribution-and-subsampling.ipynb
3 ๐ŸŽฅ 9.3 Holdout Model Selection (7:13)
4 ๐ŸŽฅ 9.4 ML Confidence Intervals via Normal Approximation (16:17)
5 ๐ŸŽฅ 9.5 Resampling and Repeated Holdout (19:27) ๐ŸŽฎ 09-eval2-ci__2_holdout-and-repeated-sampling.ipynb

🎮 09-eval2-ci__3_pessimistic-bias-in-holdout.ipynb
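(placeholder – replaced below)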
6 ๐ŸŽฅ 9.6 Bootstrap Confidence Intervals (28:32) ๐ŸŽฎ 09-eval2-ci__4-confidence-intervals_iris.ipynb

๐ŸŽฎ 09-eval2-ci__4-confidence-intervals_mnist.ipynb
7 ๐ŸŽฅ 9.7 The .632 and .632+ Bootstrap methods (29:16)
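As a tiny preview of the bootstrap confidence interval idea from 9.6 (my own sketch on simulated data, not the course notebook), the percentile bootstrap resamples the test set with replacement and recomputes the accuracy each time:

```python
import numpy as np

rng = np.random.default_rng(123)

# Simulated per-example 0/1 correctness of a classifier on 200 test points
correct = rng.binomial(1, 0.85, size=200)

# Percentile bootstrap: resample the test set with replacement and
# recompute the accuracy on each bootstrap sample
boot_accs = [rng.choice(correct, size=correct.size, replace=True).mean()
             for _ in range(1000)]
lower, upper = np.percentile(boot_accs, [2.5, 97.5])  # 95% confidence interval
```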

L10 - Model evaluation 3 โ€“ cross-validation and model selection

Videos Material
1 ๐ŸŽฅ 10.1 Cross-validation lecture overview (11:16) ๐Ÿ“ Slides

๐Ÿ“ Notes
2 ๐ŸŽฅ 10.2 Hyperparameters (17:50)
3 ๐ŸŽฅ 10.3 k-fold CV for model evaluation (27:40)
4 ๐ŸŽฅ 10.4 k-fold CV for model eval. code examples (21:13) ๐ŸŽฎ 10_04_kfold-eval.ipynb
5 ๐ŸŽฅ 10.5 k-fold CV for model selection (17:27)
6 🎥 10.6 k-fold CV for model selection code examples (25:14) 🎮 10_06_kfold-sele.ipynb
7 ๐ŸŽฅ 10.7 k-fold CV 1-standard error method (12:26)
8 ๐ŸŽฅ 10.8 k-fold CV 1-standard error method code example (9:10) ๐ŸŽฎ 10_08_1stderr.ipynb
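To sketch the model selection use case from 10.5 (my own minimal illustration with arbitrary candidate values, not the course notebooks), k-fold CV lets you compare hyperparameter settings on the same data splits and pick the best-performing one:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# 10-fold CV accuracy for several candidate k values
results = {}
for k in (1, 3, 5, 7, 9):
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=10)
    results[k] = scores.mean()

best_k = max(results, key=results.get)  # candidate with highest mean CV accuracy
```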

L11 - Model evaluation 4 โ€“ algorithm selection

Videos Material
1 ๐ŸŽฅ 11.1 Lecture Overview (12:37) ๐Ÿ“ Slides

๐Ÿ“ Note
2 ๐ŸŽฅ 11.2 McNemarโ€™s Test for Pairwise Classifier Comparison (20:45)
3 ๐ŸŽฅ 11.3 Multiple Pairwise Comparisons (7:12)
4 ๐ŸŽฅ 11.4 Statistical Tests for Algorithm Comparison (8:15)
5 ๐ŸŽฅ 11.5 Nested CV for Algorithm Selection (17:19)
6 ๐ŸŽฅ 11.6 Nested CV for Algorithm Selection Code Example (24:34) ๐ŸŽฎ 11-eval4-algo__nested-cv_compact.ipynb

๐ŸŽฎ 11-eval4-algo__nested-cv_verbose1.ipynb

๐ŸŽฎ 11-eval4-algo__nested-cv_verbose2.ipynb

L12 - Model evaluation 5 โ€“ evaluation and performance metrics

Videos Material
1 ๐ŸŽฅ 12.0 Lecture Overview (7:56) ๐Ÿ“ Slides
2 ๐ŸŽฅ 12.1 Confusion Matrix (28:08) ๐ŸŽฎ 12_1_confusion-matrix.ipynb
3 ๐ŸŽฅ 12.2 Precision, Recall, and F1 Score (11:47) ๐ŸŽฎ 12_2_pre-recall-f1.ipynb
4 ๐ŸŽฅ 12.3 Balanced Accuracy (9:38) ๐ŸŽฎ 12_3_balanced-acc.ipynb
5 ๐ŸŽฅ 12.4 Receiver Operating Characteristic (18:37) ๐ŸŽฎ 12_4_roc.ipynb
6 ๐ŸŽฅ 12.5 Extending Binary Metric to Multiclass Problems (21:11)
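As a preview of the metrics in 12.1 and 12.2 (my own toy example, not the lecture notebooks), the confusion matrix and the precision/recall/F1 scores come straight out of `sklearn.metrics`:

```python
from sklearn.metrics import (confusion_matrix, f1_score,
                             precision_score, recall_score)

# Toy binary labels: 4 negatives, 6 positives
y_true = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
y_pred = [0, 0, 0, 1, 0, 1, 1, 1, 1, 1]

cm = confusion_matrix(y_true, y_pred)        # rows: true class, cols: predicted
precision = precision_score(y_true, y_pred)  # TP / (TP + FP) = 5/6
recall = recall_score(y_true, y_pred)        # TP / (TP + FN) = 5/6
f1 = f1_score(y_true, y_pred)                # harmonic mean of the two
```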

Part 5: Dimensionality reduction

L13 - Feature selection

Videos Material
1 ๐ŸŽฅ 13.0 Introduction to Feature Selection (16:09) ๐Ÿ“ Slides
2 ๐ŸŽฅ 13.1 The Different Categories of Feature Selection (11:38)
3 ๐ŸŽฅ 13.2 Filter Methods for Feature Selection โ€“ Variance Threshold (19:52) ๐ŸŽฎ 01_variance-threshold.ipynb
4 ๐ŸŽฅ 13.3.1 L1-regularized Logistic Regression as Embedded Feature Selection (23:32) ๐ŸŽฎ 02_lasso-path.ipynb
5 ๐ŸŽฅ 13.3.2 Decision Trees & Random Forest Feature Importance (39:42) ๐ŸŽฎ 03_random-forest.ipynb
6 ๐ŸŽฅ 13.4.1 Recursive Feature Elimination (28:51) ๐ŸŽฎ 04_recursive-feature-elimination.ipynb
7 ๐ŸŽฅ 13.4.2 Feature Permutation Importance (16:55)
8 ๐ŸŽฅ 13.4.3 Permutation importance code example (27:37) ๐ŸŽฎ 05_permutation-importance.ipynb

๐ŸŽฎ 06_random_feat_as_control.ipynb

๐ŸŽฎ 07_perm-imp-with-correlated-feats.ipynb

๐ŸŽฎ 08_sequential-feature-selection.ipynb
9 ๐ŸŽฅ 13.4.4 Sequential feature selection (29:59)
10 ๐ŸŽฅ 13.4.5 Sequential feature selection code example (23:35) ๐ŸŽฎ 08_sequential-feature-selection.ipynb
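To preview the variance threshold filter from 13.2 (a minimal sketch of my own on toy data, not the course notebook), scikit-learn's `VarianceThreshold` simply drops features whose training-set variance does not exceed a cutoff, here, a constant column:

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold

X = np.array([[1.0, 0.0, 3.0],
              [2.0, 0.0, 1.0],
              [3.0, 0.0, 2.0]])  # the middle column is constant

selector = VarianceThreshold(threshold=0.0)  # drop zero-variance features
X_reduced = selector.fit_transform(X)        # keeps columns 0 and 2
```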

L14 - Feature extraction

TBD: I am planning to add more videos when time permits. You can subscribe to my YouTube channel to get notified.

Part 6: Bayesian methods

L15 - Introduction to Bayesian methods for machine learning

TBD

L16 - Applying naive Bayes

TBD