Lectures
You can download the lectures here. We will try to upload the annotated classroom slides after the corresponding classes.
-
(foml-01) ML and Learning Paradigms
tl;dr: What machine learning is, and the different learning paradigms.
[slides]
Suggested Reading
-
(foml-02) Probability Refresher - 1
tl;dr: Probability, sum rule, product rule, random variable, distribution, marginal, conditional, Bayes rule.
[slides] [annotated-slides]
Suggested Reading
-
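A minimal numeric illustration of Bayes' rule from the probability refresher (the disease-test numbers below are my own illustrative choices, not from the slides):

```python
# Bayes' rule: P(D|+) = P(+|D) P(D) / P(+)
p_d = 0.01          # prior P(disease)
p_pos_d = 0.95      # sensitivity, P(+ | disease)
p_pos_nd = 0.05     # false-positive rate, P(+ | no disease)

# Marginal P(+) via the sum and product rules
p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)

# Posterior via Bayes' rule
p_d_pos = p_pos_d * p_d / p_pos
print(round(p_d_pos, 3))
```

Even with a fairly accurate test, a rare disease keeps the posterior small, which is why the prior matters.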
(foml-03) Probability Refresher - 2
tl;dr: Expectation and variance of a random variable, and the Gaussian distribution.
[slides] [annotated-slides]
Suggested Reading
-
(foml-04) MLE
tl;dr: Maximum likelihood principle - estimating the most likely explanation of the data.
[slides] [annotated-slides]
Suggested Reading
-
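A quick sketch of the maximum-likelihood idea for a Gaussian, where the MLE has a closed form (synthetic data and parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=10_000)  # synthetic samples

# Closed-form Gaussian MLE: sample mean and (biased) sample variance
mu_mle = data.mean()
var_mle = ((data - mu_mle) ** 2).mean()
```

With enough data, the estimates recover the generating parameters (mean 2.0, variance 2.25) closely.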
(foml-05) MAP and Fully Bayesian Treatment
tl;dr: Maximum a posteriori estimation and the fully Bayesian treatment of the parameters.
[slides] [annotated-slides]
Suggested Reading
-
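A small sketch of how a prior shrinks the MAP estimate relative to the MLE, for a Gaussian likelihood with known unit variance and a Gaussian prior on the mean (all numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.normal(2.0, 1.0, size=20)    # small sample, known variance 1

mu0, tau2 = 0.0, 1.0                    # Gaussian prior on the mean: N(mu0, tau2)
n = len(data)

# Conjugate posterior mean = MAP estimate (with sigma^2 = 1):
# mu_map = (tau2 * sum(x) + sigma^2 * mu0) / (n * tau2 + sigma^2)
mu_map = (tau2 * data.sum() + 1.0 * mu0) / (n * tau2 + 1.0)
mu_mle = data.mean()
```

The MAP estimate sits between the prior mean and the MLE, and approaches the MLE as `n` grows.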
(foml-06) Linear Regression with Basis functions
tl;dr: First supervised learning model for regression - weighted sum of basis functions.
[slides] [annotated-slides]
Suggested Reading
-
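A minimal sketch of linear regression with basis functions, here using a polynomial basis on synthetic data (basis choice, degree, and noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, size=200)
t = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=200)  # noisy targets

# Polynomial basis functions phi_j(x) = x**j, j = 0..M
M = 5
Phi = np.vander(x, M + 1, increasing=True)   # design matrix, N x (M+1)

# Least-squares solution: w minimizes ||Phi w - t||^2
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)
pred = Phi @ w
```

The model is still linear in the weights `w`, even though it is nonlinear in the input `x`.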
(foml-07) Geometric Interpretation for Least Squares
tl;dr: Projection of the targets onto the space spanned by the basis functions.
[slides] [annotated-slides]
Suggested Reading
-
(foml-08) SGD
tl;dr: Sequential learning of the parameters via the gradient of the loss function.
Suggested Reading
-
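A minimal SGD sketch for least-squares regression, updating the weights one sample at a time (true weights, learning rate, and epoch count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 3))
w_true = np.array([1.0, -2.0, 0.5])
t = X @ w_true + 0.01 * rng.normal(size=500)

w = np.zeros(3)
eta = 0.05                                # learning rate
for epoch in range(20):
    for n in rng.permutation(500):        # visit samples in random order
        err = X[n] @ w - t[n]             # prediction error on one sample
        w -= eta * err * X[n]             # gradient step on the squared loss
```

Each update uses only one sample's gradient, which is what makes the learning sequential.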
(foml-09) Regularized Least Squares
tl;dr: How to address overfitting.
[slides] [annotated-slides]
Suggested Reading
-
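A short sketch of regularized (ridge) least squares, showing how the penalty `lam` shrinks the weights (the data and the helper name `ridge_fit` are my own illustrative choices):

```python
import numpy as np

def ridge_fit(Phi, t, lam):
    """Regularized least squares: w = (lam*I + Phi^T Phi)^{-1} Phi^T t."""
    d = Phi.shape[1]
    return np.linalg.solve(lam * np.eye(d) + Phi.T @ Phi, Phi.T @ t)

rng = np.random.default_rng(3)
Phi = rng.normal(size=(50, 10))
t = Phi[:, 0] + 0.1 * rng.normal(size=50)

w_small = ridge_fit(Phi, t, lam=0.1)
w_large = ridge_fit(Phi, t, lam=100.0)   # stronger shrinkage toward zero
```

Larger `lam` trades a little extra bias for lower variance, which is the mechanism that combats overfitting.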
(foml-10) Bias Variance Decomposition
tl;dr: Breaking down a model's prediction error into components.
[slides] [annotated-slides]
Suggested Reading
-
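A simulation sketch of the bias-variance decomposition: fit a deliberately simple model to many datasets from the same source and decompose its error at one test point (the target function, noise level, and model degree are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(8)

def true_f(x):
    return np.sin(2 * np.pi * x)

x_test = 0.3
fits = []
for _ in range(500):                       # many datasets from the same source
    x = rng.uniform(0, 1, 30)
    t = true_f(x) + 0.2 * rng.normal(size=30)
    coef = np.polyfit(x, t, deg=1)         # deliberately simple (high-bias) model
    fits.append(np.polyval(coef, x_test))

fits = np.array(fits)
bias_sq = (fits.mean() - true_f(x_test)) ** 2   # systematic error of the average fit
variance = fits.var()                           # spread of fits across datasets
```

For this under-fitting linear model the squared bias dominates; a very flexible model would show the opposite pattern.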
(foml-11) Bayesian Regression
tl;dr: Model averaging in the parameter space.
[slides] [annotated-slides]
Suggested Reading
-
(foml-12) Decision Theory
tl;dr: Strategies to build classifiers.
[slides] [annotated-slides]
Suggested Reading
-
(foml-13) Probabilistic Generative Models
tl;dr: Models that learn the underlying data distribution, not just classify data.
[slides] [annotated-slides]
Suggested Reading
-
(foml-14) Probabilistic Generative Models - discrete features
tl;dr: Extending probabilistic generative models to discrete input features.
[slides] [annotated-slides]
Suggested Reading
-
(foml-15) Discriminant Functions
tl;dr: Linear Classification models that directly map input to target.
[slides] [annotated-slides]
Suggested Reading
-
(foml-16) Discriminative Models - Least Squares Regression
tl;dr: Least Squares for Classification.
[slides] [annotated-slides]
Suggested Reading
-
(foml-17) Discriminative Models - The Perceptron
tl;dr: An ad hoc learning strategy that adds/subtracts misclassified samples to learn the parameters of a linear discriminant.
[slides] [annotated-slides]
Suggested Reading
-
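A minimal perceptron sketch on linearly separable toy data: misclassified samples are added to or subtracted from the weight vector until no mistakes remain (the toy clusters are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
# Linearly separable toy data with targets in {-1, +1}
X = np.vstack([rng.normal([2, 2], 0.5, (50, 2)),
               rng.normal([-2, -2], 0.5, (50, 2))])
T = np.hstack([np.ones(50), -np.ones(50)])
X = np.hstack([X, np.ones((100, 1))])     # absorb the bias term

w = np.zeros(3)
for _ in range(100):                      # sweep until an error-free pass
    mistakes = 0
    for x, t in zip(X, T):
        if t * (w @ x) <= 0:              # misclassified sample
            w += t * x                    # add/subtract the sample
            mistakes += 1
    if mistakes == 0:
        break
```

On separable data the perceptron convergence theorem guarantees this loop terminates with zero mistakes.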
(foml-18) Classification with Basis Functions
tl;dr: Basis functions can help make the data linearly separable.
[slides] [annotated-slides]
Suggested Reading
-
(foml-19) Probabilistic Discriminative Models - Logistic Regression
tl;dr: A parameter-efficient probabilistic model for classification.
[slides] [annotated-slides]
Suggested Reading
-
(foml-20) Logistic Regression - SGD
tl;dr: Gradient descent to learn the parameters of logistic regression.
[slides] [annotated-slides]
Suggested Reading
-
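A minimal sketch of gradient descent on the logistic-regression cross-entropy loss (here full-batch for simplicity; the toy data, learning rate, and iteration count are illustrative assumptions):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(1.5, 1.0, (100, 2)),
               rng.normal(-1.5, 1.0, (100, 2))])
T = np.hstack([np.ones(100), np.zeros(100)])   # targets in {0, 1}

w = np.zeros(2)
eta = 0.1
for _ in range(200):
    y = sigmoid(X @ w)
    w -= eta * X.T @ (y - T) / len(T)   # gradient of the cross-entropy loss
```

The gradient has the same simple error-times-input form as in least squares, which is what makes SGD variants straightforward here too.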
(foml-21) Logistic Regression - Newton Raphson Optimization (IRLS)
tl;dr: A second-order optimization method to learn the parameters of logistic regression; Iteratively Reweighted Least Squares.
[slides] [annotated-slides]
Suggested Reading
-
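A compact Newton-Raphson (IRLS) sketch for logistic regression; the weighting `R` comes from the Hessian X^T R X (the overlapping toy clusters and iteration count are illustrative assumptions):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

rng = np.random.default_rng(9)
X = np.vstack([rng.normal(1.0, 1.0, (100, 2)),
               rng.normal(-1.0, 1.0, (100, 2))])
T = np.hstack([np.ones(100), np.zeros(100)])

w = np.zeros(2)
for _ in range(5):                        # Newton typically converges in a few steps
    y = sigmoid(X @ w)
    R = y * (1 - y)                       # diagonal weighting terms
    grad = X.T @ (y - T)                  # gradient of the cross-entropy loss
    H = X.T @ (X * R[:, None])            # Hessian: X^T R X
    w -= np.linalg.solve(H, grad)         # Newton step
```

Each step solves a weighted least-squares problem, hence the IRLS name; note the weights in `R` change every iteration.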
(foml-22) Neural Networks - I
tl;dr: From fixed basis functions to learning them.
[slides] [annotated-slides]
Suggested Reading
-
(foml-23) Neural Networks - II
tl;dr: UAT; NN configurations and losses for different tasks.
[slides] [annotated-slides]
Suggested Reading
-
(foml-24) Neural Networks - III
tl;dr: Gradient descent applied to neural networks is backpropagation.
[slides] [annotated-slides]
Suggested Reading
-
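A small sketch of backpropagation for a one-hidden-layer network, checked against a numerical derivative (the tiny network and squared-error loss are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(10)
# One-hidden-layer network: y = w2 @ tanh(W1 @ x), squared-error loss
W1 = rng.normal(size=(4, 3))
w2 = rng.normal(size=4)
x = rng.normal(size=3)
t = 1.0

def loss(W1_, w2_):
    return 0.5 * (w2_ @ np.tanh(W1_ @ x) - t) ** 2

# Backpropagation: chain rule through the two layers
h = np.tanh(W1 @ x)
err = w2 @ h - t                               # output-layer error signal
grad_w2 = err * h
grad_W1 = np.outer(err * w2 * (1 - h ** 2), x)  # tanh' = 1 - tanh^2

# Numerical check of one entry of grad_W1
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
num = (loss(W1p, w2) - loss(W1, w2)) / eps
```

Agreement between `num` and `grad_W1[0, 0]` confirms backprop is just the chain rule applied efficiently.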
(foml-25) Unsupervised Learning - 1
tl;dr: Overview of Unsupervised Learning with a focus on K-means clustering.
[slides]
Suggested Reading
- Chapter 14 of Introduction to Statistical Learning textbook by Gareth James et al.
- Chapter 22 of Understanding ML: From Theory to Algorithms book by Shai Shalev-Shwartz et al.
- Chapter 9 of PR and ML book by Christopher M Bishop
- Chapter 7 of Introduction to Machine Learning by Ethem Alpaydın
- Chapter 25 of Machine Learning: a Probabilistic Perspective by Kevin Murphy
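A bare-bones K-means sketch alternating the two steps discussed in the readings above (the helper name `kmeans` and the toy clusters are my own illustrative choices):

```python
import numpy as np

def kmeans(X, k, n_iters=50, seed=0):
    """Plain K-means: alternate assignment and mean-update steps."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]   # init from data points
    for _ in range(n_iters):
        # Assignment step: each point goes to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: each center moves to the mean of its cluster
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(0, 0.3, (100, 2)),
               rng.normal(3, 0.3, (100, 2))])
centers, labels = kmeans(X, k=2)
```

Each iteration can only decrease the within-cluster sum of squares, so the loop converges, though possibly to a local optimum depending on the initialization.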