Lectures
You can download the lectures here. We will try to upload lectures prior to their corresponding classes.
-
-
(dl-01) Artificial Neuron Models (MP neuron and Perceptron)
tl;dr: The basic building block of ANNs.
[perceptron-learning-code] [slides]
Suggested Readings:
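As a companion to the lecture (separate from the linked perceptron-learning-code), here is a minimal NumPy sketch of the Perceptron learning rule on made-up, linearly separable toy data:

```python
import numpy as np

# Toy linearly separable data with labels in {-1, +1} (illustrative only)
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])

w = np.zeros(2)  # weights
b = 0.0          # bias

for epoch in range(10):
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
            w += yi * xi             # Perceptron update: w <- w + y * x
            b += yi                  # b <- b + y

print(w, b)  # a separating hyperplane for this toy set
```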
-
(dl-02) Network of perceptrons
tl;dr: If a single Perceptron can't do the job, can a network of Perceptrons?
[slides]
Suggested Readings:
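For intuition, a hand-wired network of two hidden perceptrons plus one output perceptron computes XOR, which no single perceptron can (the weights below are chosen by hand for this sketch, not learned):

```python
def step(z):
    """Threshold activation of a perceptron."""
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)      # hidden perceptron 1: OR
    h2 = step(-x1 - x2 + 1.5)     # hidden perceptron 2: NAND
    return step(h1 + h2 - 1.5)    # output perceptron: AND of the two -> XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # prints 0, 1, 1, 0
```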
-
(dl-03) Representational Power of an MLP
tl;dr: A multi-layered network of neurons can represent any arbitrary function!
[ReLU-activation-code] [sigmoid-activation-codes] [slides]
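One way to see the universality idea (a sketch, not the linked activation code): ReLU units can be combined into localized "bumps", and a sum of bumps reproduces the piecewise-linear interpolant of any target function, here sin(2*pi*x):

```python
import numpy as np

relu = lambda z: np.maximum(0.0, z)

def bump(x, a, b, h):
    """Piecewise-linear tent of height h on [a, b], built from 3 ReLU units."""
    w = 2.0 / (b - a)          # slope so the tent peaks at the midpoint
    m = (a + b) / 2.0
    return h * (relu(w * (x - a)) - 2 * relu(w * (x - m)) + relu(w * (x - b)))

x = np.linspace(0, 1, 200)
grid = np.linspace(0, 1, 21)   # knots spaced 0.05 apart
y_hat = sum(bump(x, c - 0.05, c + 0.05, np.sin(2 * np.pi * c)) for c in grid)
# y_hat is the piecewise-linear interpolant of sin(2*pi*x) at the knots;
# more knots (i.e., more hidden units) give a finer approximation.
```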
-
-
(dl-05) Backpropagation-1
tl;dr: Gradient Descent applied to a simple computational graph
[slides]
Suggested Readings:
Additional Readings:
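A sketch of backprop on a tiny computational graph, f(w, x, b) = sigmoid(w*x + b): each node passes its local derivative times the upstream gradient (the values and learning rate are illustrative):

```python
import numpy as np

w, x, b = 2.0, -1.0, 0.5

# Forward pass through the graph
u = w * x                        # multiply node
v = u + b                        # add node
f = 1.0 / (1.0 + np.exp(-v))     # sigmoid node

# Backward pass: chain rule, node by node
df_dv = f * (1 - f)              # local derivative of sigmoid
df_du = df_dv * 1.0              # add node routes the gradient unchanged
df_db = df_dv * 1.0
df_dw = df_du * x                # multiply node swaps its inputs
df_dx = df_du * w

# One gradient-descent step on the parameters
lr = 0.1
w -= lr * df_dw
b -= lr * df_db
```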
-
(dl-06) Backpropagation-2
tl;dr: Gradient Descent applied to a generic computational graph (DAG)
[slides]
Suggested Readings:
Additional Readings:
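A minimal reverse-mode sketch over a generic DAG (a toy, not a full autograd library): each node records its parents and local gradients, and the backward pass accumulates chain-rule products in reverse topological order, summing over all paths:

```python
class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (parent_node, local_gradient)
        self.grad = 0.0

def add(a, b):
    return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def backward(out):
    order, seen = [], set()      # topological order via depth-first search
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p, _ in n.parents:
                visit(p)
            order.append(n)
    visit(out)
    out.grad = 1.0
    for n in reversed(order):
        for p, local in n.parents:
            p.grad += local * n.grad   # accumulate over all paths in the DAG

# f = (x * y) + x : x feeds two paths, so its gradients add up
x, y = Node(3.0), Node(4.0)
f = add(mul(x, y), x)
backward(f)
print(x.grad, y.grad)   # 5.0 (= y + 1) and 3.0 (= x)
```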
-
(dl-07) Cross-Entropy Loss
tl;dr: The loss function for training NN classifiers
[slides]
Suggested Readings:
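A NumPy sketch of softmax plus cross-entropy for a single example, with the usual max-subtraction for numerical stability (the logits here are made up):

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.1])   # raw scores from the network
target = 0                           # index of the true class

# Numerically stable softmax
z = logits - logits.max()
probs = np.exp(z) / np.exp(z).sum()

# Cross-entropy loss: negative log-probability of the true class
loss = -np.log(probs[target])

# Handy fact: the gradient w.r.t. the logits is probs - one_hot(target)
grad = probs.copy()
grad[target] -= 1.0
print(loss, grad)
```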
-
(dl-08) Training DNNs - I
tl;dr: Issues with Gradient Descent, specifically with the learning rate
[codes - Optimizing 1D Convex function] [codes - Optimizing 2D Quadratic function] [slides]
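A quick illustration of the learning-rate issue on the 1-D convex function f(x) = x^2, whose gradient is 2x (separate from the linked course codes; values are illustrative): too small crawls, too large diverges.

```python
def gd(lr, x0=5.0, steps=20):
    """Run gradient descent on f(x) = x^2 from x0."""
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x   # gradient of x^2 is 2x
    return x

for lr in (0.01, 0.1, 0.5, 1.1):
    print(f"lr={lr}: x after 20 steps = {gd(lr):.4g}")
# 0.01 crawls, 0.1 converges, 0.5 hits the minimum in one step, 1.1 diverges
```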
-
-
(dl-10) Building Blocks of CNNs
tl;dr: The basic constituent elements of Convolutional Neural Networks (CNNs)
[slides] [1D-conv-backprop] [2D-conv-backprop]
Suggested Readings:
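A sketch of the central building block, a 2-D convolution (really cross-correlation, as in most deep-learning libraries), with stride 1 and no padding; this is separate from the linked conv-backprop notebooks:

```python
import numpy as np

def conv2d(x, k):
    """Valid cross-correlation of an HxW image with an fh x fw kernel."""
    H, W = x.shape
    fh, fw = k.shape
    out = np.zeros((H - fh + 1, W - fw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + fh, j:j + fw] * k)  # dot with the patch
    return out

img = np.arange(25, dtype=float).reshape(5, 5)  # a ramp image
k = np.array([[1.0, 0.0, -1.0]])                # horizontal-gradient filter
print(conv2d(img, k))   # constant response: the ramp has a uniform gradient
```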
-
(dl-11) Evolution of CNN Architectures
tl;dr: How different families of architectures evolved over time.
[slides]
Suggested Readings:
-
(dl-12) Recurrent Neural Networks
tl;dr: Deep learning models for sequence prediction tasks
[slides]
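A sketch of a vanilla RNN forward pass: the same weights are applied at every time step, and the hidden state carries context along the sequence (shapes and random values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h, T = 4, 8, 5                    # input size, hidden size, sequence length

W_xh = rng.normal(0, 0.1, (d_h, d_in))    # input-to-hidden weights (shared across steps)
W_hh = rng.normal(0, 0.1, (d_h, d_h))     # hidden-to-hidden (recurrent) weights
b_h = np.zeros(d_h)

xs = rng.normal(size=(T, d_in))           # a toy input sequence
h = np.zeros(d_h)                         # initial hidden state

for x_t in xs:
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)   # h_t = tanh(W_xh x_t + W_hh h_{t-1} + b)

print(h)   # the final state summarizes the whole sequence
```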
-
(dl-13) Word Embeddings
tl;dr: Representing words as vectors for learning to perform language tasks
[slides]
Suggested Readings:
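A sketch of what an embedding layer gives you: each word is a row of a dense matrix, looked up by index, and similarity is measured by cosine. The vectors below are random stand-ins, not trained embeddings:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["king", "queen", "apple", "banana"]
dim = 50
E = rng.normal(size=(len(vocab), dim))    # embedding matrix: one row per word
idx = {w: i for i, w in enumerate(vocab)}

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

v_king = E[idx["king"]]                   # lookup = indexing a row of E
for w in vocab:
    print(w, round(float(cosine(v_king, E[idx[w]])), 3))
# With trained embeddings (e.g., word2vec), related words score higher
```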
-
(dl-14) Encoder-Decoder Models and Attention
tl;dr: Configuring a pair of NNs for sequence learning problems, and learning to selectively focus on the input
[slides]
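A sketch of (dot-product) attention in an encoder-decoder model: the decoder state scores every encoder state, a softmax turns the scores into weights, and the context is their weighted sum. Shapes are illustrative, and the lecture may use additive (Bahdanau-style) scoring instead:

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 8                              # source length, state size
enc = rng.normal(size=(T, d))            # encoder hidden states h_1..h_T
dec = rng.normal(size=d)                 # current decoder state s_t

scores = enc @ dec                       # one alignment score per source position
weights = np.exp(scores - scores.max())
weights /= weights.sum()                 # softmax over source positions

context = weights @ enc                  # weighted sum of encoder states
print(weights.round(3), context.shape)   # where the decoder "looks", and (8,)
```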
-
(dl-15) Self-Attention and Transformers
tl;dr: Attention is all we need!
[slides-I] [slides-II] [GIF-1 (decoding)] [GIF-2 (decoding)]
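A sketch of scaled dot-product self-attention, the core Transformer operation: Q, K, V are linear projections of the same sequence, and every position attends to every other (single head, no mask; all weights random for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 5, 16                               # sequence length, model dimension
X = rng.normal(size=(T, d))                # input token representations

Wq, Wk, Wv = (rng.normal(0, 0.1, (d, d)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv           # queries, keys, values

scores = Q @ K.T / np.sqrt(d)              # scaled dot products, shape (T, T)
A = np.exp(scores - scores.max(axis=-1, keepdims=True))
A /= A.sum(axis=-1, keepdims=True)         # row-wise softmax: attention weights

out = A @ V                                # each row mixes values from all positions
print(out.shape)                           # (5, 16)
```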
-
(dl-16) Applications of Transformers
tl;dr: Different configurations of Transformers that realize a variety of applications.
[slides]
Suggested Readings:
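One concrete way the configurations differ is the attention mask: encoder-style models (BERT-like) attend bidirectionally, while decoder-style models (GPT-like) use a causal mask so each position sees only its past. A sketch of the two masks (conventions vary across libraries):

```python
import numpy as np

T = 5
bidirectional = np.ones((T, T))    # encoder-style: every position sees every position
causal = np.tril(np.ones((T, T)))  # decoder-style: position i sees only j <= i
print(causal)
# In attention, masked-out (zero) entries are set to -inf before the softmax,
# which zeroes their attention weights.
```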
-
-
(dl-18) Generative Models
tl;dr: Tools to model the density over the space of data.
[slides]
Suggested Readings:
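To pin down what "modelling the density" means, here is the simplest possible instance: fit a 1-D Gaussian to data by maximum likelihood, then evaluate the learned density or sample new points from it. Deep generative models replace the Gaussian with far more flexible families:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=1000)   # "observed" samples (toy)

# Maximum-likelihood fit of a Gaussian: sample mean and standard deviation
mu, sigma = data.mean(), data.std()

def density(x):
    """Evaluate the learned density p(x) under the fitted Gaussian."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

print(mu, sigma, density(3.0))           # query the model's density
new_samples = rng.normal(mu, sigma, 5)   # generate: sample from the model
```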
-
(dl-19) Variational Autoencoders
tl;dr: Autoencoders but also generative models!
[slides]
Suggested Readings:
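A forward-pass sketch of the two VAE ingredients: the reparameterization trick (sampling z differentiably) and the negative ELBO loss, reconstruction plus KL to the prior. The "encoder" and "decoder" outputs below are stand-in numbers; a real VAE trains both networks with autograd:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)                     # one data point (toy)

# Stand-in encoder outputs: mean and log-variance of q(z|x)
mu, logvar = np.array([0.5, -0.2]), np.array([-1.0, -1.0])

# Reparameterization trick: z = mu + sigma * eps keeps sampling differentiable
eps = rng.normal(size=mu.shape)
z = mu + np.exp(0.5 * logvar) * eps

x_hat = rng.normal(size=x.shape)           # stand-in for decoder(z)

recon = np.sum((x - x_hat) ** 2)           # reconstruction term (Gaussian decoder)
kl = 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)  # KL(q(z|x) || N(0, I))
neg_elbo = recon + kl                      # minimizing this maximizes the ELBO
print(neg_elbo)
```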
-
(dl-20) Generative Adversarial Networks (GAN)
tl;dr: A pair of NNs that compete adversarially to realize implicit density modelling!
[slides]
Suggested Readings:
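A sketch of the adversarial objective with toy numbers and no actual networks: the discriminator maximizes log D(x) + log(1 - D(G(z))), while the generator here uses the common non-saturating form, maximizing log D(G(z)):

```python
import numpy as np

# Stand-in discriminator outputs (probabilities of "real"); illustrative only
D_real = np.array([0.9, 0.8])       # D(x) on real samples
D_fake = np.array([0.3, 0.1])       # D(G(z)) on generated samples

# Discriminator wants D_real -> 1 and D_fake -> 0
d_loss = -(np.log(D_real).mean() + np.log(1 - D_fake).mean())

# Generator wants D_fake -> 1 (non-saturating variant)
g_loss = -np.log(D_fake).mean()

print(d_loss, g_loss)
# Training alternates a step on d_loss with a step on g_loss: the competition
```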
-
(dl-21) Diffusion Models
tl;dr: Successively destroying samples with noise, and learning to recover them
[slides]
Suggested Readings:
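A sketch of the "destroying" half in DDPM style: given a variance schedule beta_t, the closed form q(x_t | x_0) = N(sqrt(alpha_bar_t) x_0, (1 - alpha_bar_t) I) noises a sample in one shot; the model is then trained to undo it (schedule values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
betas = np.linspace(1e-4, 0.02, T)      # linear variance schedule
alpha_bar = np.cumprod(1.0 - betas)     # alpha_bar_t = prod_s (1 - beta_s)

x0 = rng.normal(size=8)                 # a clean "sample" (toy)

def q_sample(x0, t):
    """Draw x_t ~ q(x_t | x_0) in closed form."""
    eps = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1 - alpha_bar[t]) * eps

print(np.abs(q_sample(x0, 10)).mean())    # early step: mostly signal
print(np.abs(q_sample(x0, 999)).mean())   # final step: essentially pure noise
# Training: predict eps from (x_t, t); sampling runs the recovery chain in reverse
```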