Lectures
You can download the lectures here. We will try to upload lectures prior to their corresponding classes.
-
-
(dl-01) Artificial Neuron Models (MP neuron and Perceptron)
tl;dr: The basic building block of ANNs.
[perceptron-learning-code] [slides]
Suggested Readings:
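A minimal sketch of the perceptron learning rule on a toy AND dataset (separate from the linked perceptron-learning-code; data and hyperparameters here are illustrative):

```python
import numpy as np

# Toy dataset: logical AND of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])  # AND labels

w = np.zeros(X.shape[1])
b = 0.0

# Perceptron learning rule: on a mistake, nudge the weights
# toward (or away from) the misclassified input.
for epoch in range(10):
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)   # hard-threshold activation
        w += (target - pred) * xi    # update only on error
        b += (target - pred)

print(w, b)  # a separating hyperplane for AND
```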
-
(dl-02) Network of perceptrons
tl;dr: If not one, can a network of perceptrons do the job?
[slides]
Suggested Readings:
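As a small illustration of the lecture's question: a hand-wired two-layer network of threshold units that computes XOR, which no single perceptron can (the weights below are chosen by hand, not learned):

```python
import numpy as np

def step(z):
    return (z > 0).astype(int)  # hard-threshold activation

def xor_net(x1, x2):
    x = np.array([x1, x2])
    # Hidden layer: two perceptrons computing OR and NAND.
    h_or = step(x @ np.array([1, 1]) - 0.5)
    h_nand = step(x @ np.array([-1, -1]) + 1.5)
    # Output perceptron: AND of the hidden units gives XOR.
    return step(np.array([h_or, h_nand]) @ np.array([1, 1]) - 1.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # prints the XOR truth table
```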
-
(dl-03) Representational Power of an MLP
tl;dr: A multi-layered network of neurons can represent any arbitrary function!
[ReLU-activation-code] [sigmoid-activation-codes] [slides]
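A quick sketch of the universal-approximation intuition (separate from the linked activation codes): summing a few shifted ReLU units gives a piecewise-linear fit to a smooth target. For brevity this fits only the output weights by least squares rather than gradient training:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

x = np.linspace(-3, 3, 200)
target = np.sin(x)

# A tiny one-hidden-layer ReLU network with hand-placed knots.
# Each hidden unit contributes one "kink"; more units = finer fit.
knots = np.linspace(-3, 3, 12)
H = relu(x[:, None] - knots[None, :])            # hidden activations
design = np.c_[H, np.ones_like(x)]               # plus a bias column
w, *_ = np.linalg.lstsq(design, target, rcond=None)
approx = design @ w

print("max abs error:", np.max(np.abs(approx - target)))
```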
-
-
(dl-05) Backpropagation-1
tl;dr: Gradient Descent applied to a simple computational graph
[slides]
Suggested Readings:
Additional Readings:
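A minimal worked example in the spirit of this lecture: forward-evaluate a small computational graph, then apply the chain rule node by node (the function and values are chosen purely for illustration):

```python
# Graph: f(x, y, z) = (x + y) * z
x, y, z = 2.0, -1.0, 3.0

# Forward pass, caching intermediates.
a = x + y   # a = 1.0
f = a * z   # f = 3.0

# Backward pass: chain rule through each node.
df_df = 1.0
df_da = z * df_df     # d(a*z)/da = z
df_dz = a * df_df     # d(a*z)/dz = a
df_dx = 1.0 * df_da   # d(x+y)/dx = 1
df_dy = 1.0 * df_da   # d(x+y)/dy = 1

print(df_dx, df_dy, df_dz)  # 3.0 3.0 1.0
```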
-
(dl-06) Backpropagation-2
tl;dr: Gradient Descent applied to a generic computational graph (DAG)
[slides]
Suggested Readings:
Additional Readings:
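A hedged sketch of backprop over a generic DAG: a toy scalar reverse-mode autodiff that records each operation and accumulates gradients in reverse topological order. This is a miniature of what frameworks like PyTorch do internally, not their actual API:

```python
class Node:
    """Toy scalar autodiff node in a computational DAG."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent node, local gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Node(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

def backward(output):
    # Reverse topological order, then chain-rule accumulation;
    # summing over parents handles fan-out (a node used twice).
    order, seen = [], set()
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p, _ in n.parents:
                visit(p)
            order.append(n)
    visit(output)
    output.grad = 1.0
    for n in reversed(order):
        for p, local in n.parents:
            p.grad += local * n.grad

x, y = Node(2.0), Node(3.0)
f = x * y + x          # x is used twice: a true DAG, not a tree
backward(f)
print(x.grad, y.grad)  # 4.0 2.0  (df/dx = y + 1, df/dy = x)
```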
-
(dl-08) Training DNNs - I
tl;dr: Issues with Gradient Descent, specifically with the learning rate
[codes - Optimizing 1D Convex function] [codes - Optimizing 2D Quadratic function] [slides]
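A quick numeric illustration in the spirit of the linked 1D optimization code (not taken from it): gradient descent on f(w) = w² with three learning rates, showing slow convergence, fast convergence, and divergence:

```python
def gd(lr, steps=20, w0=1.0):
    # Gradient descent on f(w) = w**2, so grad = 2*w.
    w = w0
    for _ in range(steps):
        w -= lr * 2 * w
    return w

# Too small: crawls. Well chosen: converges in one step here.
# Too large: diverges, because |1 - 2*lr| > 1 makes every step
# overshoot the minimum and amplify the error.
for lr in (0.01, 0.5, 1.1):
    print(lr, gd(lr))
```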
-
-
(dl-10) Building Blocks of CNNs
tl;dr: Basic constituting elements of Convolutional Neural Networks (CNN)
[slides] [1D-conv-backprop] [2D-conv-backprop]
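A compact sketch of the core building block (separate from the linked backprop notebooks): a valid-mode 1D convolution forward pass, plus the gradient with respect to its input, which is the classic "full convolution with the flipped kernel" fact:

```python
import numpy as np

def conv1d(x, k):
    """Valid 1D cross-correlation, the conv layer's forward pass."""
    n = len(x) - len(k) + 1
    return np.array([x[i:i + len(k)] @ k for i in range(n)])

x = np.array([1., 2., 3., 4., 5.])
k = np.array([1., 0., -1.])
y = conv1d(x, k)
print(y)  # [-2. -2. -2.]

# Backward w.r.t. the input: full convolution of the upstream
# gradient with the kernel spreads each dy[i] back to the x
# positions it touched in the forward pass.
dy = np.ones_like(y)
dx = np.convolve(dy, k, mode="full")
print(dx)  # [ 1.  1.  0. -1. -1.]
```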
-
(dl-11) Evolution of CNN Architectures
tl;dr: How different families of architectures evolved over time.
[slides]
-
(dl-11a) Visualizing and Understanding CNNs
tl;dr: What do CNNs learn? Why do they predict what they predict? (A CAM sketch follows the readings below.)
[slides]
Suggested Readings:
- Rich Feature Hierarchies by Girshick et al. CVPR 2014
- Visualizing and Understanding CNNs by Zeiler and Fergus ECCV 2014
- Deep Inside CNNs by Simonyan et al. ICLRW 2014
- Class Activation Maps (CAM) by B Zhou et al. CVPR 2016
- Grad-CAM by Selvaraju et al. NIPSW 2016, ICCV 2017
- CNN-Fixations by Mopuri et al. TIP 2018
- Texture Synthesis using CNNs by Gatys et al. NeurIPS 2015
- DeepDream
- Neural style transfer by Gatys et al. CVPR 2016
- Understanding Deep Representations by Inverting them by A Mahendran et al. CVPR 2015
- Inverting Visual Representations with CNNs by A Dosovitskiy et al. CVPR 2016
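As a hedged sketch of one idea from this list, Class Activation Maps (Zhou et al.): with global average pooling before the classifier, a class score decomposes into a per-location map by weighting the final feature maps with that class's classifier weights. All shapes and arrays below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in final conv output: C feature maps of size H x W,
# followed (conceptually) by global average pooling and a
# linear classifier with weights W_fc.
C, H, W, num_classes = 8, 7, 7, 10
feats = rng.random((C, H, W))        # stand-in conv features
W_fc = rng.random((num_classes, C))  # stand-in classifier weights

c = 3  # class of interest
# CAM: weight each feature map by the class's weight for that
# channel and sum over channels -> a coarse H x W localization map.
cam = np.tensordot(W_fc[c], feats, axes=1)  # shape (H, W)
print(cam.shape, cam.argmax())
```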
-
(dl-12) Recurrent Neural Networks
tl;dr: Deep learning models for sequence prediction tasks
[slides]
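A minimal sketch of the core recurrence (an Elman-style vanilla RNN cell; the dimensions and random weights are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 3, 4

# Random stand-in parameters for a vanilla RNN cell.
W_xh = rng.standard_normal((d_h, d_in)) * 0.1
W_hh = rng.standard_normal((d_h, d_h)) * 0.1
b_h = np.zeros(d_h)

def rnn_forward(xs):
    """Unroll h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h) over a sequence."""
    h = np.zeros(d_h)   # initial hidden state
    states = []
    for x in xs:        # the same weights are reused at every step
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return states

seq = rng.standard_normal((5, d_in))  # a length-5 toy sequence
print(rnn_forward(seq)[-1])           # final state summarizes the sequence
```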
-
(dl-13) Word Embeddings
tl;dr: Representing words as vectors for learning to perform language tasks
[slides]
Suggested Readings:
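A minimal sketch of the mechanics: an embedding is just a row of a matrix indexed by word id, and similarity between words becomes geometry between their vectors. The vocabulary and vectors below are random stand-ins, not trained embeddings:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["king", "queen", "apple", "orange"]
d = 8  # embedding dimension

# Stand-in embedding matrix: one vector per word. In word2vec these
# rows are trained so words in similar contexts end up close; here
# they are random, purely to show the lookup and similarity mechanics.
E = rng.standard_normal((len(vocab), d))
idx = {w: i for i, w in enumerate(vocab)}

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(E[idx["king"]], E[idx["queen"]]))
```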