Deep Learning / Aug-Dec 2021

Updates

  • 2021-11-26: New Lecture is up: (dlc-10.1) Inside DNNs [slides]
  • 2021-11-23: New Lecture is up: (dlc-7.4) VAE [slides]
  • 2021-11-19: New Lecture is up: (dlc-9.1) GANs [Simple GAN] [slides]
  • 2021-11-12: New Lecture is up: (dlc-8.2) Word Embeddings [slides]
  • 2021-11-05: New Lecture is up: (dlc-8.1) Recurrent Neural Networks [Elman Network] [slides]
  • 2021-11-02: New Lecture is up: (dlc-7.3) Denoising Autoencoders [slides]
  • 2021-10-29: New Lecture is up: (dlc-7.2) Autoencoders [slides]

Course Description

Introductory course on Deep Learning, with equal focus on learning the concepts and on their implementation. Prerequisites are basic calculus, probability, matrix operations, and basic Python programming. Prior exposure to Machine Learning helps in quickly grasping some of the concepts. We will use PyTorch to implement the various concepts.

Course Outcomes

These are, broadly, the objectives of the course, and hence its expected outcomes.

1. Starting from the artificial neuron, we aim to understand feedforward and recurrent architectures of Artificial Neural Networks. We visit neurons (MP, perceptron), MLPs, CNNs, and RNNs (LSTM and GRU), and understand how to train these models via gradient descent using the backpropagation algorithm.
2. Realizing these architectures in the PyTorch framework. We understand the underlying computational graph, the Autograd mechanism, batch processing of data, etc., culminating in writing PyTorch modules.
3. We discuss concepts related to parameter initialization and optimization, and get to know the important optimizers available in PyTorch.
4. We emphasize concepts related to depth, such as the benefits of depth and the different regularization methods used to handle the resulting complexity, e.g., BatchNorm, Dropout, and residual architectures.
5. We learn about an important unsupervised model, the Autoencoder, and its variations.
6. We take applications from Computer Vision to study the models and their learning mechanisms. We get into filter visualization, looking at activations, etc.
7. We then briefly discuss generative models such as GANs.
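As a small taste of how the pieces in outcomes 1-3 fit together, here is a minimal sketch (illustrative only, not course material; the toy data and hyperparameters are our own choices) of an MLP written as a PyTorch module and trained by gradient descent, with backpropagation handled by Autograd:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # reproducible toy run

# A small feedforward network written as a PyTorch module.
class MLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, 8),
            nn.ReLU(),
            nn.Linear(8, 1),
        )

    def forward(self, x):
        return self.net(x)

model = MLP()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

# Toy regression data: learn y = x1 + x2.
x = torch.randn(64, 2)
y = x.sum(dim=1, keepdim=True)

for _ in range(100):
    optimizer.zero_grad()        # clear gradients from the previous step
    loss = loss_fn(model(x), y)  # forward pass builds the computational graph
    loss.backward()              # backpropagation via Autograd
    optimizer.step()             # gradient descent parameter update
```

Swapping `torch.optim.SGD` for another optimizer (e.g., `torch.optim.Adam`) changes only one line, which is the point of outcome 3.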
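Similarly for outcome 5, an autoencoder is just another module with an encoder-decoder pair; this sketch uses assumed dimensions (784-dimensional inputs, a 32-dimensional code) purely for illustration:

```python
import torch
import torch.nn as nn

# A basic undercomplete autoencoder: the encoder compresses the input
# to a low-dimensional code, the decoder reconstructs the input from it.
class Autoencoder(nn.Module):
    def __init__(self, dim=784, code=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, code))
        self.decoder = nn.Sequential(
            nn.Linear(code, 128), nn.ReLU(), nn.Linear(128, dim))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = Autoencoder()
x = torch.rand(16, 784)
x_hat = model(x)         # reconstruction, same shape as the input
z = model.encoder(x)     # 32-dimensional latent representation
```

A denoising variant (dlc-7.3) trains the same model to reconstruct the clean `x` from a noise-corrupted copy of it.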


Instructor

Teaching Assistants

Bhasamrita Sarmah (cs21m004@iittp.ac.in)