Introduction to Deep Learning
This set of videos will provide you with all the basic tools necessary to get started with deep learning.
-
Foundations of Deep Learning
-
Introduction to PyTorch
This is an overview of the core ideas of Neural Networks, a little of the math that motivates them, and the main capabilities of PyTorch.
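As a small taste of PyTorch, here is a minimal sketch of its two core capabilities, tensors and automatic differentiation (the values are arbitrary, for illustration):

```python
import torch

# Create a tensor and track gradients through a simple computation
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # y = 1 + 4 + 9 = 14
y.backward()         # autograd computes dy/dx = 2x

print(y.item())      # 14.0
print(x.grad)        # tensor([2., 4., 6.])
```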
-
Datasets and DataLoaders
One key part of deep learning is taking data from your hard drive and passing it to the model. To do this, we will learn about PyTorch Datasets and DataLoaders!
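The pattern looks roughly like this: subclass `Dataset` to describe how to fetch one example, then let `DataLoader` handle batching and shuffling. `ToyDataset` here is a made-up in-memory example standing in for data on disk:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):
    """A minimal in-memory dataset of (feature, label) pairs."""
    def __init__(self, n=8):
        self.x = torch.arange(n, dtype=torch.float32).unsqueeze(1)
        self.y = (self.x > 3).long().squeeze(1)  # toy binary labels

    def __len__(self):
        return len(self.x)            # how many examples exist

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]  # how to fetch one example

# DataLoader takes care of batching and shuffling for us
loader = DataLoader(ToyDataset(), batch_size=4, shuffle=True)
for xb, yb in loader:
    print(xb.shape, yb.shape)  # torch.Size([4, 1]) torch.Size([4])
```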
-
Transfer Learning
A key ability of Neural Networks is that they can be trained on one large dataset and then fine-tuned to solve a different problem. We will explore how this works by performing Transfer Learning.
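The core move is to freeze the pretrained weights and train only a new task-specific head. A minimal sketch, where the `backbone` is a stand-in for a real pretrained model (e.g. one from torchvision):

```python
import torch.nn as nn

# Hypothetical "pretrained" backbone standing in for a real pretrained model
backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))

# Freeze the pretrained weights so gradients are not computed for them
for p in backbone.parameters():
    p.requires_grad = False

head = nn.Linear(16, 3)  # new task-specific classifier (3 classes here)
model = nn.Sequential(backbone, head)

# Only the head's weight and bias will be updated during fine-tuning
trainable = [p for p in model.parameters() if p.requires_grad]
```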
-
Convolutional Neural Networks
We explore a foundational building block of Neural Networks: Convolutions. We first look at the purpose of a convolution in Signal Processing, and then move on to its applications for Image Classification in Neural Networks.
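In PyTorch, a 2D convolution over an image is a single layer; here is a minimal sketch (channel counts and image size are arbitrary):

```python
import torch
import torch.nn as nn

# 3 input channels (RGB), 8 learned filters, 3x3 kernels
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)

img = torch.randn(1, 3, 32, 32)  # a batch of one 3-channel 32x32 "image"
out = conv(img)
print(out.shape)  # torch.Size([1, 8, 32, 32]) — padding=1 preserves H and W
```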
-
Going Deeper with Residual Connections
Very deep Neural Networks can become untrainable due to vanishing/exploding gradients during backpropagation. We explore residual connections as a solution to this and implement the ResNet model.
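A residual block computes y = F(x) + x; the skip connection gives gradients a direct path through the network. A minimal sketch (layer sizes arbitrary, and simplified relative to the full ResNet block):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """y = F(x) + x: the skip connection gives gradients a direct path."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)  # add the input back in

block = ResidualBlock(16)
x = torch.randn(2, 16, 8, 8)
print(block(x).shape)  # torch.Size([2, 16, 8, 8])
```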
-
Sequence Classification
We now shift gears to solving sequence problems! Here we will learn the basics of Recurrent Neural Networks (RNNs/LSTMs) for sequence classification.
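A sketch of the Many-to-One setup we will build: the LSTM reads the whole sequence, and only its final hidden state is passed to a classifier (vocabulary and layer sizes here are arbitrary):

```python
import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    def __init__(self, vocab_size=100, embed_dim=32, hidden=64, classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, classes)

    def forward(self, tokens):
        emb = self.embed(tokens)    # (batch, seq_len, embed_dim)
        _, (h, _) = self.lstm(emb)  # h: (num_layers, batch, hidden)
        return self.fc(h[-1])       # classify from the last hidden state only

model = SequenceClassifier()
logits = model(torch.randint(0, 100, (4, 10)))  # 4 sequences of length 10
print(logits.shape)  # torch.Size([4, 2])
```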
-
Sequence Generation
We have seen previously that RNNs can classify sequences using the Many-to-One format. But we can also generate data, which uses the Many-to-Many format. Today we take a look and see if we can train an RNN to generate Harry Potter-style text.
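The generation loop itself is simple: sample a token from the model's output distribution, then feed it back in as the next input. A minimal sketch with an untrained, made-up character-level model (so the output is gibberish until trained):

```python
import torch
import torch.nn as nn

# Hypothetical character-level model: embedding, LSTM, vocab-sized output head
vocab_size = 50
embed = nn.Embedding(vocab_size, 16)
lstm = nn.LSTM(16, 32, batch_first=True)
head = nn.Linear(32, vocab_size)

# Autoregressive sampling: each sampled token becomes the next input
token = torch.tensor([[0]])  # start token (id 0, arbitrary here)
state = None                 # LSTM hidden state, carried across steps
generated = []
for _ in range(20):
    out, state = lstm(embed(token), state)
    probs = torch.softmax(head(out[:, -1]), dim=-1)
    token = torch.multinomial(probs, num_samples=1)  # sample next token
    generated.append(token.item())

print(len(generated))  # 20 sampled token ids
```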
-
Training with Multiple GPUs
Training big models requires multi-GPU parallelism. We will explore how to leverage Hugging Face Accelerate to train with a Distributed Data Parallel pipeline.
-