Week 2
Lecture part A
We start by understanding what parametrised models are and then discuss what a loss function is. We then look at gradient-based methods and how they are used in the backpropagation algorithm in a traditional neural network. We conclude this section by learning how to implement a neural network in PyTorch, followed by a discussion of a more generalised form of backpropagation.
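As a rough sketch of what such a PyTorch implementation might look like (the architecture, sizes, and learning rate below are illustrative assumptions, not the lecture's exact choices):

```python
import torch
from torch import nn

# Illustrative two-layer parametrised model; the sizes are placeholders.
model = nn.Sequential(
    nn.Linear(784, 100),
    nn.ReLU(),
    nn.Linear(100, 10),
)

criterion = nn.CrossEntropyLoss()                         # the loss function
optimiser = torch.optim.SGD(model.parameters(), lr=0.1)   # a gradient-based method

x = torch.randn(32, 784)            # a dummy mini-batch of inputs
y = torch.randint(0, 10, (32,))     # dummy integer class labels

optimiser.zero_grad()               # clear gradients from the previous step
loss = criterion(model(x), y)       # forward pass and loss evaluation
loss.backward()                     # backpropagation: d(loss)/d(parameters)
optimiser.step()                    # one gradient-descent update
```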
Lecture part B
We begin with a concrete example of backpropagation and discuss the dimensions of Jacobian matrices. We then look at various basic neural net modules and compute their gradients, followed by a brief discussion of softmax and log softmax. We conclude this part with a discussion of practical tricks for backpropagation.
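A small PyTorch sketch of two of these points (the input sizes here are arbitrary): the Jacobian of a function from R^n to R^m has shape m × n, and log softmax is numerically more stable than taking the log of a softmax directly.

```python
import torch
import torch.nn.functional as F

# Jacobian dimensions: for f: R^5 -> R^5 (softmax), the Jacobian is 5 x 5.
x = torch.randn(5)
J = torch.autograd.functional.jacobian(lambda v: F.softmax(v, dim=0), x)
print(J.shape)  # torch.Size([5, 5])

# Numerical stability: log(softmax(x)) can underflow, log_softmax cannot.
big = torch.tensor([1000.0, 0.0])
print(torch.log(F.softmax(big, dim=0)))  # tensor([0., -inf]) -- underflow
print(F.log_softmax(big, dim=0))         # tensor([0., -1000.]) -- stable
```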
Practicum
We give a brief introduction to supervised learning using artificial neural networks. We expound on the problem formulation and the conventions for the data used to train these networks. We also discuss how to train a neural network for multi-class classification, and how to perform inference once the network is trained.
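Below is a minimal sketch of such a training and inference procedure, assuming a synthetic dataset and an arbitrary two-layer architecture (all sizes and hyperparameters are illustrative, not the practicum's exact setup):

```python
import torch
from torch import nn

# Synthetic supervised-learning setup: m samples, n features, K classes.
m, n, K = 200, 2, 3
X = torch.randn(m, n)              # design matrix, one sample per row
y = torch.randint(0, K, (m,))      # class labels in {0, ..., K-1}

model = nn.Sequential(nn.Linear(n, 16), nn.ReLU(), nn.Linear(16, K))
criterion = nn.CrossEntropyLoss()
optimiser = torch.optim.Adam(model.parameters(), lr=0.01)

for epoch in range(100):           # full-batch training, for simplicity
    optimiser.zero_grad()
    loss = criterion(model(X), y)
    loss.backward()
    optimiser.step()

# Inference: the predicted class is the index of the largest logit.
with torch.no_grad():
    predictions = model(X).argmax(dim=1)
```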