Understanding Gradient Descent
As you know, I'm currently working through the fast.ai course, and I decided to take my time with Lesson 3 because it covers a very important topic: gradient descent.
I’ve done many ML courses in the past (Andrew Ng’s course, Aurelien Geron’s book, etc.), and each of them helped me understand gradient descent to some extent. But now I can confidently say that I’ve finally grasped what gradient descent is truly about.
You can reach the same understanding by following the steps I took:
Watch the lesson 3 video and follow with Jeremy
Read Chapter 4 of the fastai book
Watch Andrej Karpathy’s amazing video: The spelled-out intro to neural networks and backpropagation: building micrograd
I can’t stress enough how amazing Andrej’s video is! It’s perfect complementary material to fast.ai’s Lesson 3: he walks step by step through building micrograd, a tiny autograd engine.
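To give a flavor of what an autograd engine does, here is a toy scalar-valued sketch loosely inspired by micrograd. The class name and structure are my own simplification, not Andrej's actual code: each `Value` records how it was produced, and `backward()` applies the chain rule through the computation graph.

```python
class Value:
    """Toy scalar autograd node (a sketch inspired by micrograd, not the real thing)."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0            # d(output)/d(this node), filled in by backward()
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse order.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)

        self.grad = 1.0
        for v in reversed(topo):
            v._backward()


# For L = a*b + a, calculus says dL/da = b + 1 and dL/db = a.
a, b = Value(2.0), Value(3.0)
L = a * b + a
L.backward()
print(a.grad)  # 4.0
print(b.grad)  # 2.0
```

These automatically computed gradients are exactly what gradient descent consumes at each step: they tell you which direction to nudge every parameter to reduce the loss.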
Don’t forget to write the code along with both Jeremy and Andrej.
Do all this and you will understand how gradient descent works, and you will know how to implement it yourself too.