Optimizer
Learn about the different optimizers in PyTorch and some of their built-in methods.
Introduction to optimizers
So far, we have been manually updating the parameters using the computed gradients. That is probably fine for two parameters, but what if we had a whole lot of them? We need to use one of the optimizers from PyTorch's torch.optim package, like SGD, RMSprop, or Adam.
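Here is a minimal sketch of the switch, assuming a toy linear regression with two parameters, b and w (the synthetic data and learning rate are illustrative, not taken from the lesson):

```python
import torch

# Hypothetical setup: a simple linear regression y = b + w * x
# fit to synthetic data.
torch.manual_seed(42)
x = torch.rand(100, 1)
y = 1 + 2 * x + 0.1 * torch.randn(100, 1)

b = torch.randn(1, requires_grad=True)
w = torch.randn(1, requires_grad=True)

lr = 0.1
# Instead of updating b and w by hand, hand them to an optimizer.
optimizer = torch.optim.SGD([b, w], lr=lr)

for epoch in range(1000):
    yhat = b + w * x
    loss = ((yhat - y) ** 2).mean()

    loss.backward()        # compute gradients for b and w
    optimizer.step()       # update all parameters in one call
    optimizer.zero_grad()  # reset gradients before the next iteration

print(b, w)  # should approach the true values 1 and 2
```

The payoff is that swapping optimizers becomes a one-line change: replacing the constructor call with, say, `torch.optim.Adam([b, w], lr=lr)` leaves the rest of the training loop untouched, no matter how many parameters the model has.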