Optimizer
Learn about the different optimizers in PyTorch and some of their built-in methods.
Introduction to optimizers
So far, we have been manually updating the parameters using the computed gradients. That is probably fine for two parameters, but what if we had a whole lot of them? We need to use one of PyTorch’s optimizers like SGD, RMSprop, or Adam.
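For context, here is a minimal sketch of the kind of manual update we have been doing by hand. The data, parameter names (`b`, `w`), and learning rate are illustrative assumptions, not the exact values from earlier:

```python
import torch

# Toy data for a line y = 1 + 2x plus noise (illustrative values)
torch.manual_seed(42)
x = torch.rand(100, 1)
y = 1.0 + 2.0 * x + 0.1 * torch.randn(100, 1)

# Two parameters we update manually
b = torch.randn(1, requires_grad=True)
w = torch.randn(1, requires_grad=True)
lr = 0.1

for epoch in range(1000):
    yhat = b + w * x
    loss = ((yhat - y) ** 2).mean()
    loss.backward()

    # Manual update: step against the gradient, then clear the gradients
    with torch.no_grad():
        b -= lr * b.grad
        w -= lr * w.grad
    b.grad.zero_()
    w.grad.zero_()
```

With only two parameters this is manageable, but it clearly does not scale; an optimizer handles the update and the gradient clearing for every parameter we hand it.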
There are many optimizers; SGD is the most basic of them, and Adam is one of the most popular.
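Below is a sketch of the same training loop rewritten with `torch.optim.SGD`. The optimizer receives the list of parameters to update; `step()` applies the update and `zero_grad()` clears the gradients. The data and hyperparameters are again illustrative assumptions:

```python
import torch
from torch import optim

torch.manual_seed(42)
x = torch.rand(100, 1)
y = 1.0 + 2.0 * x + 0.1 * torch.randn(100, 1)

b = torch.randn(1, requires_grad=True)
w = torch.randn(1, requires_grad=True)

# The optimizer is told which parameters it is responsible for
optimizer = optim.SGD([b, w], lr=0.1)
# Swapping in another optimizer is a one-line change, e.g. optim.Adam([b, w], lr=0.1)

for epoch in range(1000):
    yhat = b + w * x
    loss = ((yhat - y) ** 2).mean()
    loss.backward()

    optimizer.step()       # updates every registered parameter using its .grad
    optimizer.zero_grad()  # clears the gradients for the next iteration
```

Note that nothing about the forward pass or the loss changes; only the update step is delegated to the optimizer, which is what makes swapping SGD for RMSprop or Adam so cheap.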