Optimizer
Learn about the different optimizers in PyTorch and some of their built-in methods.
Introduction to optimizers
So far, we have been manually updating the parameters using the computed gradients. That is probably fine for two parameters, but what if we had a whole lot of them? We need to use one of PyTorch's optimizers, such as SGD, RMSprop, or Adam.
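Here is a minimal sketch of how an optimizer replaces the manual update step, assuming a toy linear model with two parameters `b` and `w` and synthetic data (both the data and the learning rate are illustrative assumptions, not values from this lesson):

```python
import torch

# Illustrative synthetic data for y = 1 + 2x plus noise (assumed, not from the lesson)
torch.manual_seed(42)
x = torch.rand(100, 1)
y = 1 + 2 * x + 0.1 * torch.randn(100, 1)

# Two parameters, as in the manual-update example
b = torch.randn(1, requires_grad=True)
w = torch.randn(1, requires_grad=True)

lr = 0.1
# The optimizer receives the list of parameters it should update
optimizer = torch.optim.SGD([b, w], lr=lr)

for epoch in range(1000):
    yhat = b + w * x
    loss = ((yhat - y) ** 2).mean()

    loss.backward()        # compute gradients
    optimizer.step()       # update parameters using the gradients (no manual update needed)
    optimizer.zero_grad()  # clear gradients before the next iteration

print(b, w)
```

Swapping `torch.optim.SGD` for `torch.optim.RMSprop` or `torch.optim.Adam` only changes the optimizer line; the `step()` and `zero_grad()` calls stay the same, which is what makes scaling to many parameters straightforward.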