Introduction to autograd

Autograd is PyTorch’s automatic differentiation package. Thanks to it, we do not need to work out partial derivatives or apply the chain rule by hand: autograd keeps track of the operations performed on our tensors and computes the gradients for us.
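As a minimal sketch of this tracking (the tensor names here are made up for illustration), any tensor created with requires_grad=True has every operation on it recorded in a computation graph, which autograd later uses to compute gradients:

import torch

# Creating a tensor with requires_grad=True tells autograd to track
# every operation performed on it (and on anything derived from it)
x = torch.tensor(2.0, requires_grad=True)

y = x ** 2 + 3 * x  # y becomes part of the computation graph

print(x.requires_grad)  # True
print(y.grad_fn)        # the operation that produced y (an AddBackward0 node)

The recorded graph is what allows PyTorch to apply the chain rule automatically once we ask for gradients, which is exactly what the backward() method below does.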

The backward method

So, how do we tell PyTorch to do its thing and compute all gradients? That is the role of the backward() method. It will compute gradients for all gradient-requiring tensors involved in the computation of a given variable.

Do you remember the starting point for computing the gradients? It was the loss, as we computed its partial derivatives w.r.t. our parameters. Hence, we need to invoke the backward() method on the corresponding Python variable: loss.backward().
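Here is a sketch of that workflow. The data, the parameters b and w, and the mean squared error loss are assumptions chosen for illustration (a simple linear regression); the key point is that calling backward() on the loss fills in the .grad attribute of every gradient-requiring tensor involved in computing it:

import torch

# Hypothetical synthetic data for illustration
torch.manual_seed(42)
x_train = torch.rand(100, 1)
y_train = 1.0 + 2.0 * x_train + 0.1 * torch.randn(100, 1)

# Parameters we want gradients for
b = torch.randn(1, requires_grad=True)
w = torch.randn(1, requires_grad=True)

# Forward pass: predictions and mean squared error loss
yhat = b + w * x_train
error = yhat - y_train
loss = (error ** 2).mean()

# Ask PyTorch to compute gradients of the loss w.r.t.
# every gradient-requiring tensor involved (here, b and w)
loss.backward()

print(b.grad, w.grad)  # gradients are stored in the .grad attribute

Notice that we never wrote down a single derivative: autograd traversed the computation graph from loss back to b and w and applied the chain rule for us.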
