Step 4 - Update the Parameters
Learn how to use the gradients and the learning rate to update the parameters.
Updating parameters
In the final step, we use the gradients to update the parameters. Since we are trying to minimize the loss, we reverse the sign of the gradient for the update.
There is still another hyperparameter to consider: the learning rate, denoted by the Greek letter eta ($\eta$, which looks like the letter n). It represents the multiplicative factor we apply to the gradient for the parameter update. Our equations now become the following:
$$b = b - \eta \frac{\partial \text{MSE}}{\partial b}$$

$$w = w - \eta \frac{\partial \text{MSE}}{\partial w}$$
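To make the update rule concrete, here is a minimal sketch in plain NumPy, assuming the simple linear regression setup (parameters b and w, MSE loss) used in the earlier steps; the variable names, data, and learning rate value are illustrative, not the lesson's exact code.

```python
import numpy as np

# Illustrative setup: simple linear regression with parameters b and w
# and an MSE loss. All names and values here are assumptions for the
# sketch.
np.random.seed(42)
x = np.random.rand(100)
y = 1.0 + 2.0 * x + 0.1 * np.random.randn(100)  # true b = 1, true w = 2

b = np.random.randn()  # random initialization
w = np.random.randn()

eta = 0.1  # learning rate

for epoch in range(1000):
    yhat = b + w * x   # predictions
    error = yhat - y   # prediction errors

    # Gradients of the MSE loss with respect to each parameter
    b_grad = 2 * error.mean()
    w_grad = 2 * (x * error).mean()

    # Step 4: move AGAINST the gradient (note the minus sign),
    # scaling the step size by the learning rate eta
    b = b - eta * b_grad
    w = w - eta * w_grad

print(f"b = {b:.3f}, w = {w:.3f}")  # should approach 1.0 and 2.0
```

Notice that a larger eta takes bigger steps toward the minimum but risks overshooting it, while a smaller eta is safer but converges more slowly; that trade-off is why the learning rate is treated as a hyperparameter to tune.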