Normalisation
Learn about normalisation: what it is and why it is necessary.
Why normalisation?
The weights in a neural network, and the signals that pass through it, can take on very large values. We’ve already seen how this can lead to saturation, which makes learning harder.
A great deal of research has explored the benefits of reducing the range of parameter and signal values in a neural network, and of shifting them so that their mean is zero. This is called normalisation.
One simple application of this idea is to ensure that signals that pass into a neural network layer are already normalised.
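To see the effect concretely, here is a minimal sketch (the batch size and value range are invented for illustration) showing how PyTorch’s `LayerNorm` rescales a batch of signals to zero mean and unit variance:

```python
import torch
import torch.nn as nn

# a batch of 4 signal vectors, each with 200 values, deliberately
# given a large mean and spread for illustration
signals = torch.randn(4, 200) * 10 + 5

# LayerNorm(200) rescales each 200-value vector to zero mean, unit variance
layer_norm = nn.LayerNorm(200)
normalised = layer_norm(signals)

print(signals.mean().item(), signals.std().item())        # roughly 5 and 10
print(normalised.mean().item(), normalised.std().item())  # roughly 0 and 1
```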
Normalisation in PyTorch
Let’s revert the code from `BCELoss` back to `MSELoss`, the sigmoid activation function, and the `SGD` optimiser, and then use `LayerNorm(200)` to normalise the network signals just before they enter the final layer.
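As a sketch of how these pieces fit together, the updated network might look like the following. The 784-input and 10-output sizes and the learning rate are assumptions (a typical MNIST classifier); the essential change is the `LayerNorm(200)` placed just before the final linear layer:

```python
import torch
import torch.nn as nn

class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.model = nn.Sequential(
            nn.Linear(784, 200),   # input size is an assumption (e.g. MNIST pixels)
            nn.Sigmoid(),          # sigmoid activation restored
            nn.LayerNorm(200),     # normalise signals before the final layer
            nn.Linear(200, 10),    # output size is an assumption
            nn.Sigmoid()
        )

        # back to the mean squared error loss
        self.loss_function = nn.MSELoss()

        # and back to the plain SGD optimiser (learning rate assumed)
        self.optimiser = torch.optim.SGD(self.parameters(), lr=0.01)

    def forward(self, inputs):
        return self.model(inputs)
```

Because the signals reaching the final layer now have zero mean and unit variance, they sit in the responsive middle region of the sigmoid rather than its saturated tails.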