Understanding LSTM Activations and Stabilized Gradients
Explore the key role of activations in LSTMs and how they influence the network's ability to process and remember information over time.
Activations in LSTM
The activations below:

$\tilde{c}_t = \tanh(w_c x_t + u_c h_{t-1} + b_c)$

for the candidate cell state and

$h_t = o_t \tanh(c_t)$

for emitting the output correspond to the activation
argument in an LSTM
layer in TensorFlow. By default, it is tanh
. These expressions act as learned features and, therefore, can take any value. With tanh
activation, they are in $(-1, 1)$. Other suitable activations can also be used for them.
On the other hand, the activations for the input, output, and forget gates are set by the argument recurrent_activation
in TensorFlow. These gates act as scales and are, therefore, intended to stay in $(0, 1)$. Their default is, hence, sigmoid
. For most purposes, it’s essential to keep recurrent_activation
as sigmoid
.
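To make the role of the two activation arguments concrete, below is a minimal pure-Python sketch of one time-step of a single LSTM cell with scalar input and state. The function name lstm_cell_step and the toy weights are mine, for illustration only; a real layer would operate on vectors.

```python
import math

def sigmoid(x):
    # Gate activation ("recurrent_activation"): squashes to (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def lstm_cell_step(x_t, h_prev, c_prev, w, u, b):
    """One time-step of a single LSTM cell (scalar sketch).

    w, u, b are dicts keyed by 'i', 'f', 'o', 'c' for the input,
    forget, and output gates and the candidate cell state.
    """
    # The gates act as scales, so sigmoid keeps them in (0, 1).
    i = sigmoid(w['i'] * x_t + u['i'] * h_prev + b['i'])
    f = sigmoid(w['f'] * x_t + u['f'] * h_prev + b['f'])
    o = sigmoid(w['o'] * x_t + u['o'] * h_prev + b['o'])
    # The candidate is a learned feature, so it uses tanh ("activation").
    c_tilde = math.tanh(w['c'] * x_t + u['c'] * h_prev + b['c'])
    # New cell state: forget part of the old state, admit the candidate.
    c_t = f * c_prev + i * c_tilde
    # Emitted output: tanh of the cell state, scaled by the output gate.
    h_t = o * math.tanh(c_t)
    return h_t, c_t

# Toy usage with arbitrary weights.
w = {k: 0.5 for k in 'ifoc'}
u = {k: 0.1 for k in 'ifoc'}
b = {k: 0.0 for k in 'ifoc'}
h, c = lstm_cell_step(x_t=1.0, h_prev=0.0, c_prev=0.0, w=w, u=u, b=b)
```

Note that swapping sigmoid out of the gates would let them leave $(0, 1)$, breaking their role as scales, which is why recurrent_activation is best left at its default.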
Note: The
recurrent_activation
should be sigmoid
. The default activation is tanh
but can be set to other activations such as relu
.
Parameters
Suppose an LSTM layer has $m$ cells, that is, the layer size is equal to $m$. The cell mechanism is for one cell in an LSTM layer. The parameters involved in the cell are $w^{(\cdot)}$, $u^{(\cdot)}$, and $b^{(\cdot)}$, where $(\cdot)$ is one of $i$, $f$, $o$, and $c$.
A cell intakes the prior output of all the other sibling cells in the layer. Given the layer size is $m$, the prior output from the layer cells will be an $m$-vector and, therefore, the weights $u^{(\cdot)}$ are also of the same length $m$.
The weight $w^{(\cdot)}$ for the input time-step is a $p$-vector given there are $p$ features, that is, $x_t \in \mathbb{R}^p$. Lastly, the bias $b^{(\cdot)}$ on a cell is a scalar.
Combining them for each of $i$, $f$, $o$, and $c$, the total number of parameters in a cell is $4(m + p + 1)$.
In the LSTM layer, there are $m$ cells. Therefore, the total number of parameters in a layer is:

$4m(m + p + 1)$
Note: The number of parameters is independent of the number of time-steps the cell processes. That is, they’re independent of the window size $\tau$.
This implies that the parameter space doesn’t increase if the window size is expanded to learn longer-term temporal patterns. While this might appear to be an advantage, in practice, the performance deteriorates after a certain limit on the window size.
An LSTM layer has $4m(m + p + 1)$ parameters, where $m$ is the size of the layer and $p$ is the number of features in the input.
The number of LSTM parameters is independent of the sample window size.
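As a quick check of the count $4m(m + p + 1)$ and its independence from the window size, here is a minimal sketch; the helper name lstm_param_count is mine, not a TensorFlow function.

```python
def lstm_param_count(m, p):
    """Trainable parameters in an LSTM layer of size m on p input features.

    Each cell has, for each of the four expressions (gates i, f, o and
    the candidate c): m recurrent weights, p input weights, and 1 bias.
    """
    per_cell = 4 * (m + p + 1)
    return m * per_cell  # m cells in the layer

# Example: a 64-cell layer on 10-feature input.
print(lstm_param_count(64, 10))  # 4 * 64 * (64 + 10 + 1) = 19200
```

Notice that no window-size argument appears anywhere: widening the sample window leaves the parameter count unchanged, exactly as the note above states.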
Iteration levels
A sample in LSTM is a window of $\tau$ time-step observations. Due to this, its iteration level, shown in the illustration below, goes one level further than in
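To make the notion of a sample concrete, the sketch below slices a univariate series into overlapping windows of $\tau$ time-steps, each window being one LSTM sample. The helper make_windows is hypothetical, for illustration only.

```python
def make_windows(series, tau):
    """Split a series into overlapping windows of tau time-steps;
    each window is one LSTM sample."""
    return [series[t:t + tau] for t in range(len(series) - tau + 1)]

series = [10, 20, 30, 40, 50]
windows = make_windows(series, tau=3)
# Each sample spans tau consecutive observations:
# [[10, 20, 30], [20, 30, 40], [30, 40, 50]]
```

Iterating over a batch of such samples therefore adds one loop level: over samples, then over the $\tau$ time-steps within each sample.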