...
Unrestricted LSTM with Dropouts and Backward Input
Learn to enhance LSTM models with unrestricted sequences, dropout variations, and reverse processing for improved time series predictions.
Unrestricted LSTM network
The final LSTM layer in an LSTM network offers two choices: return the full sequence of hidden outputs or return only the last output. So far, we have been working with the default choice (return only the last output), used in the baseline restricted model above. In this lesson, we'll work on how we can improve the model.
The former choice enables the ultimate LSTM layer to emit a sequence of hidden outputs. Since the last LSTM layer is not restricted to emitting only the final hidden output, this network is called an “unrestricted” LSTM network. A potential benefit of this network is the presence of more intermediate features. Based on this hypothesis, an unrestricted network is constructed in the code below.
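A minimal sketch of such a construction is shown below, assuming a Keras `Sequential` model; the layer sizes and input shape here are illustrative placeholders, not the lesson's actual configuration:

```python
import numpy as np
from tensorflow.keras import layers, models

# Hypothetical input shape: 20 time steps, 4 features per step.
timesteps, features = 20, 4

model = models.Sequential([
    layers.Input(shape=(timesteps, features)),
    layers.LSTM(16, return_sequences=True),  # pass the full sequence to the next layer
    layers.LSTM(8, return_sequences=True),   # "unrestricted": emit a hidden output per step
    layers.Flatten(),                        # collapse (timesteps, units) for the head
    layers.Dense(1),                         # single-value prediction
])
model.compile(optimizer="adam", loss="mse")
```

Because the last LSTM layer emits a `(timesteps, units)` sequence rather than a single vector, the dense head sees one hidden vector per time step, which is the source of the extra intermediate features.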
<center> <img src="/score.png" height="240"/> <br/> <img src="/score1.png" height="240"/> <br/> <img src="/score2.png" height="240"/> </center>
In this construction, the final LSTM layer, `lstm_layer_2`, is set with `return_sequences=True`.
The model summary in the illustration below shows its effect. ...
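The effect of the flag can also be seen directly in the output shapes. The sketch below compares a restricted and an unrestricted final layer; the layer sizes are illustrative, not the lesson's actual configuration:

```python
from tensorflow.keras import layers, models

def build(return_sequences_last):
    # Two stacked LSTM layers; only the last layer's flag varies.
    return models.Sequential([
        layers.Input(shape=(20, 4)),
        layers.LSTM(16, return_sequences=True),
        layers.LSTM(8, return_sequences=return_sequences_last),
    ])

restricted = build(False)   # final layer emits only the last hidden output
unrestricted = build(True)  # final layer emits one hidden output per step

# Restricted: 2-D output (batch, units).
# Unrestricted: 3-D output (batch, timesteps, units).
print(restricted.output.shape)
print(unrestricted.output.shape)
```

The unrestricted model's output keeps the time axis, which is exactly the extra set of intermediate features described above.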