Unrestricted LSTM with Dropouts and Backward Input
Learn to enhance LSTM models with unrestricted sequences, dropout variations, and reverse processing for improved time series predictions.
Unrestricted LSTM network
There are two choices for the final LSTM layer in an LSTM network: return the full sequence of hidden outputs, or return only the last output. So far, we've been working with the default choice (returning only the last output) used in the baseline restricted model. In this lesson, we'll work on how to improve that model.
The first choice enables the final LSTM layer to emit a hidden output at every timestep rather than a single vector. Because the last LSTM layer is not restricted to emitting only the final hidden output, this network is called an "unrestricted" LSTM network. A potential benefit of this design is that more intermediate features are passed on to the subsequent layers. Based on this hypothesis, an unrestricted network is constructed in the code below.
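The idea can be sketched as follows with Keras. This is a minimal illustration, not the lesson's exact model: the layer sizes, timestep count, and feature count here are placeholder assumptions. The key point is that `return_sequences=True` is set on every LSTM layer, including the last, so the final LSTM layer emits its hidden output at each timestep; a `Flatten` layer then collects those per-timestep features before the output layer.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Placeholder input dimensions: 64 timesteps, 8 features per step.
TIMESTEPS, FEATURES = 64, 8

model = models.Sequential([
    layers.Input(shape=(TIMESTEPS, FEATURES)),
    # return_sequences=True on the final LSTM layer as well:
    # it emits a hidden output at every timestep ("unrestricted"),
    # not just the last one.
    layers.LSTM(16, return_sequences=True),
    layers.LSTM(8, return_sequences=True),
    # Flatten the (timesteps x units) hidden outputs so the dense
    # output layer can use all the intermediate features.
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# A quick shape check on random data.
x = np.random.rand(4, TIMESTEPS, FEATURES).astype("float32")
print(model.predict(x, verbose=0).shape)
```

Had the final LSTM used `return_sequences=False` (the restricted default), its output would be a single vector per sample and the `Flatten` layer would be unnecessary.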