Forward and Backward Propagation in RNNs
In this lesson, we will discuss how forward propagation and backward weight updates take place in an RNN.
Forward propagation
This is the process of calculating and storing the intermediate variables (including the outputs) of a neural network, in order from the input layer to the output layer. We won't discuss this in great detail, as it is assumed you already have some background knowledge of it.
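As a concrete illustration, the following is a minimal sketch of this forward pass for a vanilla RNN in NumPy. All names here (`Wxh`, `Whh`, `Why`, the dimensions, the `tanh` activation) are illustrative assumptions, not definitions taken from this lesson; the key point is that each step's hidden state and output are computed in order and cached for later use in backpropagation.

```python
import numpy as np

def rnn_forward(xs, h0, Wxh, Whh, Why, bh, by):
    """Forward pass of a vanilla RNN.

    Computes and stores the intermediate hidden states (hs) and
    outputs (ys) at every time step, in order from input to output.
    Parameter names and tanh activation are illustrative choices.
    """
    h = h0
    hs, ys = [], []
    for x in xs:
        # hidden state depends on the current input and the previous state
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        # output is a linear readout of the hidden state
        y = Why @ h + by
        hs.append(h)  # cached for the backward pass
        ys.append(y)
    return hs, ys

# Tiny example: 3 time steps, input dim 4, hidden dim 5, output dim 2.
rng = np.random.default_rng(0)
xs = [rng.standard_normal(4) for _ in range(3)]
hs, ys = rnn_forward(
    xs,
    h0=np.zeros(5),
    Wxh=rng.standard_normal((5, 4)) * 0.1,
    Whh=rng.standard_normal((5, 5)) * 0.1,
    Why=rng.standard_normal((2, 5)) * 0.1,
    bh=np.zeros(5),
    by=np.zeros(2),
)
```

Note that the same weight matrices are reused at every time step; only the hidden state changes, which is why the cached `hs` list is needed when gradients are later propagated backward through time.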
A simple architecture of an RNN is shown below: