Forward and Backward Propagation in RNNs

Explore the processes of forward and backward propagation in recurrent neural networks, focusing on key concepts such as state calculations, weight matrices, and gradient descent. This lesson helps you gain a clear understanding of how RNNs function internally while applying these concepts to real-world sentiment analysis projects.

Forward propagation

Forward propagation is the process of calculating and storing a neural network's intermediate variables (including the outputs), moving in order from the input layer to the output layer. We will not discuss it in much detail here, as some background knowledge of it is assumed.
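As a quick refresher, here is a minimal NumPy sketch of forward propagation through a small two-layer feedforward network. The layer sizes, sigmoid activations, and weight names (`W1`, `b1`, `W2`, `b2`) are illustrative assumptions, not part of this lesson's architecture.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative shapes: 4 input features, 3 hidden units, 1 output unit.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 4)), np.zeros((3, 1))   # hidden-layer parameters
W2, b2 = rng.normal(size=(1, 3)), np.zeros((1, 1))   # output-layer parameters

x = rng.normal(size=(4, 1))          # a single input example

# Forward propagation: compute and store each intermediate variable in order
z1 = W1 @ x + b1                     # pre-activation of the hidden layer
a1 = sigmoid(z1)                     # hidden-layer activation
z2 = W2 @ a1 + b2                    # pre-activation of the output layer
y_hat = sigmoid(z2)                  # network output
```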

A simple architecture of an RNN is shown below:

Explanation:

  • a^{0} denotes the initial state of the network.
  • a^{M}, where M varies from 1 to N (the number of cells), denotes the state output by the previous cell. It also represents the information that has been read so far.
  • x^{M}, where M varies from 1 to N (the number of input words), denotes the M^{th} input at the M^{th} time step.
  • Y^{M}, where M varies from 1 to N (the number of cells), denotes the output of the M^{th} RNN cell. (These quantities come together in the sketch after this list.)
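
To make the notation concrete, below is a minimal NumPy sketch of the RNN forward pass described above. The tanh and softmax activations and the weight names Waa, Wax, and Wya (recurrent, input, and output weight matrices) are common conventions assumed for illustration; the exact matrices in this lesson's figures may differ.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

def rnn_forward(x_seq, a0, Waa, Wax, Wya, ba, by):
    """Forward propagation through the RNN, one cell (time step) at a time."""
    a_prev = a0
    states, outputs = [], []
    for x_t in x_seq:                          # x_t: input word vector at step t
        # The new state a_t mixes the previous state with the current input
        a_t = np.tanh(Waa @ a_prev + Wax @ x_t + ba)
        # y_t is the output of this RNN cell at this time step
        y_t = softmax(Wya @ a_t + by)
        states.append(a_t)
        outputs.append(y_t)
        a_prev = a_t                           # the state is passed to the next cell
    return states, outputs

# Illustrative dimensions: 5 hidden units, 3-dimensional inputs, 2 output classes
rng = np.random.default_rng(0)
n_a, n_x, n_y, T = 5, 3, 2, 4
Waa, Wax = rng.normal(size=(n_a, n_a)), rng.normal(size=(n_a, n_x))
Wya = rng.normal(size=(n_y, n_a))
ba, by = np.zeros((n_a, 1)), np.zeros((n_y, 1))

x_seq = [rng.normal(size=(n_x, 1)) for _ in range(T)]   # N = 4 input words
a0 = np.zeros((n_a, 1))                                  # initial state a^{0}
states, outputs = rnn_forward(x_seq, a0, Waa, Wax, Wya, ba, by)
```

Each iteration of the loop corresponds to one RNN cell: the state a^{t} carries forward everything the network has read so far, as described in the list above.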

To get a better understanding, let’s look at a single RNN cell as shown below.

Explanation:

  • a^{t}
...