๐Ÿ€ Challenge: Backpropagation - 3 Layered Neural Network

As a challenge, code the backpropagation pass for a three-layer neural network.

Problem statement

Code the backpropagation operation for a three-layer neural network to compute the gradients of the cross-entropy loss function with respect to the network's weights and biases.

The cross-entropy loss function is as follows:

\mathcal{L}(y, s) = -\sum_{i=1}^{c} y_i \cdot \log(s_i)

where $y$ is the target output, $s$ is the predicted output, and $c$ is the number of classes.
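
A fact worth keeping in mind for the derivation: if the predicted output $s$ comes from a softmax layer, the gradient of this loss with respect to the output layer's pre-activation $z$ collapses to a simple difference:

\frac{\partial \mathcal{L}}{\partial z_i} = s_i - y_i

This identity is the natural starting point of the backward pass; every remaining gradient then follows from the chain rule.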

Sample input

  • The target output: y
  • The predicted output: out_y
  • The output at hidden layer 1: out_h1
  • The output at hidden layer 2: out_h2
  • The weights on the connections between hidden layer 2 and the output layer: w3
  • The weights on the connections between hidden layer 1 and the hidden layer 2: w2
  • The input values: x
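
As a concrete illustration of these inputs, here is how they might be laid out as NumPy column vectors and weight matrices. The dimensions below are hypothetical; the actual sizes in the exercise may differ.

```python
import numpy as np

# Hypothetical dimensions: 4 inputs, hidden layers of 5 and 3 units,
# and 2 output classes. The exercise's actual sizes may differ.
x      = np.random.rand(4, 1)      # input column vector
out_h1 = np.random.rand(5, 1)      # activations of hidden layer 1
out_h2 = np.random.rand(3, 1)      # activations of hidden layer 2
out_y  = np.random.rand(2, 1)      # predicted output
out_y /= out_y.sum()               # normalize so it sums to 1 (softmax-like)
y      = np.array([[1.0], [0.0]])  # one-hot target output
w2     = np.random.rand(3, 5)      # weights: hidden layer 1 -> hidden layer 2
w3     = np.random.rand(2, 3)      # weights: hidden layer 2 -> output layer
```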

Sample output

The gradients of the loss with respect to the weights and biases at the respective layers. For example:

  • The gradient of loss w.r.t weights at layer 3: dW3
  • The gradient of loss w.r.t bias at layer 3: db3
  • The gradient of loss w.r.t weights at layer 2: dW2
  • The gradient of loss w.r.t bias at layer 2: db2
  • The gradient of loss w.r.t weights at layer 1: dW1
  • The gradient of loss w.r.t bias at layer 1: db1
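
For reference, here is a minimal sketch of what such a backward pass might look like, assuming sigmoid activations in the hidden layers, a softmax output paired with the cross-entropy loss above, and the single-sample column-vector shapes sketched earlier. The function name and parameter order here are hypothetical; the exercise supplies its own signature, which you should not change.

```python
import numpy as np

def backprop_sketch(x, y, out_y, out_h1, out_h2, w2, w3):
    """Hypothetical backward pass for a three-layer network (sigmoid
    hidden layers, softmax output, cross-entropy loss, single sample)."""
    # Output layer: softmax + cross-entropy gives dL/dz3 = out_y - y.
    dz3 = out_y - y
    dW3 = dz3 @ out_h2.T                        # gradient w.r.t. w3
    db3 = dz3                                   # gradient w.r.t. bias 3

    # Hidden layer 2: push dz3 back through w3, then through the
    # sigmoid, whose derivative is out_h2 * (1 - out_h2).
    dz2 = (w3.T @ dz3) * out_h2 * (1 - out_h2)
    dW2 = dz2 @ out_h1.T                        # gradient w.r.t. w2
    db2 = dz2                                   # gradient w.r.t. bias 2

    # Hidden layer 1: same pattern, pushing dz2 back through w2.
    dz1 = (w2.T @ dz2) * out_h1 * (1 - out_h1)
    dW1 = dz1 @ x.T                             # gradient w.r.t. w1
    db1 = dz1                                   # gradient w.r.t. bias 1

    return dW3, db3, dW2, db2, dW1, db1
```

A quick sanity check on any implementation: each gradient must have the same shape as the parameter it updates, so dW3 matches w3, dW2 matches w2, and dW1 matches the input-to-hidden weight matrix.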

Coding exercise

Write your code below. It is recommended to solve the exercise before viewing the solution.

๐Ÿ“ There is a backpropagation function given in the code for testing purposes. Do not modify the function signature.

Good luck! 🤞
