Backpropagation Part 2

In this lesson, we’ll continue our discussion of backpropagation.

Backpropagation worked example

In this lesson we will consider the neural network below and update its weights and biases for one round of backpropagation. We use the sigmoid activation function, and the learning rate $l$ is 0.9.
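
For reference, the sigmoid activation applied at each unit is the standard logistic function

$$\sigma(x) = \frac{1}{1 + e^{-x}},$$

so a unit with net input $I_j$ produces the output $O_j = \sigma(I_j)$.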

We have the following initial values for the inputs, weights, and biases of the above neural network.

$x_1, x_2, x_3$ are our inputs, with values $1$, $0$, and $1$ respectively, and the corresponding class label is $1$, i.e. $t_j = 1$. $\theta_4, \theta_5, \theta_6$ are the bias values of units $4$, $5$, and $6$ respectively.

Feedforward Process

The following table sums up the feedforward process. The first column gives the unit number $j$ from the above neural network, the second column the net input $I_j = \sum_i w_{ij} O_i + \theta_j$ (the weighted sum of the unit's inputs plus its bias) computed at that unit, and the third column the output $O_j$ obtained by applying the sigmoid activation to that net input.

Unit, $j$ | Net input, $I_j$ | Output, $O_j$ (applying sigmoid activation)
4 | $0.2 + 0 - 0.5 - 0.4 = -0.7$ | $0.332$
5 | $-0.3 + 0 + 0.2 + 0.2 = 0.1$ | $0.525$
6 | $(-0.3)(0.332) - (0.2)(0.525) + 0.1 = -0.105$ | $0.474$
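
As a quick sanity check on the table, here is a minimal Python sketch of the feedforward pass. The weights and biases are read off the net-input column (since $x_1 = x_3 = 1$ and $x_2 = 0$, terms such as $0.2$ and $-0.5$ correspond to $w_{14}$ and $w_{34}$); $w_{24}$ and $w_{25}$ are hypothetical placeholders, as $x_2 = 0$ makes their contribution zero.

```python
import math

def sigmoid(x):
    """Logistic activation: 1 / (1 + e^{-x})."""
    return 1.0 / (1.0 + math.exp(-x))

# Inputs and target class label from the example.
x1, x2, x3 = 1.0, 0.0, 1.0
target = 1.0

# Weights and biases read off the net-input column of the table.
# w24 and w25 are placeholders: x2 = 0, so they do not change the outputs.
w14, w24, w34 = 0.2, 0.0, -0.5
w15, w25, w35 = -0.3, 0.0, 0.2
w46, w56 = -0.3, -0.2
theta4, theta5, theta6 = -0.4, 0.2, 0.1

# Feedforward: I_j = sum_i w_ij * O_i + theta_j, and O_j = sigmoid(I_j).
I4 = w14 * x1 + w24 * x2 + w34 * x3 + theta4   # -0.7
O4 = sigmoid(I4)                               # ~0.332
I5 = w15 * x1 + w25 * x2 + w35 * x3 + theta5   #  0.1
O5 = sigmoid(I5)                               # ~0.525
I6 = w46 * O4 + w56 * O5 + theta6              # ~-0.105
O6 = sigmoid(I6)                               # ~0.474

print(f"I4 = {I4:.3f}, O4 = {O4:.3f}")
print(f"I5 = {I5:.3f}, O5 = {O5:.3f}")
print(f"I6 = {I6:.3f}, O6 = {O6:.3f}")
```

Running this reproduces the values in the table: $O_4 \approx 0.332$, $O_5 \approx 0.525$, and $O_6 \approx 0.474$.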

Formulas for weight updates

From the last lesson we have the following formulas, which we will use to update the weights and biases.

  • $w_{ij}^{new} = w_{ij}^{old} + \Delta w_{ij}$ ...