Backpropagation: Splitting the Error

Discover how errors propagate back in the network.

Backpropagating the output error

Neural networks learn by refining their link weights, and that refinement is guided by error. The output nodes have an obvious error, but the internal nodes have no directly observable error of their own. So, to backpropagate the error to internal nodes, we split each output-layer error in proportion to the sizes of the contributing link weights, and then recombine these fragments at each internal node. The following diagram shows a simple neural network with three layers: an input layer, a hidden layer, and the final output layer.

Figure: Error in the output layer
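To make the split concrete, here is a minimal sketch with made-up numbers: a single output node with error 0.8, fed by two links whose weights are 2.0 and 1.0. The link with the larger weight carried more of the signal forward, so it receives the larger share of the error going back.

```python
# Minimal sketch with illustrative (made-up) values: one output node's
# error is split between its two incoming links in proportion to the
# link weights.
e_output = 0.8
w1, w2 = 2.0, 1.0

e_back_1 = e_output * w1 / (w1 + w2)   # 2/3 of the error -> 0.533...
e_back_2 = e_output * w2 / (w1 + w2)   # 1/3 of the error -> 0.266...
print(e_back_1, e_back_2)
```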

Working back from the final output layer at the right-hand side, we can see that we use the errors in that output layer to guide the refinement of the link weights feeding into the final layer. We’ve labeled the output errors more generically as $e_\text{output}$ ...
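When a hidden node feeds more than one output node, it receives a fragment of error back along each of its outgoing links, and those fragments are summed to form that node's combined error. The sketch below shows one way to express this split-and-recombine step with matrices; the layer sizes, weights, and errors are illustrative assumptions, not values from the lesson.

```python
import numpy as np

# Assumed example: 3 hidden nodes feeding 2 output nodes.
# Rows of W_hidden_output correspond to output nodes, columns to hidden nodes.
W_hidden_output = np.array([[2.0, 1.0, 0.5],
                            [1.0, 3.0, 1.5]])
e_output = np.array([0.8, 0.5])

# Split each output node's error among its incoming links in proportion
# to the link weights (each row of fractions sums to 1) ...
fractions = W_hidden_output / W_hidden_output.sum(axis=1, keepdims=True)

# ... then recombine: each hidden node sums the fragments arriving
# along its outgoing links.
e_hidden = fractions.T @ e_output
print(e_hidden)  # one combined error per hidden node
```

Because each output error is divided into fractions that add up to one, the fragments arriving at the hidden nodes sum exactly to the total output error, so no error is lost or invented in the split.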
