Backpropagation: Splitting the Error
Discover how errors propagate back in the network.
Backpropagating the output error
Neural networks learn by refining their link weights. To backpropagate the error to internal nodes, we split each output-layer error in proportion to the sizes of the connecting link weights, and then recombine those fractions at each internal node. The following diagram shows a simple neural network with three layers: an input layer, a hidden layer, and a final output layer.
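The splitting-and-recombining step can be sketched numerically. In this minimal example (the weights and errors are made-up illustrative values, not from the diagram), each output error is divided among the hidden nodes in proportion to the weights of the links feeding that output, and each hidden node then sums up the fractions it receives:

```python
import numpy as np

# Hypothetical weights from 2 hidden nodes (rows) to 2 output nodes (columns).
w_hidden_output = np.array([[2.0, 1.0],   # links leaving hidden node 1
                            [3.0, 4.0]])  # links leaving hidden node 2

# Errors observed at the two output nodes (illustrative values).
e_output = np.array([0.8, 0.5])

# Split each output error in proportion to the link weights feeding it:
# each column is normalized so its fractions sum to 1.
fractions = w_hidden_output / w_hidden_output.sum(axis=0)

# Recombine: each hidden node sums the error fractions arriving on its links.
e_hidden = fractions @ e_output

print(e_hidden)  # error attributed to each hidden node
```

Here output node 1 has incoming weights 2 and 3, so hidden node 1 receives 2/5 of its error and hidden node 2 receives 3/5, and similarly for output node 2; the recombined hidden errors are the sums of these shares.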