Backpropagation is the standard technique used to train neural networks such as the multilayer perceptron.
It refers to the process of propagating the error, not the outputs themselves, backward through the network: after a forward pass produces an output, the difference between the actual and expected output is used, via the chain rule, to compute how much each weight contributed to that error. The weights of each edge are then iteratively readjusted (typically by gradient descent) so that the error is minimized.
In this way, each iteration's weight updates shape the next forward pass, and the process repeats until the network's outputs are close enough to the expected ones. The weights at the end of this process are the ones with which the network performs correctly.
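The loop described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation: the network size (two inputs, four hidden sigmoid units, one output), the learning rate, the epoch count, and the XOR training set are all assumptions chosen to keep the example small.

```python
import numpy as np

# Toy training data: the XOR function (an illustrative choice).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(0, 1, (2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0                         # learning rate (assumed)
loss_initial = None

for epoch in range(5000):
    # Forward pass: compute the network's actual output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Error between actual and expected output.
    err = out - y
    loss = float(np.mean(err ** 2))
    if loss_initial is None:
        loss_initial = loss

    # Backward pass: the chain rule sends the error back
    # through each layer to get per-weight gradients.
    d_out = err * out * (1 - out)           # gradient at output layer
    d_h = (d_out @ W2.T) * h * (1 - h)      # gradient at hidden layer

    # Gradient-descent step: readjust the weights of each edge.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

loss_final = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))
print(f"initial error: {loss_initial:.4f}, final error: {loss_final:.4f}")
```

Each pass through the loop is one iteration of the process described above: the error from the current output determines the weight updates, which in turn change the next output, driving the error down over time.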