Multilayer Perceptrons and Backpropagation

Learn the basics of neural networks and the backpropagation algorithm.

While large-scale research funding for neural networks declined after the publication of Perceptrons and did not recover until the 1980s, researchers still recognized that these models had value, particularly when assembled into multilayer networks, each layer composed of several perceptron units. Indeed, when the mathematical form of the output function (that is, the output of the model) was relaxed to take on other forms, such as a linear function or a sigmoid, these networks could solve both regression and classification problems, with theoretical results showing that three-layer networks could effectively approximate any output (Crevier, Daniel (1993). AI: The Tumultuous Search for Artificial Intelligence. New York, NY: BasicBooks). However, none of this work addressed the practical difficulty of actually training such models: learning rules such as the perceptron learning algorithm described earlier did not extend to multilayer networks, which greatly limited their applied use.
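
To make the distinction between output functions concrete, here is a minimal sketch (not from the original text) contrasting a linear output unit with a sigmoid output unit on the same weighted sum; the inputs, weights, and bias are arbitrary illustrative values.

```python
import numpy as np

def linear_output(x, w, b):
    # Weighted sum with a linear output: an unbounded real value,
    # suitable for regression targets.
    return np.dot(w, x) + b

def sigmoid_output(x, w, b):
    # Squashing the same weighted sum through a sigmoid yields a value
    # in (0, 1), which can be read as a class probability for classification.
    z = np.dot(w, x) + b
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative values only (assumptions, not from the text).
x = np.array([0.5, -1.2, 3.0])   # input features
w = np.array([0.4, 0.1, -0.7])   # weights
b = 0.2                          # bias

print(linear_output(x, w, b))    # unbounded output -> regression
print(sigmoid_output(x, w, b))   # output in (0, 1) -> classification
```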
