...


Unraveling the Basics of Neural Networks

Explore the progression from perceptrons to MLPs, their architecture, and deep learning fundamentals.

The perceptron was introduced in the late 1950s and proved to be a powerful linear classifier for its time. A few decades later, researchers realized that stacking multiple perceptrons could be even more powerful. That turned out to be true, and the multi-layer perceptron (MLP) was born.

A single perceptron works like a neuron in the human brain. It takes multiple inputs, computes a weighted sum of them, and, just as a neuron fires an electrical pulse, it emits a binary output that is treated as its response: it fires only if the weighted sum crosses a threshold.
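To make that concrete, here is a minimal sketch of a single perceptron in Python: a weighted sum of the inputs plus a bias, passed through a step function to produce the binary output. The specific weights, bias, and AND-gate example are illustrative choices, not taken from the article.

```python
# A minimal perceptron sketch (illustrative values, not from the article).

def perceptron(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through a step function."""
    total = sum(x * w for x, w in zip(inputs, weights))
    # Binary "pulse": fire (1) if the weighted sum crosses the threshold, else 0.
    return 1 if total + bias > 0 else 0

# Example: with these weights and bias, the perceptron behaves like a logical AND gate.
weights = [1.0, 1.0]
bias = -1.5
for inputs in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(inputs, "->", perceptron(inputs, weights, bias))
```

The step function is what makes the output binary; the weights and bias determine where the decision boundary sits, which is why a single perceptron can only separate classes with a straight line (or hyperplane).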

The “neuron-like” behavior of a perceptron, together with the fact that an MLP is a “network” of perceptrons, is perhaps what gave rise to the term neural network in the early days.

Since their creation, neural networks have ...