Perceptrons: A Brain in a Function

Learn about the inspiration behind what is now the field of deep learning and generative AI.

The perceptron, the simplest neural network architecture, was inspired by biological research into the basis of mental processing, in an attempt to represent the function of the brain with mathematical formulae.
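The perceptron can be written in a few lines of code: a weighted sum of inputs passed through a step function. The following is a minimal sketch, with weights chosen by hand to implement logical AND rather than learned from data:

```python
# Minimal perceptron (threshold logic unit): a weighted sum of the
# inputs plus a bias, followed by a step activation.
def perceptron(inputs, weights, bias):
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if weighted_sum > 0 else 0

# Hand-picked weights that make the unit fire only when
# both binary inputs are 1 (logical AND).
and_weights, and_bias = [1.0, 1.0], -1.5

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron(x, and_weights, and_bias))
```

Only the input `(1, 1)` pushes the weighted sum above zero, so the unit outputs 1 for that case and 0 otherwise, mimicking a neuron that fires only when its total stimulation crosses a threshold.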

From tissues to TLUs

The recent popularity of AI algorithms might give the false impression that this field is new. Many recent models are based on discoveries made decades ago that have been reinvigorated by the massive computational resources available in the cloud and by customized hardware for parallel matrix computations, such as Graphics Processing Units (GPUs), Tensor Processing Units (TPUs), and Field-Programmable Gate Arrays (FPGAs).

If we consider research on neural networks to include their biological inspiration as well as computational theory, this field is over a hundred years old. Indeed, one of the first neural networks described appears in the detailed anatomical illustrations of the 19th-century scientist Santiago Ramón y Cajal (López-Muñoz F., Boya J., Alamo C. (2006). "Neuron theory, the cornerstone of neuroscience, on the centenary of the Nobel Prize award to Santiago Ramón y Cajal." Brain Research Bulletin. 70 (4–6): 391–405. https://pubmed.ncbi.nlm.nih.gov/17027775/). These illustrations, based on experimental observations of layers of interconnected neuronal cells, inspired the Neuron Doctrine: the idea that the brain is composed of individual, physically distinct, and specialized cells rather than a single continuous network (Ramón y Cajal, Santiago (1888). Estructura de los centros nerviosos de las aves.). The distinct layers of the retina observed by Cajal also inspired particular neural network architectures, such as the convolutional neural network (CNN), which we'll discuss later.
