Varieties of Networks: Convolutional and Recursive

Learn about the history of improvements in neural networks and the alternative deep architectures that grew out of them.

Until now, we’ve primarily discussed the basics of neural networks by referencing feedforward networks, where every unit in one layer is connected to every unit in the next. While these feedforward networks are useful for illustrating how deep networks are trained, they are only one class of a broader set of architectures used in modern applications, including generative models.
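To make this concrete, here is a minimal sketch of such a fully connected feedforward network using the Keras API. The layer sizes and the 784-dimensional input (a flattened 28x28 image) are illustrative assumptions rather than details from the text.

```python
import tensorflow as tf

# A fully connected (feedforward) network: each Dense layer connects
# every unit in the previous layer to every unit in the current one.
# Layer sizes and input shape are illustrative assumptions.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),                       # e.g. a flattened 28x28 image
    tf.keras.layers.Dense(128, activation="relu"),      # 784 x 128 weight matrix
    tf.keras.layers.Dense(64, activation="relu"),       # 128 x 64 weight matrix
    tf.keras.layers.Dense(10, activation="softmax"),    # 64 x 10 weight matrix
])

model.summary()  # prints the dense (all-to-all) weight matrices per layer
```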

Networks for seeing: Convolutional architectures

One of the inspirations for deep neural network models is the biological nervous system. As researchers attempted to design computer vision systems that would mimic the functioning of the visual system, they turned to the architecture of the retina, as revealed by the physiological studies of the neurobiologists David Hubel and Torsten Wiesel in the 1960s (see http://charlesfrye.github.io/FoundationalNeuroscience/img/corticalLayers.gif). As previously described, the physiologist Santiago Ramón y Cajal provided visual evidence that neural structures such as the retina are arranged in vertical networks (Wolfe, Kluender, and Levy (2009). Sensation and Perception. Sunderland: Sinauer Associates Inc.).
