Overview
Learn about networks with feedback and bi-directional connections.
We have discussed several directed models with a simple, consecutive, directed flow of information, including feed-forward neural networks and the directed acyclic graphs of Bayesian networks. A more general class of models allows cyclic dependencies, where components that receive information can in turn influence the components that sent it. There are two principal architectures we will discuss in this chapter:
- Network with feedback connections
- Network with bi-directional connections
Network with feedback connections
The first principal architecture of cyclic graphs is shown on the left in the figure below. These models are directed graphs similar to the Bayesian networks discussed in Chapter 8, except that we now allow loops in the directed graph. We will discuss such models in the context of recurrent neural networks, where the network nodes are model neurons similar to those used in feed-forward neural networks, but with feedback connections.
We consider such recurrent neural networks in their common setting with deterministic neurons, although it is possible to generalize these architectures to the stochastic case. Even with deterministic neurons, however, such architectures can change their neuron activations in an ongoing way, even under constant input. Formally, such networks are dynamical systems and therefore represent some form of temporal modeling. Thus, temporal modeling will be at the center of our discussions.
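To make this concrete, here is a minimal sketch of a network with feedback connections, assuming a single fully connected recurrent layer of tanh neurons with randomly chosen weights (the sizes and weight scales are illustrative, not from the text). It shows how the feedback term keeps changing the activations from one time step to the next even though the input never changes:

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 4
W_rec = rng.normal(size=(n_neurons, n_neurons))  # feedback (recurrent) weights
W_in = rng.normal(size=(n_neurons, 2))           # input weights
x = np.array([1.0, -0.5])                        # constant input
h = np.zeros(n_neurons)                          # initial activations

# Even though x is held constant, the feedback term W_rec @ h
# makes the activations evolve over time: a dynamical system.
for t in range(5):
    h = np.tanh(W_rec @ h + W_in @ x)
    print(f"t={t}: {np.round(h, 3)}")
```

Running this prints a different activation vector at each time step, which is exactly the ongoing dynamics described above.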
Note: The network in the following figure has two input neurons and two output neurons. Such neurons are commonly called visible neurons. In contrast, the neurons that are not connected to the outside world are called hidden neurons.
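As a hypothetical illustration of this terminology (the figure itself is not reproduced here), the sketch below wires two visible input neurons through a small recurrently connected hidden layer to two visible output neurons; only the hidden neurons have no direct contact with the outside world:

```python
import numpy as np

rng = np.random.default_rng(1)

n_hidden = 4
W_in = rng.normal(size=(n_hidden, 2))          # 2 visible input neurons -> hidden
W_rec = rng.normal(size=(n_hidden, n_hidden))  # feedback among hidden neurons
W_out = rng.normal(size=(2, n_hidden))         # hidden -> 2 visible output neurons

x = np.array([1.0, -0.5])  # activations of the visible input neurons
h = np.zeros(n_hidden)     # hidden neurons: not connected to the outside world

for t in range(5):
    h = np.tanh(W_rec @ h + W_in @ x)
    y = W_out @ h          # activations of the visible output neurons
    print(f"t={t}: output = {np.round(y, 3)}")
```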