Recurrent neural networks (RNNs)

There are several tasks that feedforward neural networks handle poorly: processing sequential data ordered in time, contextualizing multiple inputs rather than just the current one, and retaining information from previous inputs. For these reasons, the main draw of RNNs is their internal memory, which lets them remember past inputs and perform the kinds of context-dependent operations required by conversational AIs such as Apple's Siri.

RNNs handle sequential data well and place a premium on context, which makes them a strong fit for time-series data, DNA and genomics data, speech recognition, and speech-to-text. In contrast to the preceding CNN example, which computes a single feedforward pass, an RNN works in a loop: at each step, the hidden state from the previous step is fed back in alongside the new input.
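To make that loop concrete, here is a minimal sketch of a vanilla RNN cell in NumPy. The weight names (`W_xh`, `W_hh`), the dimensions, and the random toy sequence are illustrative assumptions, not code from this course:

```python
import numpy as np

# Illustrative dimensions (assumptions for this sketch).
input_size, hidden_size, seq_len = 4, 8, 5
rng = np.random.default_rng(0)

# Parameters of a single vanilla RNN cell:
# W_xh maps the current input, W_hh maps the previous hidden state.
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One recurrent step: mix the new input with the carried memory."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# A toy sequence of 5 input vectors (e.g., time-series samples).
xs = rng.normal(size=(seq_len, input_size))

# The loop that distinguishes an RNN from a feedforward pass:
# the hidden state h is fed back in at every time step.
h = np.zeros(hidden_size)
for x_t in xs:
    h = rnn_step(x_t, h)

print(h)  # the final hidden state summarizes the whole sequence
```

Note that the same weights are reused at every time step; only the hidden state `h` changes, and it is this state, carried forward through the loop, that serves as the network's memory of earlier inputs.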
