Introduction: Methods of Text Generation

Get an overview of the topics that will be covered in this chapter.

In the past couple of years, natural language processing (NLP), the processing of textual data, has attracted significant interest and research. Text is not just another unstructured type of data; there is far more to it than meets the eye. Textual data represents our thoughts, ideas, knowledge, and communication.

In this chapter, we’ll focus on understanding NLP concepts and generative models for textual data. We'll cover the architectures and components associated with these models, focusing on the following topics:

  • A brief overview of traditional ways of representing textual data.

  • Distributed representation methods.

  • RNN-based text generation.

  • LSTM variants and convolutions for text.

We’ll cover the internal workings of different architectures and key contributions that have enabled text generation use cases. We’ll also build and train these architectures to get a better understanding of them. Before we get into the modeling aspects, let’s get started by understanding how to represent textual data.
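As a first taste of representing textual data, here is a minimal bag-of-words sketch, one of the traditional representations listed above. The toy corpus, whitespace tokenization, and function names are illustrative assumptions, not a prescribed implementation:

```python
from collections import Counter

# Toy corpus (illustrative assumption)
corpus = ["the cat sat on the mat", "the dog sat on the log"]

# Build a shared vocabulary across all documents
vocab = sorted({token for doc in corpus for token in doc.split()})

def bag_of_words(doc):
    """Represent a document as a vector of word counts over the vocabulary."""
    counts = Counter(doc.split())
    return [counts[token] for token in vocab]

vectors = [bag_of_words(doc) for doc in corpus]
print(vocab)    # ['cat', 'dog', 'log', 'mat', 'on', 'sat', 'the']
print(vectors)  # [[1, 0, 0, 1, 1, 1, 2], [0, 1, 1, 0, 1, 1, 2]]
```

Note that such count-based vectors discard word order entirely; the distributed and RNN-based approaches covered later in the chapter address exactly that limitation.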