Summary: Methods of Text Generation
Get a quick recap of the major learning points in this chapter.
Congratulations on completing a dense chapter covering a large number of concepts related to handling textual data for the task of text generation. We started by developing an understanding of text representation models, covering the most widely used ones, from Bag of Words to word2vec and FastText.
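To make the recap concrete, here is a minimal sketch of the two ends of that spectrum, assuming scikit-learn and Gensim are installed; the toy corpus and variable names are hypothetical, not taken from the chapter's code:

```python
from sklearn.feature_extraction.text import CountVectorizer
from gensim.models import Word2Vec

# Hypothetical two-document corpus for illustration only.
corpus = ["the cat sat on the mat", "the dog sat on the log"]

# Bag of Words: each document becomes a sparse vector of raw term counts.
bow = CountVectorizer()
X = bow.fit_transform(corpus)
print(bow.get_feature_names_out(), X.toarray())

# word2vec: each token gets a dense vector learned from its contexts.
tokens = [doc.split() for doc in corpus]
w2v = Word2Vec(sentences=tokens, vector_size=50, window=2, min_count=1, seed=42)
print(w2v.wv["cat"][:5])           # first few dimensions of the embedding
print(w2v.wv.most_similar("cat"))  # nearest neighbours in embedding space
```

The key difference to remember: Bag of Words produces one count vector per document with no notion of word similarity, while word2vec (and FastText, which extends it with subword information) learns dense per-word vectors in which related words end up close together.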
The next section of the chapter focused on RNN-based text generation models. We briefly discussed what constitutes a language model and how to prepare a dataset for the task. We then trained a character-based language model to generate synthetic text samples, and we compared different decoding strategies by examining the outputs they produce from the same RNN-based language model. We also delved into a few variants, such as stacked and bidirectional LSTM-based language models. Finally, we discussed the use of convolutional networks in the NLP space.
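As a reminder of how those decoding strategies differ, here is a minimal, framework-agnostic sketch of greedy decoding versus temperature-scaled sampling over a model's next-character distribution; the logits below are hypothetical stand-ins for one step of your trained model's output:

```python
import numpy as np

def decode_next(logits, strategy="greedy", temperature=1.0):
    """Pick the next character index from raw model scores (logits)."""
    logits = np.asarray(logits, dtype=np.float64)
    if strategy == "greedy":
        # Greedy decoding: always take the single most likely character.
        return int(np.argmax(logits))
    # Sampling: temperature < 1 sharpens the distribution (safer, repetitive);
    # temperature > 1 flattens it (more diverse, but riskier output).
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())  # softmax with overflow protection
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))

fake_logits = [2.0, 1.0, 0.5, 0.1]  # stand-in for one step of model output
print(decode_next(fake_logits))                             # greedy
print(decode_next(fake_logits, "sample", temperature=0.7))  # sampling
```

Greedy decoding tends to lock onto repetitive patterns, which is why sampled variants usually produce more interesting synthetic text from the same model.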