Overview: spaCy and Transformers

Let's look at what we will be learning in this chapter.

In this chapter, we will learn about transformers, the latest hot topic in NLP, and how to use them with TensorFlow and spaCy.

First, we will learn about transformers and transfer learning. Second, we'll examine the architectural details of the most commonly used Transformer architecture: Bidirectional Encoder Representations from Transformers (BERT). We'll also see how the BERT tokenizer and the WordPiece algorithm work. Then, we'll learn how to quickly get started with the pre-trained transformer models of the HuggingFace library. Next, we'll practice fine-tuning HuggingFace Transformers with TensorFlow and Keras. Finally, we'll learn how spaCy v3.0 integrates transformer models as pre-trained pipelines.
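To give a taste of how quickly the HuggingFace library lets us get started, here is a minimal sketch. It assumes the `transformers` package with its TensorFlow backend is installed; the checkpoint name `bert-base-uncased` is one commonly used pre-trained BERT model, chosen here for illustration rather than prescribed by this chapter.

```python
from transformers import AutoTokenizer, TFAutoModel

# Download a pre-trained BERT checkpoint and its matching tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFAutoModel.from_pretrained("bert-base-uncased")

# The BERT tokenizer uses the WordPiece algorithm: rare words are
# split into subword pieces, with continuations marked by '##'.
print(tokenizer.tokenize("Transformers changed NLP."))

# Encode a sentence and run it through BERT to get contextual
# embeddings: one vector per token, 768 dimensions for BERT-base.
inputs = tokenizer("Transformers changed NLP.", return_tensors="tf")
outputs = model(inputs)
print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768)
```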
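Similarly, here is a minimal sketch of the spaCy v3.0 side, assuming the transformer-based English pipeline `en_core_web_trf` has been installed (for example with `python -m spacy download en_core_web_trf`):

```python
import spacy

# Load a pre-trained pipeline whose components are backed by a
# transformer model rather than static word vectors.
nlp = spacy.load("en_core_web_trf")

doc = nlp("spaCy v3.0 integrates transformer models from HuggingFace.")

# Downstream annotations such as named entities are driven by the
# shared transformer embeddings.
for ent in doc.ents:
    print(ent.text, ent.label_)
```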
