Now let’s use T5 to summarize documents.

NLP summarization tasks condense a text into its succinct, essential parts. This section will start by presenting the Hugging Face resources we will use. Then we will initialize a T5-large transformer model. Finally, we will see how to use T5 to summarize any document, including legal and corporate documents.

Let’s begin by introducing Hugging Face’s framework.

Hugging Face

Hugging Face designed a framework that implements transformers at a higher level of abstraction.

To expand our knowledge, we explored other approaches, such as Trax and OpenAI’s models. In this section, we will return to Hugging Face’s framework and explain more about its online resources. We will end the section by tapping into the unique potential of a GPT-3 engine.

Hugging Face provides three primary resources within its framework: models, datasets, and metrics.
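As a quick orientation, the sketch below shows how each of the three resource types is typically loaded in code. It is a minimal illustration, not the course’s exact setup: the `cnn_dailymail` dataset and the ROUGE metric are assumed examples, and the `evaluate` library is used here as the current home of Hugging Face metrics.

```python
# Minimal sketch of the three Hugging Face resource types.
# The dataset and metric choices below are illustrative assumptions.
from transformers import pipeline   # models
from datasets import load_dataset   # datasets
import evaluate                     # metrics

# Model: a ready-made summarization pipeline backed by t5-large
summarizer = pipeline("summarization", model="t5-large")

# Dataset: a public summarization corpus hosted on the Hub
dataset = load_dataset("cnn_dailymail", "3.0.0", split="validation[:1%]")

# Metric: ROUGE, the standard metric for summarization quality
rouge = evaluate.load("rouge")
```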

Hugging Face transformer resources

Here, we will choose the T5 model, which we will implement in this section.

A wide range of models is available on the Hugging Face models page.
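Any model listed on that page can be loaded by name. As a preview of what we will build in this section, here is a minimal sketch that pulls the t5-large checkpoint and summarizes a short legal-style clause. The sample text and generation settings are illustrative assumptions, not the course’s final code.

```python
# Minimal sketch: load t5-large and summarize a short clause.
# The clause and generation parameters are illustrative, not prescribed.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model = T5ForConditionalGeneration.from_pretrained("t5-large")
tokenizer = T5Tokenizer.from_pretrained("t5-large")

text = (
    "The licensee shall not assign, transfer, or sublicense any rights "
    "granted under this agreement without the prior written consent of "
    "the licensor."
)

# T5 is a text-to-text model: the task is selected with a text prefix.
inputs = tokenizer("summarize: " + text, return_tensors="pt",
                   max_length=512, truncation=True)

summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,        # beam search for a more fluent summary
    min_length=10,
    max_length=60,
    early_stopping=True,
)

print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```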
