Text Generation
Learn to perform text generation using Hugging Face.
OpenAI introduced the Generative Pre-trained Transformer (GPT) models in 2018. These models are pretrained on unlabeled text in an unsupervised fashion, which lets us leverage the vast amount of text on the internet without spending resources on manual annotation. GPT was succeeded by GPT-2 in 2019, which scaled up to 1.5 billion parameters. GPT-3, the latest model in the GPT family, has 175 billion parameters and enables us to build impressive applications.
While the GPT family of models can handle a wide range of tasks, its ability to generate long passages of text from a brief prompt is unparalleled.
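As a minimal sketch of this capability, the example below uses Hugging Face's `transformers` text-generation pipeline with the publicly available `gpt2` checkpoint. The prompt, seed, and generation parameters are illustrative choices, not values prescribed by this lesson.

```python
from transformers import pipeline, set_seed

# Load a text-generation pipeline backed by the public GPT-2 checkpoint.
generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the sampled continuations reproducible

# Generate continuations of a brief prompt (prompt text is illustrative).
outputs = generator(
    "Hugging Face makes text generation",
    max_length=50,           # total length: prompt plus generated tokens
    num_return_sequences=2,  # sample two alternative continuations
)

for out in outputs:
    print(out["generated_text"])
```

Each element of `outputs` is a dictionary whose `generated_text` field contains the prompt followed by the model's continuation; increasing `max_length` yields longer passages at the cost of more compute.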