What is GPT-3?

GPT-3 (Generative Pre-trained Transformer 3) is a transformer-based, pre-trained AI model for natural language processing tasks such as text generation and understanding. GPT-3 is a successor to GPT-2 (which in turn was a successor to GPT itself) and, at 175 billion parameters, is a landmark among huge pre-trained deep learning models. As a comparison, GPT-2 had only 1.5 billion parameters.

Introduced in 2020 by OpenAI, GPT-3 has left the AI community with split opinions. Many think of it as the most exciting model in AI ever, while a few are worried about its biases and its black-box nature. Either way, the AI community is almost unanimous about the significance of this model. As a matter of fact, the GPT-3 paper won a Best Paper Award at NeurIPS 2020.

Some applications

Some of the cool applications of GPT-3 include:

  • Codex: A model that is sending shivers down programmers’ spines by writing code from minimal plain-English instructions.
  • Text generation: OpenAI provides a beta interface where we give GPT-3 a snippet of text and it generates a continuation (see the sketch after this list).

Then there are some that aren’t made by OpenAI directly but use its API:

  • AI Writer: This app fine-tunes GPT-3 on the writing styles of famous personalities and emulates them replying to our emails.
  • AI Dungeon: This is a text-based adventure game. I remember the ELIZA chatbot in the early 2000s, and it was so much fun for us technology newbies. It never occurred to me that it would be topped in such a way within a couple of decades.
  • Podacity Ai: A Spotify-connected tool for finding high-quality educational podcasts.
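To make the text-generation workflow concrete, here is a minimal sketch of calling the completion endpoint from Python. It assumes the pre-1.0 `openai` package and an API key in the `OPENAI_API_KEY` environment variable; the engine name and parameter values are illustrative assumptions, not the exact settings any of the apps above use.

```python
# A minimal sketch of text completion with the (pre-1.0) openai package.
# Engine name and sampling parameters are illustrative assumptions.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    engine="davinci",   # one of the original GPT-3 engines
    prompt="GPT-3 is a transformer-based pre-trained model that",
    max_tokens=64,      # length budget for the generated continuation
    temperature=0.7,    # sampling randomness
)

# The completion comes back as plain text to append to the prompt.
print(response["choices"][0]["text"])
```

The same pattern (prompt in, continuation out) underlies every application in the list above.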

GPT-3 quick facts

Generation model

GPT-3 is a generation model: given a source text (a prompt), it generates a target text as a continuation. Tasks such as translating a source sentence into a target sentence are framed as just another kind of generation.

Language

In principle, GPT-3 can be applied to any language. In practice, it is already being used for English, Chinese, Japanese, Korean, Hindi, German, French, Russian, and Spanish.

Usage

GPT-3 is being used in a wide variety of applications, including translating text from a source language into a target language, generating text from a given source text, and extracting a target sentence from a given source text. As sketched below, all of these reduce to prompt completion.
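As an example of this framing, here is a hedged sketch of few-shot translation: a couple of English-to-French pairs prime the model, and it completes the pattern. The example pairs, engine name, and sampling settings are my own illustrative assumptions (same pre-1.0 `openai` package as above).

```python
# A sketch of translation framed as plain text completion: the model
# sees a few English -> French pairs and continues the pattern.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

few_shot_prompt = (
    "English: Where is the library?\n"
    "French: Où est la bibliothèque?\n"
    "English: I like to read books.\n"
    "French:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=few_shot_prompt,
    max_tokens=32,
    temperature=0.2,   # low temperature for a more literal translation
    stop=["\n"],       # stop at the end of the translated line
)

print(response["choices"][0]["text"].strip())
```

The `stop` sequence cuts generation off at the end of the translated line, which keeps the model from inventing further English–French pairs on its own.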

GPT-3 has been used to write some parts of this shot as well.

Playground

Just for fun, I entered the first two paragraphs of this shot into the GPT-3 playground (still in beta) and the text it generated (with the default settings) was quite interesting. Quoting the latter part:

GPT-3 is one of the most significant deep learning models in the recent AI history. It is a successor to GPT-2, which was a successor to GPT itself. GPT is a deep neural network model which was the first pre-trained model to be used in the industry.

GPT-2 was a state-of-the-art model in many ways and was used in the industry and academia for a wide range of applications, including translation and generation. But, GPT-2 had one limitation: it was trained on a very small dataset which was highly biased, and as a consequence, it was not able to scale to the full-scale AI applications. To overcome this limitation, GPT-3 was created, which was trained on a larger dataset with less bias.

GPT-3 is the third generation model which is a successor to GPT-2. GPT-3 is a neural network model which has been pre-trained on a much larger dataset with less bias. The reason to make a successor to GPT-2 is quite clear: GPT-2 was very successful in the industry and academia, but […]

And as you can see, it would apparently go on forever and seems to have no limits whatsoever. Frankly, I was startled by the text quality; at times it is really hard to distinguish between human writing and GPT-3’s output.
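For anyone who wants to repeat the experiment outside the playground, here is a sketch of the equivalent API call. The “default settings” here are an assumption on my part (roughly what the beta playground used), and `intro_paragraphs` is a hypothetical variable standing in for the first two paragraphs of this shot.

```python
# A sketch of reproducing the playground experiment through the API.
# The settings below are assumed playground defaults, not confirmed ones.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Hypothetical placeholder: the shot's first two paragraphs would go here.
intro_paragraphs = "GPT-3 (Generative Pre-trained Transformer 3) is ..."

response = openai.Completion.create(
    engine="davinci",
    prompt=intro_paragraphs,
    max_tokens=256,    # room for several generated paragraphs
    temperature=0.7,   # assumed playground default
    top_p=1,
)

# Prompt plus continuation, as the playground displays it.
print(intro_paragraphs + response["choices"][0]["text"])
```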

Writing like Tolstoy

Here comes the fun part: I tried to get into the mind (or drawing room) of the Rostov family and extend a dialogue from my favourite novel, War and Peace (Vol. II). The text below was generated by Tolstoy 2.0:


“You know, I sometimes feel very, very sad,” Natasha went on. “All of a sudden, when I’m in a crowd, or when I’m alone, I feel sad.”

“Yes, I know that feeling,” said Nikolai.

“And do you know what I dreamt last night?” Natasha went on.

“As soon as we came home, I lay down on my bed and fell asleep. I dreamt I was flying, flying, and I flew up and up, and suddenly I flew right into heaven, and there I saw Papa and Mamma, and you were there too, and we were all so happy. And we were there for a long, long time. And all of a sudden we heard the bells ringing for service, and we all went to church, and we all knelt down to pray, and I saw the wings of the cherubim and the shine of the icon-lamps and the candles, and we were still praying, and I don’t know how, but we all turned into angels. And we were flying and flying and flying, and we were happy. And we were there a long time. And then we heard the bells ringing for vespers, and we all flew away. And I woke up and opened my eyes, and it was dark, and I was alone. I was so sorry. I wanted to cry, but I couldn’t.”

“That’s a wonderful dream,” said Nikolai.

“Yes, but I was sorry I woke up.”

“I’m very, very glad I dreamt it,” said Natasha. “I was very glad.”

“You were glad?”

“Yes, I was glad.”


I am unsure what Tolstoy’s feelings about it would be, but from a fan’s point of view, I am both happy and concerned: I get to learn more about Natasha’s dreams, but I am also afraid of “someone” else (an AI) reading them.
