Introduction: Basics of Transformers

Get to know popular artificial intelligence platforms and get an overview of this chapter.

Transformers are industrialized, homogenized post-deep learning models designed for parallel computing on supercomputers. Through homogenization, a single transformer model can carry out a wide range of tasks with no fine-tuning. Transformers with billions of parameters can perform self-supervised learning on billions of records of raw, unlabeled data.
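
To make the idea of homogenization concrete, here is a minimal sketch that runs one pretrained transformer on two different tasks with no fine-tuning by the user. It assumes the Hugging Face `transformers` library and the publicly released `t5-small` checkpoint; the chapter itself does not prescribe any particular toolkit or model.

```python
# A minimal sketch of "homogenization": one pretrained transformer handling
# several tasks with no fine-tuning by the user. Assumes the Hugging Face
# `transformers` library (pip install transformers) and the public `t5-small`
# checkpoint -- both are illustrative choices, not requirements of the text.
from transformers import pipeline

# Load a single pretrained text-to-text transformer.
t5 = pipeline("text2text-generation", model="t5-small")

# Task 1: translation, expressed purely as a text prompt.
print(t5("translate English to French: The house is wonderful."))

# Task 2: summarization, with the same model and no additional training.
print(t5("summarize: Transformers learn from billions of records of raw, "
         "unlabeled data and can then be applied to many downstream tasks."))
```

The same model object answers both prompts; the task is selected by the text prefix rather than by task-specific architectures or fine-tuned weights.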

Fourth Industrial Revolution

These particular post-deep learning architectures are called foundation models. Foundation model transformers represent the epitome of the Fourth Industrial Revolution, which began in 2015 with machine-to-machine automation that connected everything to everything. Artificial intelligence in general, and Natural Language Processing (NLP) in particular, has gone far beyond the software practices of the past in the aftermath of Industry 4.0 (I4.0).
