...


Transformers and Transfer Learning

Let's discuss transformers and their impact on machine learning.


A milestone in NLP happened in 2017 with the release of the research paper "Attention Is All You Need" (Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I., 2017, arXiv:1706.03762), which introduced a brand-new machine learning idea and architecture: transformers. Transformers are a fresh approach in NLP that aims to solve sequential modeling tasks and addresses some problems introduced by the long short-term memory (LSTM) architecture (recall the LSTM architecture). Here's how the paper explains how transformers work: "The Transformer is the first transduction model relying entirely on self-attention to compute representations of its input and output without using sequence-aligned RNNs or convolution."
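To make the quoted idea concrete, below is a minimal sketch of scaled dot-product self-attention, the core operation the paper builds the transformer on. The matrix names (`X`, `Wq`, `Wk`, `Wv`) and the tiny dimensions are illustrative choices for this sketch, not values from the paper:

```python
# Minimal sketch of scaled dot-product self-attention on a toy
# sequence of 4 tokens with 8-dimensional embeddings (illustrative sizes).
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Let every token attend to every other token in the sequence X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project tokens to queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # similarity of each token to all tokens
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V                          # each output is a weighted mix of all tokens

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                     # 4 tokens, embedding dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)      # (4, 8): one contextual vector per token
```

Note how no recurrence is involved: every output row is computed directly from the whole sequence at once, which is exactly what lets transformers dispense with the sequence-aligned RNNs mentioned in the quote.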

Transduction in this context means transforming input sequences into output sequences by converting input words and sentences into vectors. Typically, a transformer is trained on a huge corpus, such as Wikipedia or news articles. Then, in our ...