Transformers and Transfer Learning
Explore the transformer architecture introduced in 2017 and how it revolutionizes NLP by using self-attention mechanisms. Understand the differences between transformers and LSTM, and discover how transfer learning with pre-trained models like BERT enhances NLP applications in spaCy.
A milestone in NLP happened in 2017 with the release of the research paper Attention Is All You Need, which introduced the transformer: a self-attention-based architecture for sequence transduction tasks.
Transduction in this context means transforming input words into output words by first encoding the input words and sentences as vectors. Typically, a transformer is trained on a huge corpus, such as Wikipedia or news text. Then, in our own NLP applications, we can reuse this pre-trained model through transfer learning rather than training from scratch.
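To make this concrete, here is a minimal sketch of reusing a pre-trained transformer in spaCy. It assumes the en_core_web_trf package, spaCy's standard English transformer pipeline, has already been downloaded; the example text is illustrative.

```python
import spacy

# A minimal sketch: load spaCy's pre-trained English transformer
# pipeline (assumes `pip install spacy[transformers]` and
# `python -m spacy download en_core_web_trf` have been run).
nlp = spacy.load("en_core_web_trf")

doc = nlp("The transformer architecture was introduced in 2017.")

# The pre-trained weights provide contextual tagging and parsing
# out of the box, with no task-specific training on our part.
for token in doc:
    print(token.text, token.pos_, token.dep_)
```

This is the essence of transfer learning: the expensive training on a huge corpus happens once, upstream, and our application simply loads the resulting weights.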