How BERT works
Learn how BERT is bidirectional with detailed examples.
Bidirectional Encoder Representations from Transformers (BERT), as the name suggests, is based on the transformer model. We can think of BERT as a transformer that uses only the encoder stack.
In the transformer, we feed a sentence as input to the encoder, and it returns a representation for each word in the sentence as output. That is exactly what BERT gives us: encoder representations from a transformer. Okay, so what about the term bidirectional?
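To make this concrete, here is a minimal sketch of pulling per-word encoder representations out of a pretrained BERT model. It uses the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint, both of which are assumptions on our part; the lesson itself does not prescribe a specific toolkit.

```python
# Minimal sketch (assumes the Hugging Face `transformers` and `torch`
# packages are installed; `bert-base-uncased` is one common checkpoint).
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentence = "I love Paris"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One representation vector per token, including the special
# [CLS] and [SEP] tokens the tokenizer adds around the sentence.
hidden_states = outputs.last_hidden_state
print(hidden_states.shape)  # torch.Size([1, 5, 768])
```

Each of the five tokens ([CLS], i, love, paris, [SEP]) gets its own 768-dimensional vector; these vectors are the "encoder representations" the model's name refers to.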
How BERT is bidirectional
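As a quick illustration of what bidirectional means in practice: when BERT predicts a masked word, the prediction draws on context from both the left and the right of the mask, not just the words that came before it. The sketch below uses the Hugging Face fill-mask pipeline with `bert-base-uncased`, again an assumption rather than anything prescribed by this lesson.

```python
# Sketch, assuming the Hugging Face `transformers` package.
from transformers import pipeline

# The fill-mask pipeline uses BERT's masked-language-modeling head:
# the model predicts the [MASK] token from context on *both* sides.
fill = pipeline("fill-mask", model="bert-base-uncased")

for pred in fill("Paris is a beautiful [MASK] in France."):
    print(pred["token_str"], round(pred["score"], 3))
```

A word like "city" scores highest here because the right-hand context ("in France") informs the prediction just as much as the left-hand context; a strictly left-to-right model would never see it.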