...

Understanding Multilingual BERT

Learn about multilingual BERT, its configurations, and how to use the pre-trained M-BERT model.

BERT provides representations only for English text. Suppose we have an input text in a different language, say French. How can we use BERT to obtain a representation of this French text? This is where multilingual BERT (M-BERT) comes in.

M-BERT

Multilingual BERT is used to obtain representations of text in many different languages, not just English. We learned that the BERT model is trained on the masked language modeling (MLM) and next sentence prediction (NSP) tasks using English Wikipedia text and the Toronto BookCorpus. Similar to BERT, M-BERT is also trained with the MLM and NSP tasks, but instead of using Wikipedia text from only the English language, M-BERT is trained using Wikipedia text from 104 different languages. ...
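
As a minimal sketch of how the pre-trained M-BERT can be used, the following example loads the publicly available bert-base-multilingual-cased checkpoint with the Hugging Face transformers library and obtains a representation of a French sentence. The specific checkpoint name and the example sentence are illustrative assumptions, not something prescribed by this lesson.

# Obtaining a representation of a French sentence with pre-trained M-BERT
# (a sketch using the Hugging Face transformers library).
import torch
from transformers import BertTokenizer, BertModel

# Load the pre-trained multilingual BERT tokenizer and model;
# the "bert-base-multilingual-cased" checkpoint covers 104 languages.
tokenizer = BertTokenizer.from_pretrained('bert-base-multilingual-cased')
model = BertModel.from_pretrained('bert-base-multilingual-cased')

# A French input sentence (illustrative example)
sentence = "Paris est une belle ville"

# Tokenize the sentence and convert it to PyTorch tensors
inputs = tokenizer(sentence, return_tensors='pt')

# Feed the tokens to M-BERT and obtain the hidden-state representations
with torch.no_grad():
    outputs = model(**inputs)

# outputs.last_hidden_state holds one vector per token,
# with shape [batch_size, sequence_length, hidden_size]
print(outputs.last_hidden_state.shape)

Because M-BERT shares a single vocabulary and a single set of weights across all 104 languages, the same model and the same code can be reused for text in any of the supported languages by simply changing the input sentence.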