BETO for Spanish
Learn about BETO and how to use it to predict masked words.
BETO is the pre-trained BERT model for the Spanish language from the Universidad de Chile. It is trained on the MLM task with Whole Word Masking (WWM). The configuration of BETO is the same as the standard BERT-base model.
Variants of BETO
The researchers behind BETO provide two variants of the model:
BETO-cased for cased text.
BETO-uncased for uncased text.
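Since BETO is a standard BERT-base model, it can be loaded with the Hugging Face `transformers` library and used for masked word prediction. The sketch below uses the `fill-mask` pipeline with the cased variant; the model ID `dccuchile/bert-base-spanish-wwm-cased` (and its `-uncased` counterpart) refers to the checkpoints published on the Hugging Face Hub.

```python
from transformers import pipeline

# Load the cased BETO variant as a fill-mask pipeline.
# The uncased variant is "dccuchile/bert-base-spanish-wwm-uncased".
fill_mask = pipeline(
    "fill-mask",
    model="dccuchile/bert-base-spanish-wwm-cased",
)

# Predict the masked word in a Spanish sentence.
predictions = fill_mask("Todos los caminos llevan a [MASK].")

# Each prediction contains the candidate token and its probability.
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

The pipeline returns the top candidate tokens for the `[MASK]` position, ranked by probability, so you can inspect which Spanish words the model considers most plausible in context.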