Quiz: Applying BERT to Other Languages

Take a short quiz to test your understanding of the multilingual BERT model.

Question 1

What is the primary motivation behind oversampling and undersampling in multilingual BERT?

A) To prioritize high-resource languages for training.

B) To exclude low-resource languages from training.

C) To maintain a balanced data distribution across languages.

D) To randomly shuffle the training data to achieve balance.
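For background on the concept this question probes, here is a minimal sketch of smoothed sampling probabilities, assuming the exponentially smoothed weighting (exponent 0.7) described in the multilingual BERT release notes; the language names and corpus sizes are hypothetical illustrative values, not figures from the course.

```python
# Sketch: exponentially smoothed language sampling, assuming exponent 0.7.
# Raw probabilities proportional to corpus size let high-resource languages
# dominate; the exponent dampens their share and boosts low-resource ones.

def smoothed_sampling_probs(corpus_sizes, alpha=0.7):
    """Return per-language sampling probabilities p_i proportional to (n_i / N) ** alpha."""
    total = sum(corpus_sizes.values())
    weights = {lang: (n / total) ** alpha for lang, n in corpus_sizes.items()}
    norm = sum(weights.values())
    return {lang: w / norm for lang, w in weights.items()}

if __name__ == "__main__":
    # Hypothetical corpus sizes: English is high-resource, Swahili is low-resource.
    sizes = {"en": 2_500_000, "de": 800_000, "hi": 150_000, "sw": 30_000}
    probs = smoothed_sampling_probs(sizes)
    for lang in sizes:
        raw = sizes[lang] / sum(sizes.values())
        print(f"{lang}: raw={raw:.3f} -> smoothed={probs[lang]:.3f}")
```

Running the sketch shows high-resource languages sampled less often than their raw share and low-resource languages sampled more often, which is the effect oversampling and undersampling aim for.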

