Quiz: BERT Variants Based on Knowledge Distillation

Take a short quiz to test your understanding of different BERT variants based on knowledge distillation.

1. (Select all that apply.) When applying softmax temperature in the output layer during knowledge distillation, what will a higher temperature value result in?

A) Sharper probability distributions
B) Smoother probability distributions
C) Confident probability distributions
D) Uncertain probability distributions
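
If you want to check your intuition empirically before answering, here is a minimal Python sketch of the standard temperature-scaled softmax used in knowledge distillation. The helper name `softmax_with_temperature` is our own illustration, not a library function; it divides the logits by a temperature `T` before normalizing.

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """Temperature-scaled softmax: divide logits by T, then normalize."""
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()          # subtract max for numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

logits = [3.0, 1.0, 0.2]
print(softmax_with_temperature(logits, temperature=1.0))  # ~[0.84, 0.11, 0.05]
print(softmax_with_temperature(logits, temperature=5.0))  # ~[0.45, 0.30, 0.25]
```

Raising the temperature from 1 to 5 visibly flattens the distribution toward uniform, which is what lets a student model learn from the teacher's "dark knowledge" about non-target classes.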

