Quiz: BERT Variants Based on Knowledge Distillation
Take a short quiz to test your understanding of different BERT variants based on knowledge distillation.
1. (Select all that apply.) When applying softmax temperature in the output layer during knowledge distillation, what does a higher temperature value result in?
A) Sharper probability distributions
B) Smoother probability distributions
C) Confident probability distributions
D) Uncertain probability distributions
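To make the effect of the temperature concrete before you answer, here is a minimal NumPy sketch of a temperature-scaled softmax. The function name and the example logits are illustrative, not taken from the course material:

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """Softmax over logits divided by a temperature value."""
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()  # subtract max for numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

logits = [5.0, 2.0, 0.5]  # hypothetical teacher logits
for t in (1.0, 5.0):
    probs = softmax_with_temperature(logits, t)
    print(f"T={t}: {probs.round(3)}")
```

Running this, T=1 puts almost all probability mass on the largest logit (about 0.94), while T=5 spreads the mass far more evenly across the classes. This smoothing is why distillation typically uses a temperature greater than 1 when transferring the teacher's output distribution to the student.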