Quiz: Interpreting Black Box Transformer Models

Test yourself on the concepts you learned in this chapter.

1. (Select all that apply.) Which statements are not true?

A) BertViz shows the attention heads of each layer of a BERT model.

B) LIT shows the inner workings of the attention heads, like BertViz does.

C) PCA and UMAP are non-probing tasks.

D) BertViz only shows the output of the last layer of the BERT model.
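If you want to verify your answer hands-on, the sketch below renders BertViz's head view. It is a minimal example only, assuming the bertviz and transformers packages are installed and run in a Jupyter notebook; the model name and sentence are arbitrary illustrative choices.

```python
# Minimal sketch: rendering BERT attention with BertViz (assumed setup).
from transformers import BertTokenizer, BertModel
from bertviz import head_view

model_name = "bert-base-uncased"  # illustrative choice
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertModel.from_pretrained(model_name, output_attentions=True)

# Encode an arbitrary sentence and run a forward pass.
inputs = tokenizer.encode("The cat sat on the mat.", return_tensors="pt")
outputs = model(inputs)

attention = outputs.attentions                    # one tensor per layer, covering all heads
tokens = tokenizer.convert_ids_to_tokens(inputs[0])

head_view(attention, tokens)                      # interactive per-layer, per-head view
```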

