Quiz: Using Transformers to Generate Text
Test your understanding of how transformers are used to generate text.
Question 1
Which statement best describes attention?

A) Attention is limited to Neural Machine Translation use cases only.
B) The attention mechanism uses all interim hidden states of the RNN to decide which one to focus on before it is used by the decoding stage.
C) Attention is a technique used to compute a weighted sum of the values, independent of the query.
D) Attention takes in a context window of a defined size as input and encodes all of it into a single vector.
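To ground the terminology the options use (query, values, weighted sum), here is a minimal sketch of dot-product attention in NumPy. All names are illustrative; real implementations (e.g., in transformer libraries) batch this computation, scale the scores, and learn projections for queries, keys, and values.

```python
import numpy as np

def attention(query, keys, values):
    """Return an attention-weighted sum of `values` for one `query`."""
    # Score each key by its similarity to the query (dot product).
    scores = keys @ query
    # Softmax turns the scores into weights that sum to 1
    # (subtracting the max is a standard numerical-stability trick).
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # The output is a weighted sum of the values.
    return weights @ values

# Toy example: three key/value pairs, query strongly matches the first key.
keys = np.eye(3)
values = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
query = np.array([10.0, 0.0, 0.0])
output = attention(query, keys, values)
```

Note that the weights depend on the query: a different query would emphasize different values, which is the crux of the mechanism.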