Quiz

1. Why do we use word embeddings?

A) They're used to represent a sequence of words as a single continuous vector

B) Word embeddings are easier to look up than regular tokenized IDs

C) They make training models for NLP tasks quicker and more efficient

D) They provide a more meaningful way to capture the connections between vocabulary words
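For intuition before answering, here is a minimal sketch of what an embedding layer does, written in PyTorch; the toy vocabulary size, embedding dimension, and token IDs are illustrative assumptions, not values from the course. The layer is a learnable lookup table that maps token IDs to dense, continuous vectors.

```python
import torch
import torch.nn as nn

# Hypothetical toy setup: five vocabulary words, 8-dimensional vectors.
vocab_size = 5
embedding_dim = 8

# An embedding layer is a learnable lookup table: each token ID maps
# to a dense, continuous vector of size embedding_dim.
embedding = nn.Embedding(vocab_size, embedding_dim)

# Looking up a batch of token IDs returns their embedding vectors.
token_ids = torch.tensor([0, 3, 1])
vectors = embedding(token_ids)
print(vectors.shape)  # torch.Size([3, 8])

# During training, these vectors are adjusted so that related words end up
# near each other, which can be measured with cosine similarity.
sim = torch.nn.functional.cosine_similarity(vectors[0], vectors[1], dim=0)
print(sim.item())
```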
