Padding
Understand the purpose of padding with respect to tokenized sequences.
Chapter Goals:
- Learn about sequence lengths and padding
A. Varied-length sequences
Most neural networks require input data of a fixed length. This is because most neural networks have what's known as a feed-forward structure, meaning that they use multiple layers of fixed sizes to compute the network's output.
However, since text data comes in sequences of varying lengths (e.g. sentences, passages, etc.), our language model needs to be able to handle input data of varied lengths. Therefore, we use a recurrent neural network (RNN), which processes a sequence one token at a time and can therefore accept sequences of any length.
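Even with a recurrent model, sequences are usually batched together, and every sequence in a batch must share one length. The standard fix is to pad shorter sequences with a special pad token ID up to the length of the longest sequence in the batch. The sketch below shows this idea in plain Python; the function name `pad_sequences` and the choice of `0` as the pad ID are illustrative assumptions, not part of any specific library.

```python
def pad_sequences(sequences, pad_id=0):
    """Pad each tokenized sequence (a list of int token IDs) with pad_id
    so that all sequences in the batch have the same length.

    Note: pad_id=0 is an assumed convention; real tokenizers let you
    configure which ID is reserved for padding.
    """
    max_len = max(len(seq) for seq in sequences)
    return [seq + [pad_id] * (max_len - len(seq)) for seq in sequences]

# Example: a batch of two tokenized sequences of different lengths.
batch = [[5, 8, 2], [7]]
padded = pad_sequences(batch)
# Both sequences now have length 3: [[5, 8, 2], [7, 0, 0]]
```

In practice you would also keep track of the original lengths (or a mask) so the model can ignore the pad positions when computing its output.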