DataLoader

Learn about the DataLoader class and the changes it brings to our implementation.

Introduction to DataLoader

Until now, we have used the entire training set at every training step; it has been batch gradient descent all along. This is fine for our small dataset, but if we want training to be more efficient and less computationally expensive, we should switch to mini-batch gradient descent. That means we need mini-batches, which means slicing our dataset accordingly, as the sketch below illustrates. Do you want to do it manually? Me neither!
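For context, here is roughly what that manual bookkeeping would look like (a hypothetical sketch; the tensor name, sizes, and batch size are illustrative, not from the lesson):

```python
import torch

# Hypothetical training features; any 2-D tensor would do.
x_train_tensor = torch.randn(100, 1)
batch_size = 16

# Reshuffle the indices once per epoch, then slice batch by batch.
indices = torch.randperm(x_train_tensor.size(0))
for start in range(0, x_train_tensor.size(0), batch_size):
    batch_idx = indices[start:start + batch_size]
    x_batch = x_train_tensor[batch_idx]
    # ...one training step on x_batch would go here...
```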

So, we use PyTorch’s DataLoader class for this job. We just have to tell it which dataset to use (in this case, the one we built in the previous lesson), the desired mini-batch size, and whether or not we want to shuffle it. That’s it!
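A minimal sketch of how this looks, assuming the dataset from the previous lesson is a TensorDataset built from the training tensors (the tensor names, sizes, and batch size here are illustrative):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Hypothetical training tensors standing in for the dataset built in
# the previous lesson; any map-style Dataset works the same way.
x_train_tensor = torch.randn(100, 1)
y_train_tensor = 1 + 2 * x_train_tensor + 0.1 * torch.randn(100, 1)
train_data = TensorDataset(x_train_tensor, y_train_tensor)

# Dataset to use, mini-batch size, and whether to shuffle: that's it!
train_loader = DataLoader(dataset=train_data, batch_size=16, shuffle=True)

# Iterating over the loader yields one mini-batch at a time.
x_batch, y_batch = next(iter(train_loader))
print(x_batch.shape, y_batch.shape)  # torch.Size([16, 1]) torch.Size([16, 1])
```

With shuffle=True, the loader reshuffles the data at the start of every epoch, so each epoch sees the mini-batches in a different order.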
