Mini-Batch Inner Loop and Training Split
Learn how to implement the inner loop of mini-batch gradient descent, add it to your training loop, and get a glimpse of using "random_split".
The inner loop
From now on, it is very unlikely that you will ever use (full) batch gradient descent again, either in this course or in real life. So, once again, it makes sense to organize a piece of code that is going to be used repeatedly into its own function: the mini-batch inner loop!
The inner loop depends on three elements:
- The device where the data is being sent.
- A data loader to draw mini-batches from.
- A step function that returns the corresponding loss.
Taking these elements as inputs and using them to perform the inner loop, we will end up with a function like this:
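A minimal sketch of such a function is shown below. The function name `mini_batch` and the dummy step function used in the demo are illustrative choices, not taken from the original text; the essential idea is that the function receives the device, the data loader, and the step function, iterates over all mini-batches, and returns the average loss for the epoch:

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset, DataLoader

def mini_batch(device, data_loader, step_fn):
    """Run one epoch's worth of mini-batch steps and
    return the average loss over all mini-batches."""
    mini_batch_losses = []
    for x_batch, y_batch in data_loader:
        # Send the current mini-batch to the same device as the model
        x_batch = x_batch.to(device)
        y_batch = y_batch.to(device)
        # The step function performs the forward pass, computes the
        # loss, and (in real training) backpropagates and updates
        # the parameters, returning the loss as a Python float
        loss = step_fn(x_batch, y_batch)
        mini_batch_losses.append(loss)
    # The epoch loss is the mean over all mini-batch losses
    return np.mean(mini_batch_losses)

# Quick demo with a dummy step function that only computes MSE
# against the true generating function (no parameter updates)
x = torch.randn(20, 1)
y = 2 * x + 1
loader = DataLoader(TensorDataset(x, y), batch_size=5)
dummy_step = lambda xb, yb: ((2 * xb + 1 - yb) ** 2).mean().item()
epoch_loss = mini_batch('cpu', loader, dummy_step)
print(epoch_loss)  # 0.0, since the dummy "model" is exact
```

In a real training loop, `step_fn` would be the step function built earlier from the model, loss function, and optimizer, and this inner loop would be called once per epoch.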
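As for the glimpse of "random_split" promised above, here is a minimal sketch of how it can split a dataset into training and validation sets. The dataset, sizes, and seed here are illustrative, not from the original text:

```python
import torch
from torch.utils.data import TensorDataset, random_split

# A small synthetic dataset of 100 points
x = torch.randn(100, 1)
y = 2 * x + 1
dataset = TensorDataset(x, y)

# Seed the generator so the split is reproducible
torch.manual_seed(13)
# random_split takes the dataset and a list of subset lengths
train_data, val_data = random_split(dataset, [80, 20])
print(len(train_data), len(val_data))  # 80 20
```

Each returned subset can then be wrapped in its own `DataLoader`, so the training split feeds the mini-batch inner loop while the validation split is used for evaluation.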