Minibatch Gradient Descent
Explore how minibatch gradient descent improves optimization of non-convex problems by balancing update stability against speed. Learn how the technique handles large datasets, reduces gradient variance relative to stochastic gradient descent, and can be implemented with Python machine learning libraries.
We'll cover the following...
Stochastic gradient descent (SGD)
Recall that to compute the gradient of the loss function, we must sum the per-example gradients over every training example in the dataset, which becomes expensive as the dataset grows.
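To make the contrast concrete, here is a minimal NumPy sketch of minibatch gradient descent on a simple linear regression problem. All names (`learning_rate`, `batch_size`, the synthetic data) are illustrative assumptions, not part of any particular library; each update averages gradients over a small random minibatch rather than the full dataset or a single example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3*x + 2 plus a little noise (illustrative only).
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.01, size=200)

w, b = 0.0, 0.0          # parameters to learn
learning_rate = 0.1      # hypothetical hyperparameter choices
batch_size = 16

for epoch in range(200):
    # Shuffle once per epoch so each minibatch is a random sample.
    order = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]
        err = (w * xb + b) - yb
        # Gradients of mean-squared error w.r.t. w and b,
        # averaged over the current minibatch only.
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b

print(w, b)  # both should approach the true values 3.0 and 2.0
```

Setting `batch_size = 1` recovers pure stochastic gradient descent, while `batch_size = len(X)` recovers full-batch gradient descent; intermediate sizes trade gradient variance against per-update cost.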