Gradient Descent: Stochastic vs. Batch Update
Learn about the stochastic versus batch update nature of gradient descent.
Gradient descent
We have learned that gradient descent reduces the error by differentiating the error with respect to each weight. We add the resulting weight change to the current weight and keep updating it until we reach the bottom of the slope, where the error is at its minimum.
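The difference between the two update styles named in the title comes down to when the weight change is applied: a batch update computes the gradient over the entire dataset before adjusting the weight once, while a stochastic update adjusts the weight after every single example. The sketch below illustrates this on a toy one-weight linear model; the data, learning rate, and function names are illustrative assumptions, not part of the original lesson.

```python
import random

# Toy dataset that exactly follows y = 2 * x, so the ideal weight is w = 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]

def batch_gradient_descent(data, lr=0.01, epochs=100):
    """Batch update: one weight change per pass, using the whole dataset."""
    w = 0.0
    for _ in range(epochs):
        # Derivative of the mean squared error E = mean((w*x - y)^2)
        # with respect to w is mean(2 * (w*x - y) * x).
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad  # add the weight change (-lr * gradient) to w
    return w

def stochastic_gradient_descent(data, lr=0.01, epochs=100, seed=0):
    """Stochastic update: a weight change after every single example."""
    rng = random.Random(seed)
    w = 0.0
    samples = list(data)
    for _ in range(epochs):
        rng.shuffle(samples)          # visit examples in random order
        for x, y in samples:
            grad = 2 * (w * x - y) * x  # gradient on just this example
            w -= lr * grad
    return w

print(batch_gradient_descent(data))       # converges toward w = 2.0
print(stochastic_gradient_descent(data))  # also converges toward w = 2.0
```

Both routines walk down the same error slope; the stochastic version takes noisier steps but updates far more often, which is why it is commonly preferred on large datasets.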