Gradient Descent: Stochastic vs. Batch Update
Learn about the stochastic vs. batch update nature of gradient descent.
Gradient descent
We have learned that gradient descent reduces the error by differentiating the error with respect to the weights. We add the resulting weight change (the negative gradient scaled by a learning rate) to the current weight and keep updating it until we reach the bottom of the slope, where the error is minimum.
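The update rule above can be sketched in a few lines. This is a minimal, illustrative example, not code from the course: the error function (w - 3)^2, the learning rate of 0.1, and the iteration count are all assumed values chosen so the minimum sits at w = 3.

```python
# Minimal sketch of gradient descent on a one-weight model.
# Assumed error function: E(w) = (w - 3)**2, whose minimum is at w = 3.

def error_gradient(w):
    return 2 * (w - 3)  # derivative of (w - 3)**2 with respect to w

w = 0.0              # initial weight
learning_rate = 0.1  # illustrative step size

for _ in range(100):
    # Weight change = negative gradient scaled by the learning rate;
    # adding it moves w down the slope of the error curve.
    w += -learning_rate * error_gradient(w)

print(round(w, 4))  # approaches 3.0, the bottom of the slope
```

Each iteration moves the weight a small step in the direction that decreases the error, so the updates slow down naturally as the gradient shrinks near the minimum.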
The weight update can occur in a batch or a stochastic manner. The following sections cover each of them:
Stochastic gradient descent