Gradient Descent
Master the gradient descent algorithm for minimizing a loss function with the right learning rate and initialization in this lesson.
What is gradient descent?
Gradient descent is an optimization algorithm for finding the minimum of a function. It’s an iterative algorithm that starts from an initial guess of the minimum and then repeatedly takes steps in the direction of the negative gradient, the direction in which the function decreases fastest.
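In symbols, if x_t is the current guess, η is the learning rate, and ∇f(x_t) is the gradient at the current guess, each iteration applies the update (the notation here is illustrative, not from the original lesson):

x_{t+1} = x_t - \eta \, \nabla f(x_t)

For example, with f(x) = x² and η = 0.1, a guess of x = 2 has gradient 2·2 = 4, so the next guess is 2 − 0.1·4 = 1.6, a step closer to the minimum at 0.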
Gradient descent algorithm
Here are the steps involved in the gradient descent algorithm (a code sketch follows the list):
1. Choose an initial guess for the minimum of the function.
2. Calculate the gradient of the function at the current guess. This involves taking the partial derivative of the function with respect to each of the input variables.
3. Multiply the gradient by a small positive number called the learning rate, which determines the step size of the algorithm.
4. Subtract the result from the current guess to obtain a new guess for the minimum.
5. Repeat steps 2–4 until the algorithm converges to a minimum.
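As a concrete illustration, here is a minimal Python sketch of these five steps applied to f(x, y) = x² + y². The objective function, starting point, learning rate, and tolerance are assumptions chosen for this example, not part of the lesson:

import numpy as np

def f(x):
    # Example objective (an assumption for illustration): f(x, y) = x^2 + y^2
    return x[0]**2 + x[1]**2

def grad_f(x):
    # Partial derivatives of f with respect to each input variable
    return np.array([2 * x[0], 2 * x[1]])

def gradient_descent(grad, x0, learning_rate=0.1, tol=1e-6, max_iters=1000):
    x = np.asarray(x0, dtype=float)           # Step 1: initial guess
    for _ in range(max_iters):
        g = grad(x)                           # Step 2: gradient at the current guess
        step = learning_rate * g              # Step 3: scale by the learning rate
        x_new = x - step                      # Step 4: subtract to get a new guess
        if np.linalg.norm(x_new - x) < tol:   # Step 5: stop once the steps are tiny
            return x_new
        x = x_new
    return x

minimum = gradient_descent(grad_f, x0=[3.0, -4.0])
print(minimum)  # Approaches [0, 0], the true minimum of f

Running this from the starting point (3, −4) converges toward (0, 0). A larger learning rate takes bigger steps and may converge faster, but if it is too large the iterates can overshoot the minimum or diverge.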