Advanced Gradient Optimization (Learning)
Learn advanced gradient optimization with the batch algorithm.
Batch Algorithm
Learning in machine learning means finding the parameters of a model that minimize a loss function. There are many methods to minimize a function, and each one constitutes a learning algorithm. However, the workhorse in machine learning is usually some form of the gradient descent algorithm that we encountered earlier. Formally, basic gradient descent minimizes the sum of the loss values over all training examples; it is therefore called a batch algorithm, because all training examples together form the batch used for minimization. Let’s assume we have some training data. Then, gradient descent iterates the following equation:
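The equation itself appears to have been lost when this page was extracted. A standard way to write the batch gradient descent iteration described above, assuming parameters $\theta$, learning rate $\eta$, training pairs $(x_i, y_i)$, model $f_\theta$, and per-example loss $\ell$ (symbols chosen for illustration, not taken from the course), is:

```latex
\theta \leftarrow \theta - \eta \, \nabla_\theta \sum_{i=1}^{N} \ell\bigl(f_\theta(x_i),\, y_i\bigr)
```

The gradient is computed over the full sum of $N$ training examples before a single parameter update is made, which is what distinguishes the batch algorithm from stochastic or mini-batch variants.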
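As a minimal sketch of the batch update, here is batch gradient descent for 1-D linear regression in plain Python. All names (`w`, `b`, `lr`, `steps`, and the synthetic data) are illustrative and not taken from the course:

```python
# Batch gradient descent for 1-D linear regression: y ≈ w*x + b,
# minimizing the mean squared error over the FULL training set each step.

def batch_gradient_descent(xs, ys, lr=0.05, steps=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of the loss summed (here, averaged) over ALL examples --
        # this is what makes it a "batch" algorithm.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        # One gradient descent update: parameter <- parameter - lr * gradient.
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic data generated from y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = batch_gradient_descent(xs, ys)
```

On this toy data the iteration recovers `w` close to 2 and `b` close to 1. Note that every update touches every training example, which is exact but becomes expensive for large datasets.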