Gradient Boosting: Implementation from Scratch

Get introduced to gradient boosting, its algorithm, and how to train a model from scratch.

Gradient boosting is a machine learning technique that builds a strong predictive model by sequentially adding weak models to an ensemble. It uses gradient descent to minimize the model's loss function: the term "gradient" refers to the negative gradient of the loss, which guides each new weak learner toward reducing the current errors. Gradient boosting is a versatile technique that can be used for both regression and classification tasks. Here, we'll focus on regression and demonstrate how to solve a regression problem using two approaches, starting with a gradient boosting regressor implemented from scratch.
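To make the idea concrete, here is a minimal sketch of a gradient boosting regressor with squared-error loss. The class name `GradientBoostingRegressorScratch` and its parameters are illustrative choices, and it borrows `DecisionTreeRegressor` from scikit-learn as the weak learner rather than building trees by hand:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

class GradientBoostingRegressorScratch:
    """Minimal gradient boosting for regression with squared-error loss."""

    def __init__(self, n_estimators=100, learning_rate=0.1, max_depth=3):
        self.n_estimators = n_estimators
        self.learning_rate = learning_rate
        self.max_depth = max_depth
        self.trees = []
        self.init_ = 0.0

    def fit(self, X, y):
        # For squared error, the best constant starting model is the mean of y.
        self.init_ = float(np.mean(y))
        pred = np.full(len(y), self.init_)
        for _ in range(self.n_estimators):
            # The negative gradient of squared-error loss is the residual y - pred,
            # so each tree is fit to the residuals of the current ensemble.
            residuals = y - pred
            tree = DecisionTreeRegressor(max_depth=self.max_depth)
            tree.fit(X, residuals)
            # Shrink each tree's contribution by the learning rate.
            pred += self.learning_rate * tree.predict(X)
            self.trees.append(tree)
        return self

    def predict(self, X):
        pred = np.full(X.shape[0], self.init_)
        for tree in self.trees:
            pred += self.learning_rate * tree.predict(X)
        return pred
```

A quick usage example: fitting on a simple nonlinear target such as `y = x**2` and checking that the training error shrinks as trees are added illustrates the sequential, residual-correcting nature of the algorithm.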
