How Gradient Boosting Works

Develop an intuitive understanding of how gradient boosting works.

Gradient boosting vs. random forests

Gradient boosting is a machine learning technique that employs an ensemble of weak learners, usually small decision trees (small decision tree models are also called weak learners). In gradient boosting, weak learners are added to the ensemble iteratively: each new model is trained to correct the errors that remain in the predictions of the ensemble built so far. A gradient boosting ensemble makes predictions by applying a weight to each weak learner's prediction and aggregating the results.
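
To make the iterative residual-fitting loop concrete, here is a minimal sketch of gradient boosting for regression with squared-error loss. It assumes scikit-learn is available; the function names and the `n_rounds`, `learning_rate`, and `max_depth` values are illustrative choices, not part of any particular library's API.

```python
# A minimal sketch of gradient boosting for regression with squared-error loss.
# Assumes scikit-learn; function names and hyperparameter values are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_rounds=100, learning_rate=0.1, max_depth=2):
    """Fit an ensemble of shallow trees, each trained on the current residuals."""
    base = float(np.mean(y))                 # start from a constant prediction
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_rounds):
        residuals = y - pred                 # errors of the ensemble so far
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)               # weak learner targets the residuals
        pred += learning_rate * tree.predict(X)   # weighted contribution
        trees.append(tree)
    return base, trees

def gradient_boost_predict(X, base, trees, learning_rate=0.1):
    """Aggregate the base value plus the weighted predictions of all weak learners."""
    pred = np.full(X.shape[0], base)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred
```

Each pass through the loop shrinks the residuals a little. The learning rate keeps any single tree from dominating, which is why the ensemble takes many small corrective steps rather than one large one.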

While both gradient boosting and random forests are ensembles of decision trees, the algorithms are quite different. The following table summarizes the significant similarities and differences between the algorithms:

| | Gradient boosting | Random forests |
|---|---|---|
| Base learners | Decision trees (typically small, shallow trees) | Decision trees (typically deep, fully grown trees) |
| Training | Sequential: each tree is fit to the errors of the ensemble built so far | Independent: each tree is fit on a bootstrap sample of the data |
| Aggregation | Weighted sum of the trees' predictions | Average (regression) or majority vote (classification) |
| Main effect | Primarily reduces bias | Primarily reduces variance |
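
To see the difference in practice, the sketch below trains both ensembles on the same synthetic regression task. It assumes scikit-learn's `GradientBoostingRegressor` and `RandomForestRegressor`; the dataset, hyperparameters, and metric are illustrative choices, not a benchmark.

```python
# A quick side-by-side sketch, assuming scikit-learn is installed.
# The synthetic dataset and hyperparameter values are illustrative only.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=1000, n_features=10, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    # Shallow trees, fit sequentially, combined via a learning-rate weight.
    "gradient boosting": GradientBoostingRegressor(
        n_estimators=200, max_depth=3, learning_rate=0.1, random_state=0
    ),
    # Deeper trees, fit independently on bootstrap samples, then averaged.
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"{name}: test MSE = {mse:.2f}")
```

Note that the boosted trees are deliberately shallow (`max_depth=3`), while the forest grows its trees to full depth by default; this mirrors the bias-versus-variance contrast in the table above.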
