Ensemble Learning Part 2
Bagging and Boosting are two other popular ensemble techniques used extensively to improve model predictions. You'll learn more about them in this lesson.
Bagging or Bootstrap Aggregation
Bagging involves building several models, each on a subset of the original dataset, and then aggregating their individual predictions to make a final prediction. By introducing randomization into model construction, these methods reduce variance (overfitting). Bagging therefore works well with complex, high-variance models, such as Decision Trees grown to a large depth. Bagging methods differ in how they draw subsets from the original dataset, and are given the following names.
Types of bagging
- When the random subsets of the dataset are drawn as random subsets of the samples, without replacement, the technique is known as Pasting.
- When we ...