Ensemble Learning Part 2
Bagging and Boosting are two other widely used ensemble techniques for improving model predictions. You'll learn more about them in this lesson.
Bagging or Bootstrap Aggregation
Bagging involves building several models on random subsets of the original dataset and then aggregating their individual predictions to make a final prediction. Introducing this randomization during model construction reduces variance (overfitting), so bagging works well with complex, high-variance models such as deep Decision Trees. Bagging methods differ in how they draw subsets from the original dataset and are given the names listed in the next section.
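Before looking at the individual variants, here is a minimal sketch of plain bagging with decision trees. The lesson itself doesn't name a library, so this assumes scikit-learn's `BaggingClassifier` and the Iris toy dataset purely for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy data, used only to make the sketch runnable end to end.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# 50 trees, each fit on a random sample of the training set;
# the ensemble aggregates the trees' votes into one final prediction.
bagging = BaggingClassifier(
    estimator=DecisionTreeClassifier(),  # called `base_estimator` in scikit-learn < 1.2
    n_estimators=50,
    bootstrap=True,  # draw the samples with replacement
    random_state=42,
)
bagging.fit(X_train, y_train)
print("Test accuracy:", bagging.score(X_test, y_test))
```

Because each tree sees a slightly different dataset, the trees make partly independent errors, and averaging their votes cancels much of that noise out.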
Types of bagging
- When random subsets of the dataset are drawn as random subsets of the samples, without replacement, the technique is known as Pasting.
- When the samples are drawn with replacement, the technique is known as Bootstrapping. Drawing with replacement means that once a sample has been drawn from the dataset and its values recorded, it is put back into the dataset, so there is a chance the same sample is drawn again.
- When random subsets of the dataset are drawn as random subsets of the features, the technique is known as Random Subspaces.
- When the models are built on random subsets of both the samples and the features, the technique is known as Random Patches. A sketch mapping all four variants to concrete parameters follows this list.
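One way to make these distinctions concrete is through the sampling options of scikit-learn's `BaggingClassifier`. The mapping below is a sketch; the `0.5` subset sizes are arbitrary illustration values, not prescribed by the lesson:

```python
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

tree = DecisionTreeClassifier()

# Pasting: random subsets of the samples, drawn WITHOUT replacement.
pasting = BaggingClassifier(estimator=tree, max_samples=0.5, bootstrap=False)

# Bootstrapping: random subsets of the samples, drawn WITH replacement
# (scikit-learn's default behavior for bagging).
bootstrapping = BaggingClassifier(estimator=tree, max_samples=0.5, bootstrap=True)

# Random Subspaces: every model sees all samples but only a random
# subset of the features.
subspaces = BaggingClassifier(estimator=tree, bootstrap=False, max_features=0.5)

# Random Patches: random subsets of both the samples and the features.
patches = BaggingClassifier(
    estimator=tree,
    max_samples=0.5, max_features=0.5,
    bootstrap=True, bootstrap_features=True,
)
```

Each configured ensemble is then trained and used like any other estimator, via `fit` and `predict`.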