Explain the difference between bagging and boosting models.

Medium Last updated on Aug. 29, 2022, 9:46 p.m.

Before going over the details of bagging vs. boosting, we suggest reading What is Ensemble Learning? How many types of ensemble methods are there?

Bagging and boosting are the two most commonly used ensemble techniques. They can address a wide range of problems, such as data scarcity and class imbalance, and help build a model that generalizes well. However, to determine which of these techniques is better suited to a given use case, we need to understand their key differences.

Here are some of the most important differences between bagging and boosting:

| | Bagging | Boosting |
|---|---|---|
| Training | Models are trained independently, in parallel, each on a bootstrap sample of the data | Models are trained sequentially; each new model focuses on the examples the previous ones misclassified |
| Main effect | Primarily reduces variance | Primarily reduces bias |
| Weighting | Every model gets an equal vote | Models are weighted by their performance, and misclassified samples are up-weighted |
| Typical examples | Random Forest | AdaBoost, Gradient Boosting |
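The mechanical difference can be seen in a compact from-scratch sketch. This is illustrative only (the function names are my own, not a library API); it uses scikit-learn decision stumps as base learners and assumes binary labels encoded as -1/+1.

```python
# Illustrative sketch: bagging vs. AdaBoost-style boosting with decision
# stumps as base learners. Assumes labels y are encoded as -1/+1.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_bagging(X, y, n_estimators=25, seed=0):
    """Bagging: independent models on bootstrap samples (parallelizable)."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))   # sample with replacement
        models.append(DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx]))
    return models

def predict_bagging(models, X):
    """Aggregate by majority vote -- every model gets an equal weight."""
    votes = np.stack([m.predict(X) for m in models])
    return np.sign(votes.sum(axis=0))

def fit_boosting(X, y, n_estimators=25):
    """AdaBoost: sequential models; misclassified samples get larger weights."""
    w = np.full(len(X), 1.0 / len(X))
    models, alphas = [], []
    for _ in range(n_estimators):
        m = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = m.predict(X)
        err = w[pred != y].sum()
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))  # model's vote weight
        w *= np.exp(-alpha * y * pred)                     # up-weight mistakes
        w /= w.sum()
        models.append(m)
        alphas.append(alpha)
    return models, np.array(alphas)

def predict_boosting(models, alphas, X):
    """Aggregate by a performance-weighted vote."""
    scores = np.stack([a * m.predict(X) for m, a in zip(models, alphas)])
    return np.sign(scores.sum(axis=0))
```

Note how the two halves mirror the table: bagging resamples the data and weights every model equally, while boosting reweights the samples after each round and weights each model by its accuracy.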

How to select a Bagging or a Boosting approach?

This question comes up often when solving problems with these ensemble techniques, and the answer depends entirely on the data and on how the single base model behaves. If the performance of the single base model is very low (high bias), bagging will not help: averaging many underfitting models does not reduce their bias. Boosting, by contrast, can combine weak models into a single model that reduces the error significantly, compensating for the weakness of the base model. If, however, the single base model is overfitting (high variance), bagging is preferred over boosting, since averaging independently trained models reduces variance.