Comparing Decision Tree Algorithms: Random Forest® vs. XGBoost
This tutorial walks you through a comparison of XGBoost and Random Forest, two popular ensemble algorithms built on decision trees, and helps you identify the best use cases for ensemble techniques like bagging and boosting.
By following the tutorial, you’ll learn:
- How to create a decision tree using Python and Pandas
- How to do tree bagging with sklearn’s RandomForestClassifier
- How to do tree boosting with XGBoost
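The three steps above can be sketched in a few lines. This is a minimal sketch on a synthetic dataset (`make_classification`), not the tutorial's own code; the parameter choices (`n_estimators=100`, `random_state=42`) are arbitrary, and sklearn's `GradientBoostingClassifier` stands in for boosting if the `xgboost` package isn't installed.

```python
# Minimal sketch: a single decision tree vs. bagging (Random Forest)
# vs. boosting, on a synthetic stand-in dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

try:
    # Tree boosting with XGBoost, as in the tutorial
    from xgboost import XGBClassifier
    boosting = XGBClassifier(n_estimators=100, eval_metric="logloss",
                             random_state=42)
except ImportError:
    # Fallback if xgboost is not installed: sklearn's gradient boosting
    from sklearn.ensemble import GradientBoostingClassifier
    boosting = GradientBoostingClassifier(n_estimators=100, random_state=42)

# Synthetic data in place of the tutorial's dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

models = {
    "single decision tree": DecisionTreeClassifier(random_state=42),
    "random forest (bagging)": RandomForestClassifier(n_estimators=100,
                                                      random_state=42),
    "boosted trees": boosting,
}

# Fit each model and compare held-out accuracy
results = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    results[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {results[name]:.3f}")
```

On most datasets you'll see both ensembles beat the single tree; which ensemble wins depends on the data, which is exactly the comparison the tutorial explores.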
Understanding the benefits of bagging and boosting, and knowing when to use each technique, will help you build machine learning models with lower variance (bagging's strength), lower bias (boosting's strength), and greater stability. Try it for yourself!