Ensemble modeling is a powerful and widely used technique in machine learning: you train
multiple models and combine their predictions into a single, stronger prediction. There
are three commonly used ensemble techniques: stacking, bagging, and boosting. So how do
you know which ensemble method to use, and when to use it?
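To make the three styles concrete, here is a minimal sketch using scikit-learn's ensemble estimators. The dataset, estimator choices, and hyperparameters below are illustrative assumptions, not part of the course material:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, GradientBoostingClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# A small synthetic classification dataset for illustration.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Bagging: train many copies of one model on bootstrap samples, then vote.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=10,
                            random_state=0)

# Boosting: train models sequentially, each one correcting the errors
# of the ensemble built so far.
boosting = GradientBoostingClassifier(n_estimators=10, random_state=0)

# Stacking: feed the base models' predictions into a final meta-model.
stacking = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=LogisticRegression(max_iter=1000),
)

for name, model in [("bagging", bagging), ("boosting", boosting),
                    ("stacking", stacking)]:
    model.fit(X, y)
    print(name, round(model.score(X, y), 2))
```

Each estimator exposes the same `fit`/`predict` interface, so the three approaches can be swapped in and compared on the same data.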
In this course, you will explore stacking, bagging, and boosting, including the
motivation behind each technique, the scenarios where it works best, and its tradeoffs.
By the end of this course, you will have studied a number of robust algorithms that
employ these methods, such as random forests and gradient boosted decision trees. You
will also put this new knowledge into action by building and optimizing various
ensemble models.
You are required to have completed the following courses or have equivalent experience
before taking this course:
- Machine Learning Foundations
- Managing Data in Machine Learning
- Training Common Machine Learning Models
- Training Linear Models
- Evaluating and Improving Your Model