Ensemble methods in machine learning combine several simple models, each with weak predictive power on its own, to produce better predictions. Akin to the idea that two heads are better than one, these methods aggregate the outputs of many models. We'll look at a range of ensemble methods, including voting, averaging, weighted averaging, bagging (bootstrap aggregating), random forest, and adaptive boosting, along with some practical examples of how they are used.
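To make the core idea concrete, here is a minimal sketch of the simplest aggregation scheme, majority (hard) voting. The three models' predictions are hypothetical placeholders, and the `majority_vote` helper is ours for illustration, not from any particular library:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model predictions by majority (hard) voting.

    `predictions` is a list of prediction lists, one per model;
    each inner list holds one predicted label per sample.
    """
    combined = []
    # zip(*predictions) groups the models' predictions sample by sample.
    for sample_preds in zip(*predictions):
        # Pick the label predicted by the most models for this sample.
        combined.append(Counter(sample_preds).most_common(1)[0][0])
    return combined

# Hypothetical predictions from three weak classifiers on five samples.
model_a = [1, 0, 1, 1, 0]
model_b = [1, 1, 1, 0, 0]
model_c = [0, 0, 1, 1, 1]

print(majority_vote([model_a, model_b, model_c]))  # → [1, 0, 1, 1, 0]
```

Even though each individual model errs on some samples, the majority label is often correct as long as the models' mistakes are not all on the same samples; the other ensemble methods covered below refine this basic idea.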