Ensembling

Last revised by Daniel J Bell on 4 Jul 2020

Ensembling (sometimes called ensemble learning) is a class of meta-algorithmic techniques in which multiple models are trained and their predictions are aggregated to improve classification performance. It is effective across a wide variety of problems.

Two commonly used methods, illustrated in the sketch after this list, are:

  • boosting: a method in which models are trained sequentially, each new model weighted towards the examples that earlier models misclassified, so that the weighted combination of their predictions is more accurate than any individual model
  • bagging: also known as bootstrap aggregating, in which multiple models are each trained on a bootstrap sample (a random sample drawn with replacement) of the training dataset and their predictions are averaged or put to a vote
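
As a minimal sketch of both methods, the following example (assuming the scikit-learn library and a synthetic dataset, neither of which is part of this article) compares a single decision tree against a bagged ensemble and an AdaBoost boosted ensemble using cross-validation:

    # Sketch only: compares a lone decision tree with bagged and boosted ensembles
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic classification data for illustration only
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

    # Baseline: a single decision tree
    tree = DecisionTreeClassifier(random_state=0)

    # Bagging: each of the 50 base models is trained on a bootstrap sample
    # (drawn with replacement) and their predictions are combined by voting
    bagging = BaggingClassifier(n_estimators=50, random_state=0)

    # Boosting (AdaBoost): models are trained sequentially, each weighted
    # towards the examples the previous models misclassified
    boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

    for name, model in [("tree", tree), ("bagging", bagging), ("boosting", boosting)]:
        score = cross_val_score(model, X, y, cv=5).mean()
        print(f"{name}: mean cross-validated accuracy {score:.3f}")

On typical runs the two ensembles score higher than the single tree, which is the behavior the definitions above describe; the exact figures depend on the data and random seed.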
