
Wednesday, April 25, 2018


Out-of-bag (OOB) error, also called the out-of-bag estimate, is a method of measuring the prediction error of random forests, boosted decision trees, and other machine learning models that use bootstrap aggregating (bagging) to subsample the training data. The OOB error is the mean prediction error on each training sample xᵢ, computed using only the trees whose bootstrap sample did not contain xᵢ.
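The procedure above can be sketched in pure Python. This is a minimal illustration, not a production implementation: the toy 1-D regression data and the choice of a 1-nearest-neighbour base learner are hypothetical stand-ins for decision trees, chosen to keep the example self-contained.

```python
import random

random.seed(0)

# Hypothetical toy data: y = 2x plus a little noise.
X = [i / 10 for i in range(30)]
y = [2 * x + random.gauss(0, 0.1) for x in X]
n = len(X)

n_models = 50
models = []   # each model: the bootstrap training points it saw
in_bag = []   # index sets, so we know which samples were out-of-bag

for _ in range(n_models):
    idx = [random.randrange(n) for _ in range(n)]  # bootstrap sample (with replacement)
    in_bag.append(set(idx))
    # Base learner (stand-in for a tree): 1-nearest-neighbour on the bootstrap sample.
    models.append([(X[i], y[i]) for i in idx])

def predict_1nn(model, x):
    # Return the y-value of the closest training point in this model's sample.
    return min(model, key=lambda p: abs(p[0] - x))[1]

# OOB error: for each sample, average predictions only from the models
# whose bootstrap sample did NOT contain that sample (~37% of models per point).
sq_errors = []
for i in range(n):
    preds = [predict_1nn(m, X[i])
             for m, bag in zip(models, in_bag) if i not in bag]
    if preds:
        oob_pred = sum(preds) / len(preds)
        sq_errors.append((oob_pred - y[i]) ** 2)

oob_mse = sum(sq_errors) / len(sq_errors)
print(f"OOB mean squared error: {oob_mse:.4f}")
```

Because each bootstrap sample leaves out roughly a third of the data, every training point is out-of-bag for many models, so no separate validation set is needed.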

Subsampling also allows one to define an out-of-bag estimate of the prediction performance improvement, obtained by evaluating predictions on the observations that were not used to build the next base learner. Out-of-bag estimates help avoid the need for an independent validation dataset, but they often underestimate both the actual performance improvement and the optimal number of iterations.
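In a boosting setting, this per-iteration OOB improvement can be tracked directly. The sketch below is a deliberately simplified stochastic boosting loop: the toy data and the constant-fit base learner are hypothetical simplifications (real gradient boosting would fit a tree to the residuals), but the OOB bookkeeping is the point being illustrated.

```python
import random

random.seed(1)

# Hypothetical toy data: a constant target with Gaussian noise.
y = [5 + random.gauss(0, 1) for _ in range(200)]
n = len(y)

F = [0.0] * n          # current ensemble prediction for each sample
lr = 0.5               # learning rate (shrinkage)
oob_improvements = []

for step in range(20):
    # Subsample half the data without replacement (stochastic boosting).
    sub = set(random.sample(range(n), n // 2))
    oob = [i for i in range(n) if i not in sub]

    # Base learner (stand-in for a tree): a constant fit to the
    # residuals of the subsampled observations.
    residuals = [y[i] - F[i] for i in sub]
    gamma = sum(residuals) / len(residuals)

    # OOB improvement: loss reduction measured only on the observations
    # that were NOT used to build this base learner.
    before = sum((y[i] - F[i]) ** 2 for i in oob) / len(oob)
    after = sum((y[i] - (F[i] + lr * gamma)) ** 2 for i in oob) / len(oob)
    oob_improvements.append(before - after)

    # Update the ensemble on all samples.
    F = [f + lr * gamma for f in F]

print("first OOB improvement:", round(oob_improvements[0], 3))
print("last OOB improvement:", round(oob_improvements[-1], 3))
```

The improvements shrink toward zero as boosting converges; in practice this sequence is noisy, which is one reason OOB estimates tend to suggest stopping earlier than the true optimum.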





See also

  • Boosting (meta-algorithm)
  • Bootstrapping (statistics)
  • Cross-validation (statistics)
  • Random forest
  • Random subspace method (attribute bagging)


Source of article: Wikipedia