
Weighted bagging: a modification of AdaBoost from the perspective of importance sampling


We motivate the success of AdaBoost (ADA) in classification problems by appealing to an importance sampling perspective. Based on this insight, we propose the Weighted Bagging (WB) algorithm, a regularization method that naturally extends ADA to solve both classification and regression problems. WB uses one part of the available data to build models and a separate part to modify the weights of observations. The method is used with categorical and regression trees and is compared with ADA, Boosting, Bagging, Random Forest, and Support Vector Machine. We apply these methods to real data sets and report results of simulations; these applications and simulations demonstrate the effectiveness of WB.
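The abstract does not give the paper's exact procedure, but the core idea it states — build models on one part of the data and update importance weights on a separate part — can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' algorithm: the split scheme, the weight-update rule, and the constant base learner (used in place of a regression tree to keep the sketch tiny) are all assumptions.

```python
import random

def weighted_bagging_sketch(y, rounds=10, seed=0):
    """Illustrative sketch (NOT the paper's exact WB algorithm).

    Each round: split the observations into a model-building part and a
    weight-updating part; draw a weighted bootstrap sample from the
    model-building part (the importance-sampling step); fit a base learner;
    then raise the weights of held-out observations the current ensemble
    predicts poorly. The base learner here is a weighted mean (a constant
    predictor) standing in for a regression tree.
    """
    rng = random.Random(seed)
    n = len(y)
    w = [1.0 / n] * n          # importance weights over observations
    models = []
    for _ in range(rounds):
        # Assumed split scheme: half the data to fit, half to update weights.
        idx = list(range(n))
        rng.shuffle(idx)
        fit_idx, upd_idx = idx[: n // 2], idx[n // 2:]
        # Weighted bootstrap sample from the model-building part.
        sample = rng.choices(fit_idx,
                             weights=[w[i] for i in fit_idx],
                             k=len(fit_idx))
        # Base learner: mean of the sampled responses (toy stand-in for a tree).
        models.append(sum(y[i] for i in sample) / len(sample))
        # Assumed update rule: up-weight held-out points with large ensemble error.
        ens = sum(models) / len(models)
        for i in upd_idx:
            w[i] *= 1.0 + abs(y[i] - ens)
        total = sum(w)
        w = [wi / total for wi in w]   # renormalize to a distribution
    # Average the base learners for a regression prediction.
    return sum(models) / len(models)
```

Because the weight updates use only held-out observations, each base learner never sees the residuals it will later be judged on — this is the regularizing separation the abstract describes, in contrast to ADA, which reweights the same data it trains on.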

Keywords: AdaBoost; bagging; categorical and regression trees; ensemble learning; gradient-descent boosting

Document Type: Research Article

Affiliations: School of Public Health, Louisiana State University Health Science Center, New Orleans, LA, USA

Publication date: March 1, 2011
