• Media type: E-Article
  • Title: Boosting the Generalized Margin in Cost-Sensitive Multiclass Classification
  • Contributor: Wang, Junhui
  • Imprint: American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America, 2013
  • Published in: Journal of Computational and Graphical Statistics
  • Language: English
  • ISSN: 1061-8600
  • Keywords: Miscellany
  • Description: <p>The boosting algorithm is one of the most successful binary classification techniques, owing to its relative immunity to overfitting and its flexible implementation. Several attempts have been made to extend binary boosting to multiclass classification. In this article, a novel cost-sensitive multiclass boosting algorithm is proposed that naturally extends the popular binary AdaBoost algorithm and admits unequal misclassification costs. The proposed algorithm achieves superior classification performance by combining weak candidate models that need only be better than random guessing. More importantly, it achieves a large-margin separation of the training sample while attaining an L₁-norm constraint on the model complexity. Finally, the effectiveness of the proposed algorithm is demonstrated in a number of simulated and real-data experiments. Supplementary files, including the technical proofs, the implemented R code, and the real datasets, are available online.</p>
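The abstract describes an extension of the binary AdaBoost algorithm, in which weak learners that are only slightly better than random guessing are reweighted and combined into a large-margin classifier. As an illustrative sketch of that binary baseline only (not the article's cost-sensitive multiclass method, whose implementation is in the supplementary R code), standard AdaBoost with decision stumps can be written as:

```python
import numpy as np

def stump_predict(X, feature, threshold, sign):
    # Decision stump: predict +sign where X[:, feature] > threshold, else -sign.
    return np.where(X[:, feature] > threshold, sign, -sign)

def best_stump(X, y, w):
    # Exhaustive search over features, thresholds, and orientations
    # for the stump with the lowest weighted training error.
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for s in (-1, 1):
                err = np.sum(w[stump_predict(X, f, t, s) != y])
                if err < best_err:
                    best_err, best = err, (f, t, s)
    return best

def adaboost_fit(X, y, n_rounds=20):
    # Binary AdaBoost: labels y in {-1, +1}. Each round fits a weak
    # learner to the current weights, then upweights misclassified points.
    n = len(y)
    w = np.full(n, 1.0 / n)
    model = []
    for _ in range(n_rounds):
        f, t, s = best_stump(X, y, w)
        pred = stump_predict(X, f, t, s)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weak learner's vote weight
        w *= np.exp(-alpha * y * pred)         # upweight mistakes
        w /= w.sum()
        model.append((alpha, f, t, s))
    return model

def adaboost_predict(model, X):
    # Final classifier: sign of the weighted vote over all stumps.
    score = sum(a * stump_predict(X, f, t, s) for a, f, t, s in model)
    return np.sign(score)
```

A single stump cannot classify an interval-shaped class in one dimension, but three boosted stumps can, which illustrates the "weak learners combined into a strong one" principle the abstract invokes.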