
On the Dual Formulation of Boosting Algorithms

Chunhua Shen, Hanxi Li
arXiv ID: 0901.3590
Last updated: 5/30/2023
We study boosting algorithms from a new perspective. We show that the Lagrange dual problems of AdaBoost, LogitBoost, and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems. By looking at these dual problems, we show that the success of boosting can be understood in terms of maintaining a better margin distribution: maximizing margins while, at the same time, controlling the margin variance. We also theoretically prove that, approximately, AdaBoost maximizes the average margin rather than the minimum margin. The dual formulation also enables us to develop column generation based optimization algorithms, which are totally corrective. We show that they produce classification results almost identical to those of standard stage-wise additive boosting algorithms, but with much faster convergence rates. Therefore, fewer weak classifiers are needed to build the ensemble using our proposed optimization technique.
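As a concrete illustration of the primal-dual relationship the abstract refers to, the sketch below gives the standard soft-margin LPBoost primal and its Lagrange dual. This is the textbook formulation rather than a reproduction of the paper's derivations; the notation, with m training pairs (x_i, y_i), weak classifiers h_j, combination weights w_j, dual sample weights u_i, and soft-margin parameter \nu, is introduced here for illustration only.

Primal (soft-margin LPBoost):
\[
\max_{w,\,\rho,\,\xi}\ \ \rho - \frac{1}{\nu m}\sum_{i=1}^{m}\xi_i
\qquad \text{s.t.} \qquad
y_i \sum_{j} w_j h_j(x_i) \ge \rho - \xi_i,\quad \xi_i \ge 0,\quad w \ge 0,\quad \sum_{j} w_j = 1.
\]

Lagrange dual:
\[
\min_{u,\,\gamma}\ \ \gamma
\qquad \text{s.t.} \qquad
\sum_{i=1}^{m} u_i\, y_i\, h_j(x_i) \le \gamma \ \ \forall j,\qquad
0 \le u_i \le \frac{1}{\nu m},\qquad \sum_{i=1}^{m} u_i = 1.
\]

In the dual, u can be read as a distribution over the training samples and \(\sum_i u_i y_i h_j(x_i)\) as the edge of weak classifier h_j. As the abstract states, when the linear soft-margin loss is replaced by the exponential or logistic loss (AdaBoost, LogitBoost), the corresponding dual becomes an entropy maximization problem over u, roughly of the form \(\max_u -\sum_i u_i \log u_i\) subject to edge constraints of this kind. Column generation then alternates between re-solving the restricted problem over the weak classifiers selected so far (the totally corrective step) and adding the weak classifier whose edge constraint is most violated.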
