On the Bayes-risk consistency of regularized boosting methods
Publication Type:
Article
Authors:
Lugosi, G; Vayatis, N
Affiliations:
Pompeu Fabra University; Sorbonne Universite; Universite Paris Cite
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
Publication Date:
2004
Pages:
30-55
Keywords:
additive logistic regression
statistical view
classifiers
margin
Abstract:
The probability of error of classification methods based on convex combinations of simple base classifiers produced by boosting algorithms is investigated. The main result of the paper is that certain regularized boosting algorithms yield Bayes-risk consistent classifiers under the sole assumption that the Bayes classifier can be approximated by a convex combination of the base classifiers. Nonasymptotic, distribution-free bounds are also developed; these offer new insight into how boosting works and help explain its success in practical classification problems.
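To make the setting concrete, the sketch below illustrates the kind of classifier the abstract describes: a convex combination of simple base classifiers (decision stumps) fitted by AdaBoost-style rounds, with the coefficient vector rescaled so its l1-norm stays within a budget `lam`, mimicking regularization by restriction to a scaled convex hull. This is a minimal toy illustration on synthetic data, not the authors' algorithm or analysis; the stump search, round count, and `lam` are all illustrative choices.

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    # Base classifier: a decision stump returning +/-1.
    return sign * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def best_stump(X, y, w):
    # Exhaustive search for the stump minimizing weighted training error.
    best = None
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                err = np.sum(w[stump_predict(X, feat, thresh, sign) != y])
                if best is None or err < best[0]:
                    best = (err, feat, thresh, sign)
    return best

def regularized_boost(X, y, lam=2.0, rounds=10):
    # AdaBoost-style rounds; afterwards the coefficients are rescaled so
    # their l1-norm is at most lam (the combination lies in lam * convex hull).
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(rounds):
        err, feat, thresh, sign = best_stump(X, y, w)
        err = max(err, 1e-12)
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, feat, thresh, sign)
        w *= np.exp(-alpha * y * pred)  # exponential-loss reweighting
        w /= w.sum()
        stumps.append((feat, thresh, sign))
        alphas.append(alpha)
    alphas = np.array(alphas)
    if alphas.sum() > lam:
        alphas *= lam / alphas.sum()    # shrink onto the l1 budget
    def classify(Xq):
        s = np.zeros(len(Xq))
        for a, (feat, thresh, sign) in zip(alphas, stumps):
            s += a * stump_predict(Xq, feat, thresh, sign)
        return np.sign(s)
    return classify

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)  # toy linearly separable labels
clf = regularized_boost(X, y, lam=2.0, rounds=10)
acc = float(np.mean(clf(X) == y))
print(acc)
```

Note that rescaling all coefficients by a common positive factor does not change the sign of the combined score, so the l1 budget here affects the margin (and hence the generalization bounds the paper studies) rather than the training-set predictions.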