Boosting the margin: A new explanation for the effectiveness of voting methods
Publication type:
Article
Authors:
Schapire, RE; Freund, Y; Bartlett, P; Lee, WS
Affiliations:
AT&T; Australian National University; Australian Defence Force Academy; University of New South Wales Sydney
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
Publication date:
1998
Pages:
1651-1686
Keywords:
Approximation; networks; bounds; rates
Abstract:
One of the surprising recurring phenomena observed in experiments with boosting is that the test error of the generated classifier usually does not increase as its size becomes very large, and often is observed to decrease even after the training error reaches zero. In this paper, we show that this phenomenon is related to the distribution of margins of the training examples with respect to the generated voting classification rule, where the margin of an example is simply the difference between the number of correct votes and the maximum number of votes received by any incorrect label. We show that techniques used in the analysis of Vapnik's support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error. We also show theoretically and experimentally that boosting is especially effective at increasing the margins of the training examples. Finally, we compare our explanation to those based on the bias-variance decomposition.
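The margin definition in the abstract can be made concrete with a minimal sketch (not taken from the paper): assuming the base classifiers' votes for each training example are collected into a per-example normalized vote matrix, the margin is the vote for the correct label minus the largest vote received by any incorrect label. The function name `margins` and the `vote_matrix`/`labels` arguments below are illustrative choices, not notation from the article.

```python
import numpy as np

def margins(vote_matrix, labels):
    """Margin of each training example under a voting classifier.

    vote_matrix[i, k] is the (possibly weighted, normalized) vote that
    example i gives to label k; labels[i] is the correct label.
    The margin is the vote for the correct label minus the largest
    vote for any incorrect label, so it lies in [-1, 1] when each
    example's votes sum to 1.
    """
    n = len(labels)
    correct = vote_matrix[np.arange(n), labels]
    masked = vote_matrix.copy()
    masked[np.arange(n), labels] = -np.inf   # exclude the correct label
    best_wrong = masked.max(axis=1)
    return correct - best_wrong

# Toy usage: 3 examples, 3 labels, votes normalized per example.
votes = np.array([[0.7, 0.2, 0.1],
                  [0.3, 0.4, 0.3],
                  [0.5, 0.5, 0.0]])
y = np.array([0, 0, 1])
print(margins(votes, y))   # approximately [ 0.5, -0.1, 0.0 ]
```

A positive margin means the example is classified correctly, and larger margins indicate a more confident vote; the paper's analysis relates the distribution of these values over the training set to the test error of the voting classifier.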