Fast learning rates for plug-in classifiers

Result type:
Article
Authors:
Audibert, Jean-Yves; Tsybakov, Alexandre B.
Affiliations:
Institut Polytechnique de Paris; Ecole Nationale des Ponts et Chaussees; Sorbonne Universite
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/009053606000001217
Publication date:
2007
Pages:
608-633
Keywords:
Classification; Convergence
Abstract:
It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n^(-1/2). The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order n^(-1), and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that both conjectures are false. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than n^(-1). We establish minimax lower bounds showing that the obtained rates cannot be improved.
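To illustrate the plug-in principle the abstract refers to, here is a minimal sketch: estimate the regression function eta(x) = P(Y=1 | X=x) nonparametrically and plug it into the Bayes rule, predicting 1 when the estimate is at least 1/2. This sketch uses a simple k-nearest-neighbors estimate of eta for concreteness; the paper's own construction (and its choice of estimator and assumptions) differs, so treat this only as a generic example of a plug-in classifier.

```python
import numpy as np

def plugin_classify(X_train, y_train, X_test, k=5):
    """Plug-in classifier: estimate eta(x) = P(Y=1 | X=x) by a
    k-NN average of the labels, then apply the plug-in Bayes rule
    1{eta_hat(x) >= 1/2}.  Illustrative only; not the paper's estimator."""
    preds = []
    for x in X_test:
        # Distances from the query point to all training points.
        dists = np.linalg.norm(X_train - x, axis=1)
        # Indices of the k nearest neighbors.
        nn = np.argsort(dists)[:k]
        # k-NN regression estimate of eta(x).
        eta_hat = y_train[nn].mean()
        preds.append(1 if eta_hat >= 0.5 else 0)
    return np.array(preds)

# Usage on synthetic two-cluster data (hypothetical example data):
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(2.0, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
labels = plugin_classify(X, y, np.array([[0.0, 0.0], [2.0, 2.0]]), k=5)
```

The threshold 1/2 mirrors the Bayes classifier, so the excess risk of the plug-in rule is controlled by how well eta_hat approximates eta near the decision boundary, which is exactly where the margin (low noise) assumption enters.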