Risk bounds for statistical learning

Publication type:
Article
Authors:
Massart, Pascal; Nedelec, Elodie
Affiliation:
Universite Paris Saclay
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/009053606000000786
Publication year:
2006
Pages:
2326-2366
Keywords:
Concentration inequalities; empirical processes; convergence rates
Abstract:
We propose a general theorem providing upper bounds for the risk of an empirical risk minimizer (ERM). We focus mainly on the binary classification framework. We extend Tsybakov's analysis of the risk of an ERM under margin-type conditions by using concentration inequalities for conveniently weighted empirical processes. This allows us to deal with ways of measuring the size of a class of classifiers other than the entropy with bracketing used in Tsybakov's work. In particular, we derive new risk bounds for the ERM when the classification rules belong to some VC-class under margin conditions, and we discuss the optimality of these bounds in a minimax sense.
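For context, the ERM and the margin condition referred to in the abstract can be sketched as follows (standard formulations; the symbols are illustrative, not taken verbatim from the paper):

```latex
% Empirical risk minimizer over a class S of classifiers,
% given an i.i.d. sample (X_1, Y_1), ..., (X_n, Y_n) with Y_i in {0,1}:
\hat{s} \in \operatorname*{arg\,min}_{t \in S} \; \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{ t(X_i) \neq Y_i \}

% A margin-type condition keeps the regression function
% \eta(x) = \mathbb{P}(Y = 1 \mid X = x) away from the critical level 1/2,
% e.g. a condition with margin parameter h > 0:
|2\eta(x) - 1| \geq h \quad \text{for almost all } x.
```

Under such conditions, the excess risk of the ERM can decay faster than the classical distribution-free rates, which is the phenomenon the risk bounds in the paper quantify.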
Source URL: