Learning without concentration for general loss functions
Publication type:
Article
Author:
Mendelson, Shahar
Affiliations:
Technion Israel Institute of Technology; Australian National University
Journal:
PROBABILITY THEORY AND RELATED FIELDS
ISSN:
0178-8051
DOI:
10.1007/s00440-017-0784-y
Publication year:
2018
Pages:
459-502
Keywords:
rates
Abstract:
We study the performance of empirical risk minimization in prediction and estimation problems that are carried out in a convex class and relative to a sufficiently smooth convex loss function. The framework is based on the small-ball method and is therefore suited for heavy-tailed problems. Moreover, among its outcomes is that a well-chosen loss, calibrated to fit the noise level of the problem, negates some of the ill effects of outliers and boosts the confidence level, leading to a Gaussian-like behaviour even when the target random variable is heavy-tailed.