NONPARAMETRIC STOCHASTIC APPROXIMATION WITH LARGE STEP-SIZES
Type:
Article
Authors:
Dieuleveut, Aymeric; Bach, Francis
Affiliations:
Centre National de la Recherche Scientifique (CNRS); Universite PSL; Ecole Normale Superieure (ENS)
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/15-AOS1391
Publication year:
2016
Pages:
1363-1399
Keywords:
learning-theory
online
Abstract:
We consider the random-design least-squares regression problem within the reproducing kernel Hilbert space (RKHS) framework. Given a stream of independent and identically distributed input/output data, we aim to learn a regression function within an RKHS H, even if the optimal predictor (i.e., the conditional expectation) is not in H. In a stochastic approximation framework where the estimator is updated after each observation, we show that the averaged unregularized least-mean-squares algorithm (a form of stochastic gradient descent), given a sufficiently large step-size, attains optimal rates of convergence for a variety of regimes for the smoothness of the optimal prediction function and the functions in H. Our results also apply in the usual finite-dimensional setting of parametric least-squares regression, showing the adaptivity of our estimator to the spectral decay of the covariance matrix of the covariates.
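The averaged unregularized least-mean-squares recursion described in the abstract can be illustrated in the finite-dimensional setting. The sketch below, with hypothetical data and a hypothetical step-size `gamma`, runs constant-step-size SGD on the streaming least-squares objective and returns the average of the iterates, which is the estimator the paper analyzes; it is a minimal illustration, not the paper's experimental setup.

```python
import numpy as np

def averaged_lms(X, y, gamma):
    """Averaged unregularized least-mean-squares (constant step-size SGD).

    One pass over the stream (x_n, y_n):
        theta_n = theta_{n-1} - gamma * (<theta_{n-1}, x_n> - y_n) * x_n
    The returned estimator is the running average of the iterates theta_1..theta_n.
    """
    n, d = X.shape
    theta = np.zeros(d)
    theta_bar = np.zeros(d)
    for i in range(n):
        x = X[i]
        # unregularized stochastic gradient step on the squared loss
        theta = theta - gamma * (x @ theta - y[i]) * x
        # online update of the Polyak-Ruppert average
        theta_bar += (theta - theta_bar) / (i + 1)
    return theta_bar

# Synthetic i.i.d. stream (hypothetical): linear model plus Gaussian noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
theta_star = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ theta_star + 0.1 * rng.normal(size=2000)
est = averaged_lms(X, y, gamma=0.05)
```

The step-size is held constant rather than decayed; averaging is what damps the fluctuations of the last iterate, which is the "large step-size" regime the title refers to.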