BRIDGING THE GAP BETWEEN CONSTANT STEP SIZE STOCHASTIC GRADIENT DESCENT AND MARKOV CHAINS

Publication type:
Article
Authors:
Dieuleveut, Aymeric; Durmus, Alain; Bach, Francis
Affiliations:
Institut Polytechnique de Paris; Ecole Polytechnique; Centre National de la Recherche Scientifique (CNRS); Universite Paris Saclay; Universite PSL; Ecole Normale Superieure (ENS); Inria
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/19-AOS1850
Publication year:
2020
Pages:
1348-1382
Keywords:
convergence; approximation; algorithm; schemes; Kushner
Abstract:
We consider the minimization of a strongly convex objective function given access to unbiased estimates of its gradient, using stochastic gradient descent (SGD) with constant step size. While detailed analyses were previously available only for quadratic functions, we provide an explicit asymptotic expansion of the moments of the averaged SGD iterates that outlines the dependence on the initial conditions, the effect of the noise and the step size, as well as the lack of convergence in the general (nonquadratic) case. For this analysis we bring tools from Markov chain theory into the analysis of stochastic gradient descent. We then show that Richardson-Romberg extrapolation may be used to get closer to the global optimum, and we demonstrate empirical improvements of the new extrapolation scheme.
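
The scheme the abstract describes can be sketched in a few lines: run constant step size SGD with two step sizes, gamma and 2*gamma, average the iterates in each run (Polyak-Ruppert averaging), and combine the two averages as 2*xbar_gamma - xbar_2gamma so that the leading O(gamma) bias terms cancel. The sketch below is purely illustrative and assumes a toy one-dimensional strongly convex nonquadratic objective with additive Gaussian gradient noise; all function names, the objective, and the constants are hypothetical choices of ours, not the paper's experiments.

```python
import numpy as np

def grad(x):
    # Gradient of f(x) = 0.5*x**2 + log(cosh(x - 1)):
    # strongly convex (f''(x) = 1 + sech(x - 1)**2 >= 1) but nonquadratic,
    # so averaged constant-step SGD carries an O(gamma) bias.
    return x + np.tanh(x - 1.0)

def stoch_grad(x, rng, noise_std=1.0):
    # Unbiased gradient estimate: exact gradient plus centered noise.
    return grad(x) + noise_std * rng.normal()

def sgd_averaged(x0, gamma, n_iters, rng):
    # Constant step size SGD with Polyak-Ruppert averaging. The average
    # converges to the mean of the stationary distribution of the SGD
    # Markov chain, which here lies O(gamma) away from the optimum.
    x, x_bar = x0, 0.0
    for k in range(n_iters):
        x -= gamma * stoch_grad(x, rng)
        x_bar += (x - x_bar) / (k + 1)  # running average of the iterates
    return x_bar

def richardson_romberg(x0, gamma, n_iters, rng):
    # Richardson-Romberg extrapolation: combining the averages obtained
    # with step sizes gamma and 2*gamma as 2*xbar_g - xbar_2g cancels
    # the leading O(gamma) bias term, leaving an O(gamma**2) bias.
    xbar_g = sgd_averaged(x0, gamma, n_iters, rng)
    xbar_2g = sgd_averaged(x0, 2.0 * gamma, n_iters, rng)
    return 2.0 * xbar_g - xbar_2g

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Locate the true minimizer with a few Newton steps on the exact gradient.
    x_star = 0.0
    for _ in range(50):
        x_star -= grad(x_star) / (1.0 + 1.0 / np.cosh(x_star - 1.0) ** 2)
    gamma, n_iters = 0.1, 500_000
    xbar = sgd_averaged(0.0, gamma, n_iters, rng)
    xrr = richardson_romberg(0.0, gamma, n_iters, rng)
    print(f"|xbar - x*| = {abs(xbar - x_star):.2e}")  # O(gamma) bias
    print(f"|xrr  - x*| = {abs(xrr - x_star):.2e}")  # roughly O(gamma**2)
```

The bias constant in this toy example is driven by the third derivative of the objective at the optimum, which is why, as the abstract notes, the quadratic case shows no such bias and the nonquadratic case does; the extrapolated estimate should land noticeably closer to x_star than either plain average.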