ASYMPTOTIC BIAS OF STOCHASTIC GRADIENT SEARCH

Output Type:
Article
Authors:
Tadic, Vladislav B.; Doucet, Arnaud
Affiliations:
University of Bristol; University of Oxford
Journal:
ANNALS OF APPLIED PROBABILITY
ISSN:
1050-5164
DOI:
10.1214/16-AAP1272
Publication Date:
2017
Pages:
3255-3304
Keywords:
Convergence rate approximation
Abstract:
The asymptotic behavior of the stochastic gradient algorithm with biased gradient estimates is analyzed. Relying on arguments from dynamical systems theory (chain-recurrence) and differential geometry (the Yomdin theorem and Łojasiewicz inequalities), upper bounds on the asymptotic bias of this algorithm are derived. The results hold under mild conditions and cover a broad class of algorithms used in machine learning, signal processing and statistics.
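To illustrate the phenomenon the abstract describes, the toy sketch below (not the paper's analysis; the quadratic objective, bias value, and step sizes are illustrative assumptions) runs stochastic gradient descent on f(x) = x²/2 with gradient estimates corrupted by a constant bias. With unbiased estimates the iterates approach the minimizer x* = 0; with a persistent bias b they settle near a point offset from x* by an amount determined by b, which is the asymptotic bias the paper bounds.

```python
import random

def biased_sgd(x0, bias, steps=20000, seed=0):
    """SGD on f(x) = x^2/2 using gradient estimates that carry a
    constant bias plus zero-mean Gaussian noise.  The true gradient
    at x is simply x."""
    rng = random.Random(seed)
    x = x0
    for n in range(1, steps + 1):
        grad_est = x + bias + rng.gauss(0.0, 0.1)  # biased, noisy gradient
        x -= (1.0 / n) * grad_est                   # diminishing step size
    return x

# Unbiased estimates: iterates converge toward the minimizer x* = 0.
print(biased_sgd(1.0, bias=0.0))
# Biased estimates: iterates settle near -bias, away from x*.
print(biased_sgd(1.0, bias=0.5))
```

For this linear-gradient example the recursion has fixed point x = -bias, so the limiting error scales directly with the size of the bias; the paper establishes analogous upper bounds for much more general objectives and estimators.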