CONCENTRATION OF CONTRACTIVE STOCHASTIC APPROXIMATION: ADDITIVE AND MULTIPLICATIVE NOISE

Publication type:
Article
Authors:
Chen, Zaiwei; Maguluri, Siva Theja; Zubeldia, Martin
Affiliations:
Purdue University System; Purdue University; University System of Georgia; Georgia Institute of Technology; University of Minnesota System; University of Minnesota Twin Cities
Journal:
ANNALS OF APPLIED PROBABILITY
ISSN/ISBN:
1050-5164
DOI:
10.1214/24-AAP2143
Publication date:
2025
Pages:
1298-1352
Keywords:
Error bounds
Abstract:
In this paper, we establish maximal concentration bounds for the iterates generated by a stochastic approximation (SA) algorithm under a contractive operator with respect to some arbitrary norm (e.g., the ℓ∞-norm). We consider two settings where the iterates are potentially unbounded: SA with bounded multiplicative noise and SA with sub-Gaussian additive noise. Our maximal concentration inequalities state that the convergence error has a sub-Gaussian tail in the additive noise setting and a Weibull tail (which is faster than polynomial decay but can be slower than exponential decay) in the multiplicative noise setting. In addition, we provide an impossibility result showing that it is generally impossible to have sub-exponential tails under multiplicative noise. To establish the maximal concentration bounds, we develop a novel bootstrapping argument that involves bounding the moment-generating function of a modified version of the generalized Moreau envelope of the convergence error and constructing an exponential supermartingale to enable the use of Ville's maximal inequality. We demonstrate the applicability of our theoretical results in the context of linear SA and reinforcement learning.
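For orientation, the following is a minimal LaTeX sketch of the generic contractive SA recursion and the two noise settings the abstract refers to; the specific recursion form, norm, constants, and noise assumptions shown here are illustrative standard conventions, not quoted from the paper.

% Generic contractive SA recursion (assumed standard form, for illustration only):
% the operator F is a gamma-contraction with respect to an arbitrary norm \|\cdot\|_c.
\begin{align*}
  x_{k+1} &= x_k + \alpha_k \bigl( F(x_k) - x_k + w_k \bigr),
  \qquad \|F(x) - F(y)\|_c \le \gamma \|x - y\|_c, \quad \gamma \in (0,1),
\end{align*}
where $\|\cdot\|_c$ is an arbitrary norm (e.g., the $\ell_\infty$-norm), $\{\alpha_k\}$ are step sizes, and $x^*$ is the fixed point of $F$. In this illustrative setup, the two settings in the abstract correspond to:
\begin{itemize}
  \item \emph{Additive noise:} $\{w_k\}$ is a martingale-difference sequence with sub-Gaussian tails whose scale does not depend on $x_k$; the convergence error $\|x_k - x^*\|_c$ then exhibits a sub-Gaussian tail.
  \item \emph{Multiplicative noise:} the noise magnitude is bounded but scales with the iterate, e.g., $\|w_k\|_c \le c\,(1 + \|x_k\|_c)$ for some constant $c$; the convergence error then exhibits a Weibull tail, and sub-exponential tails are in general unattainable.
\end{itemize}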