Nonasymptotic Bounds for Stochastic Optimization With Biased Noisy Gradient Oracles

Document Type:
Article
Authors:
Bhavsar, Nirav; Prashanth, L. A.
Affiliations:
Indian Institute of Technology System (IIT System); Indian Institute of Technology (IIT) - Madras
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN:
0018-9286
DOI:
10.1109/TAC.2022.3159748
Publication Date:
2023
Pages:
1628-1641
Keywords:
Stochastic processes; optimization; estimation error; complexity theory; perturbation methods; noise measurement; computational modeling; biased gradient oracle; Gaussian smoothing; nonasymptotic bounds; simultaneous perturbation (SP); zeroth-order stochastic optimization
Abstract:
We introduce biased gradient oracles to capture a setting where the function measurements have an estimation error that can be controlled through a batch size parameter. Our proposed oracles are appealing in several practical contexts, for instance, risk measure estimation from a batch of independent and identically distributed samples, or simulation optimization, where the function measurements are biased due to computational constraints. In either case, increasing the batch size reduces the estimation error. We highlight the applicability of our biased gradient oracles in a risk-sensitive reinforcement learning setting. In the stochastic nonconvex optimization context, we analyze a variant of the randomized stochastic gradient algorithm with a biased gradient oracle. We quantify the convergence rate of this algorithm by deriving nonasymptotic bounds on its performance. Next, in the stochastic convex optimization setting, we derive nonasymptotic bounds for the last iterate of a stochastic gradient descent algorithm with a biased gradient oracle.
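The batch-controlled biased oracle described in the abstract can be sketched as follows. This is a minimal illustration, assuming a two-point Gaussian-smoothing estimator fed by batch-averaged noisy function measurements; it is not the authors' exact algorithm, and all function names and parameter values are hypothetical.

```python
import numpy as np


def biased_gradient_oracle(f_noisy, x, batch_size, delta, rng):
    """Two-point Gaussian-smoothing gradient estimate from noisy
    zeroth-order measurements. Each function value is averaged over
    `batch_size` noisy samples, so increasing the batch size reduces
    the estimation error, while the smoothing radius `delta`
    controls the smoothing bias."""
    u = rng.standard_normal(x.size)  # random perturbation direction
    f_plus = np.mean([f_noisy(x + delta * u) for _ in range(batch_size)])
    f_minus = np.mean([f_noisy(x - delta * u) for _ in range(batch_size)])
    return (f_plus - f_minus) / (2.0 * delta) * u


def sgd_with_biased_oracle(f_noisy, x0, steps, step_size,
                           batch_size, delta, seed=0):
    """Plain SGD driven by the biased oracle; returns the last iterate."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x -= step_size * biased_gradient_oracle(f_noisy, x, batch_size,
                                                delta, rng)
    return x
```

As a quick usage example, minimizing a noisy quadratic `f(x) = ||x - 1||^2 + noise` from the origin drives the last iterate close to the minimizer at `(1, 1)`, with the residual error governed by `delta` and `batch_size`.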
Source URL: