A Stochastic Differential Equation Perspective on Stochastic Convex Optimization
Publication Type:
Article; Early Access
Authors:
Maule, Rodrigo; Fadili, Jalal; Attouch, Hedy
Affiliations:
Universite de Caen Normandie; Centre National de la Recherche Scientifique (CNRS); Universite de Montpellier
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2022.0162
Publication Date:
2024
Keywords:
error bounds
descent methods
convergence
Abstract:
In this paper, we analyze the global and local behavior of gradient-like flows under stochastic errors, with the aim of solving convex optimization problems with noisy gradient input. We first study the unconstrained differentiable convex case, using a stochastic differential equation whose drift term is minus the gradient of the objective function and whose diffusion term is either bounded or square-integrable. In this context, under Lipschitz continuity of the gradient, our first main result shows almost sure convergence of the objective values and of the trajectory process toward a minimizer of the objective function. We also provide a comprehensive complexity analysis by establishing several new pointwise and ergodic convergence rates in expectation for the convex, strongly convex, and (local) Łojasiewicz cases. The latter involves a challenging local analysis that requires nontrivial arguments from measure theory. Then, we extend our study to the constrained case and, more generally, to nonsmooth problems. We show that several of our results have natural extensions obtained by replacing the gradient of the objective function with a cocoercive monotone operator. This makes it possible to obtain similar convergence results for optimization problems with an additive "smooth + nonsmooth" convex structure. Finally, we consider another extension of our results to nonsmooth optimization, based on the Moreau envelope.
Source URL:
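
Illustrative sketch: the abstract studies the stochastic gradient-flow SDE dX_t = -grad f(X_t) dt + sigma(t) dW_t with a bounded or square-integrable diffusion term. The Python snippet below is a minimal sketch, not the paper's method or analysis: it discretizes this SDE with an Euler-Maruyama scheme for an assumed strongly convex quadratic f and the assumed square-integrable diffusion magnitude sigma(t) = 1/(1+t); all names and parameter values are illustrative assumptions.

# Minimal Euler-Maruyama simulation of dX_t = -grad f(X_t) dt + sigma(t) dW_t
# for the convex quadratic f(x) = 0.5 * x^T A x (minimum value 0 at x = 0).
import numpy as np

rng = np.random.default_rng(0)

def grad_f(x, A):
    # Gradient of f(x) = 0.5 * x^T A x.
    return A @ x

def simulate(T=50.0, dt=1e-2, d=5):
    A = np.diag(np.linspace(0.5, 2.0, d))  # positive definite, so f is strongly convex
    x = rng.normal(size=d)                 # random initial point X_0
    n_steps = int(T / dt)
    f_vals = np.empty(n_steps)
    for k in range(n_steps):
        t = k * dt
        sigma = 1.0 / (1.0 + t)            # square-integrable diffusion magnitude (assumed)
        dW = rng.normal(scale=np.sqrt(dt), size=d)  # Brownian increment over [t, t + dt]
        x = x - dt * grad_f(x, A) + sigma * dW      # Euler-Maruyama step
        f_vals[k] = 0.5 * x @ A @ x
    return f_vals

vals = simulate()
print(f"f(X_0) ~ {vals[0]:.4f}")
print(f"f(X_T) ~ {vals[-1]:.6f}  (near the minimum value 0, consistent with a.s. convergence)")

Under the vanishing-noise assumption used here, the simulated objective values decay toward the minimum, which is the qualitative behavior the abstract's almost sure convergence result describes; the paper's quantitative pointwise and ergodic rates are not reproduced by this sketch.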