A Momentum-Based Linearized Augmented Lagrangian Method for Nonconvex Constrained Stochastic Optimization

Publication type:
Article; Early Access
Authors:
Shi, Qiankun; Wang, Xiao; Wang, Hao
Affiliations:
Sun Yat Sen University; ShanghaiTech University
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2022.0193
Publication date:
2025
Keywords:
gradient-method
Abstract:
Nonconvex constrained stochastic optimization has emerged in many important application areas. Subject to general functional constraints, it minimizes the sum of an expectation function and a nonsmooth regularizer. The main challenges arise from the stochasticity of the random integrand and the possibly nonconvex functional constraints. To address these issues, we propose a momentum-based linearized augmented Lagrangian method (MLALM). MLALM adopts a single-loop framework and incorporates a recursive momentum scheme to compute the stochastic gradient, which enables the construction of a stochastic approximation to the augmented Lagrangian function. We provide a global convergence analysis of MLALM. Under mild conditions and with unbounded penalty parameters, we show that the sequences of the average stationarity measure and constraint violations converge in expectation. Under a constraint qualification assumption, the sequences of the average constraint violation and complementary slackness measure converge to zero in expectation. We also explore the properties of these metrics when the penalty parameters are bounded. Furthermore, we investigate the oracle complexity of MLALM, measured by the total number of stochastic gradient evaluations, for finding an epsilon-stationary point and an epsilon-Karush-Kuhn-Tucker point under the constraint qualification. Numerical experiments on two types of test problems show promising performance of the proposed algorithm.
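The sketch below illustrates, under stated assumptions, the two ingredients named in the abstract: a recursive momentum estimator of the stochastic gradient and a proximal step on an augmented Lagrangian with the constraint linearized at the current iterate. It is not the authors' MLALM; the toy objective, constraint, step-size and momentum schedules, and penalty update are all illustrative assumptions.

```python
import numpy as np

# Hypothetical problem pieces (not from the paper): a smooth stochastic objective,
# one smooth inequality constraint c(x) <= 0, and an l1 regularizer handled by prox.
def grad_f_sample(x, xi):        # stochastic gradient of the expectation term (toy quadratic)
    return x - xi

def c(x):                        # illustrative constraint function
    return float(np.sum(x**2) - 1.0)

def grad_c(x):
    return 2.0 * x

def prox_l1(v, t, lam=0.01):     # proximal map of the nonsmooth l1 regularizer
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

rng = np.random.default_rng(0)
x = np.zeros(5)
x_prev = x.copy()
y, rho = 0.0, 1.0                                  # multiplier and penalty parameter
d = grad_f_sample(x, rng.standard_normal(5))       # initial gradient estimate

for k in range(1, 200):
    xi = rng.standard_normal(5)

    # Recursive momentum (STORM-style) gradient estimator:
    # d_k = grad_f(x_k; xi_k) + (1 - beta_k) * (d_{k-1} - grad_f(x_{k-1}; xi_k)).
    beta = 1.0 / (k + 1) ** (2.0 / 3.0)
    d = grad_f_sample(x, xi) + (1.0 - beta) * (d - grad_f_sample(x_prev, xi))

    # Gradient of the augmented Lagrangian in x, with the stochastic estimate d
    # standing in for the true objective gradient.
    g_al = d + max(y + rho * c(x), 0.0) * grad_c(x)

    # Linearized proximal step on the regularizer, then multiplier and penalty updates.
    eta = 0.5 / k ** 0.5
    x_prev = x.copy()
    x = prox_l1(x - eta * g_al, eta)
    y = max(y + rho * c(x), 0.0)
    rho = min(rho * 1.01, 1e4)   # growing penalty; the paper's analysis allows unbounded penalties
```

The single-loop structure mirrors what the abstract describes: each iteration draws one stochastic sample, refreshes the momentum estimate, and takes one primal proximal step followed by dual and penalty updates, with no inner subproblem loop.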
Source URL: