Efficiency of Stochastic Coordinate Proximal Gradient Methods on Nonseparable Composite Optimization
Publication Type:
Article
Authors:
Necoara, Ion; Chorobura, Flavia
Affiliations:
National University of Science & Technology POLITEHNICA Bucharest; Romanian Academy
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2023.0044
Publication Date:
2025
Keywords:
convex optimization
descent algorithm
convergence
parallel
Abstract:
This paper deals with composite optimization problems whose objective function is the sum of two terms: the first has a Lipschitz continuous gradient along random subspaces and may be nonconvex, and the second is simple and differentiable but possibly nonconvex and nonseparable. Under these settings, we design a stochastic coordinate proximal gradient method that takes into account the nonseparable composite form of the objective function. This algorithm achieves scalability by constructing, at each iteration, a local approximation model of the whole nonseparable objective function along a random subspace of user-determined dimension. We outline efficient techniques for selecting the random subspace, yielding an implementation with low cost per iteration that also achieves fast convergence rates. We present a probabilistic worst-case complexity analysis for our stochastic coordinate proximal gradient method in convex and nonconvex settings; in particular, we prove high-probability bounds on the number of iterations before a given optimality criterion is satisfied. Extensive numerical results also confirm the efficiency of our algorithm.
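To make the iteration described in the abstract concrete, the following Python sketch performs one step of a stochastic coordinate proximal gradient method in the simplest setting, where the random subspace is a random coordinate block of size tau. The names (`scpg_step`, `grad_f`, `psi`, `tau`, `L_block`) and the toy problem are illustrative assumptions, not the authors' implementation: the nonseparable term psi is kept exact inside the low-dimensional subproblem, while the smooth term f is replaced by a quadratic upper model along the block.

```python
import numpy as np
from scipy.optimize import minimize

def scpg_step(x, grad_f, psi, rng, tau=2, L_block=1.0):
    """One illustrative stochastic coordinate proximal gradient step.

    Samples a random coordinate block S of size tau and minimizes the
    local model  <grad_S f(x), d> + (L_block/2)*||d||^2 + psi(x + U_S d)
    over the tau-dimensional direction d. The nonseparable term psi is
    evaluated exactly, so only a tau-dimensional subproblem is solved.
    """
    n = x.size
    block = rng.choice(n, size=tau, replace=False)  # random subspace = coordinate block
    g = grad_f(x)[block]                            # partial gradient along the block

    def model(d):
        x_trial = x.copy()
        x_trial[block] += d
        return g @ d + 0.5 * L_block * d @ d + psi(x_trial)

    d_star = minimize(model, np.zeros(tau)).x       # small tau-dimensional solve
    x_new = x.copy()
    x_new[block] += d_star
    return x_new

# Toy usage (hypothetical problem): f(x) = 0.5*||x - b||^2 is smooth,
# psi(x) = 0.5*||A x||^2 is simple and differentiable but nonseparable.
rng = np.random.default_rng(0)
n = 10
b = rng.standard_normal(n)
A = rng.standard_normal((5, n))
grad_f = lambda x: x - b
psi = lambda x: 0.5 * np.linalg.norm(A @ x) ** 2

x = np.zeros(n)
for _ in range(200):
    x = scpg_step(x, grad_f, psi, rng, tau=2, L_block=1.0)
print("objective:", 0.5 * np.linalg.norm(x - b) ** 2 + psi(x))
```

Since psi enters the model exactly, choosing L_block at least as large as the Lipschitz constant of the partial gradient of f along the block (here 1) makes the model an upper bound on the objective along the subspace, so each step cannot increase the objective; the per-iteration cost is governed only by the user-chosen dimension tau.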