Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems

Publication type:
Article
Authors:
Latafat, Puya; Themelis, Andreas; Patrinos, Panagiotis
Affiliations:
KU Leuven; Kyushu University
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-020-01599-7
Publication year:
2022
Pages:
195-224
Keywords:
primal-dual algorithm; descent method; optimization; convergence; minimization
Abstract:
This paper analyzes block-coordinate proximal gradient methods for minimizing the sum of a separable smooth function and a (nonseparable) nonsmooth function, both of which are allowed to be nonconvex. The main tool in our analysis is the forward-backward envelope, which serves as a particularly suitable continuous and real-valued Lyapunov function. Global and linear convergence results are established when the cost function satisfies the Kurdyka-Łojasiewicz property, without imposing convexity requirements on the smooth function. Two prominent special cases of the investigated setting are regularized finite sum minimization and the sharing problem; in particular, an immediate byproduct of our analysis is novel convergence results and rates for the popular Finito/MISO algorithm in the nonsmooth and nonconvex setting with very general sampling strategies.
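For illustration, below is a minimal sketch of one block-coordinate forward-backward (proximal gradient) update in the structured setting the abstract describes, F(x) = sum_i f_i(x_i) + g(x). The quadratic blocks f_i, the L1 regularizer (chosen separable here only so the prox stays simple, whereas the paper allows nonseparable g), the fixed step size gamma, and the uniform block sampling are assumptions of this sketch, not the algorithm or parameters analyzed in the paper.

```python
import numpy as np

# Sketch of block-coordinate forward-backward updates for
# F(x) = sum_i f_i(x_i) + g(x); illustrative assumptions only.

rng = np.random.default_rng(0)

N, d = 5, 4                                  # number of blocks, block size
A = [rng.standard_normal((d, d)) for _ in range(N)]
b = [rng.standard_normal(d) for _ in range(N)]
lam, gamma = 0.1, 0.05                       # regularization weight, step size (assumed)

def grad_f_block(i, xi):
    """Gradient of the smooth block f_i(x_i) = 0.5 * ||A_i x_i - b_i||^2."""
    return A[i].T @ (A[i] @ xi - b[i])

def prox_g(z, t):
    """Prox of g = lam * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)

x = [np.zeros(d) for _ in range(N)]
for _ in range(200):
    i = rng.integers(N)                              # sample one block uniformly
    forward = x[i] - gamma * grad_f_block(i, x[i])   # forward (gradient) step on block i
    x[i] = prox_g(forward, gamma)                    # backward (proximal) step on block i
```

With a nonseparable g the proximal step couples the blocks, which is the regime targeted by the paper's forward-backward envelope analysis.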