Nonlinear acceleration of momentum and primal-dual algorithms

Publication type:
Article
Authors:
Bollapragada, Raghu; Scieur, Damien; D'Aspremont, Alexandre
Affiliations:
University of Texas System; University of Texas at Austin; Inria; Centre National de la Recherche Scientifique (CNRS); CNRS - Institute for Information Sciences & Technologies (INS2I); Université PSL; École Normale Supérieure (ENS)
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-022-01775-x
Publication date:
2023
Pages:
325-362
Keywords:
Anderson acceleration; convergence; field of values; GMRES
Abstract:
We describe convergence acceleration schemes for multistep optimization algorithms where the underlying fixed-point operator is not symmetric. In particular, our analysis handles algorithms with momentum terms such as Nesterov's accelerated method or primal-dual methods. The acceleration technique combines previous iterates through a weighted sum, whose coefficients are computed via a simple linear system. We analyze performance in both online and offline modes, and we study in particular a variant of Nesterov's method that uses nonlinear acceleration at each iteration. We use Crouzeix's conjecture to show that acceleration performance is controlled by the solution of a Chebyshev problem on the numerical range of a non-symmetric operator modeling the behavior of iterates near the optimum. Numerical experiments are detailed on logistic regression problems.
Source URL:
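The abstract describes combining previous iterates through a weighted sum whose coefficients are obtained from a simple linear system built on the residuals. Below is a minimal sketch of that kind of extrapolation scheme; the helper name `nonlinear_acceleration`, the Tikhonov regularization parameter `reg`, and the toy quadratic example are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def nonlinear_acceleration(iterates, reg=1e-8):
    """Combine iterates x_0, ..., x_k through a weighted sum whose weights
    solve a small (regularized) linear system built from the residuals
    r_i = x_{i+1} - x_i. `reg` is an assumed Tikhonov regularization."""
    X = np.array(iterates)                                  # shape (k+1, d)
    R = np.diff(X, axis=0)                                  # residuals, shape (k, d)
    k = R.shape[0]
    # Regularized Gram matrix of residuals keeps the system well posed.
    RR = R @ R.T + reg * np.linalg.norm(R) ** 2 * np.eye(k)
    z = np.linalg.solve(RR, np.ones(k))
    c = z / z.sum()                                         # weights summing to one
    # Extrapolated point: weighted combination of the stored iterates.
    return c @ X[:k]

# Toy usage (assumed example): accelerate gradient descent on a quadratic
# f(x) = 0.5 * x^T A x - b^T x, then compare residual norms.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20))
A = A.T @ A / 20 + np.eye(20)
b = rng.standard_normal(20)
x = np.zeros(20)
step = 1.0 / np.linalg.norm(A, 2)
iterates = [x.copy()]
for _ in range(10):
    x = x - step * (A @ x - b)
    iterates.append(x.copy())
x_acc = nonlinear_acceleration(iterates)
print(np.linalg.norm(A @ x_acc - b), "vs", np.linalg.norm(A @ x - b))
```

In this sketch the extrapolation is applied offline, after the iterates are collected; the paper's online variant would instead restart the base method from the extrapolated point at each step.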