From differential equation solvers to accelerated first-order methods for convex optimization
Publication type:
Article
Authors:
Luo, Hao; Chen, Long
Affiliations:
Sichuan University; University of California System; University of California Irvine
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-021-01713-3
Publication year:
2022
Pages:
735-781
Keywords:
heavy ball
system
algorithms
gradient
Abstract:
Convergence analysis of accelerated first-order methods for convex optimization problems is developed from the point of view of ordinary differential equation (ODE) solvers. A new dynamical system, called Nesterov accelerated gradient (NAG) flow, is derived from the connection between the acceleration mechanism and the A-stability of ODE solvers, and the exponential decay of a tailored Lyapunov function along the solution trajectory is proved. Numerical discretizations of NAG flow are then considered, and convergence rates are established via a discrete Lyapunov function. The proposed differential-equation-solver approach not only covers existing accelerated methods, such as FISTA, Güler's proximal algorithm, and Nesterov's accelerated gradient method, but also produces new algorithms for composite convex optimization that possess accelerated convergence rates. Both the convex and the strongly convex cases are handled in a unified way.
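As a rough illustration of the acceleration phenomenon the abstract refers to, the sketch below runs the classical Nesterov accelerated gradient method, which the paper recovers as a special case, against plain gradient descent on an ill-conditioned convex quadratic. This is not the paper's NAG-flow discretization; the problem instance, step size, and momentum schedule are standard textbook choices chosen here for illustration.

```python
import numpy as np

def grad(A, x):
    # Gradient of f(x) = 0.5 * x^T A x
    return A @ x

def nesterov(A, x0, steps):
    # Classical Nesterov accelerated gradient with the t_k momentum schedule,
    # achieving an O(1/k^2) rate on smooth convex problems.
    L = np.linalg.eigvalsh(A).max()          # Lipschitz constant of the gradient
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        x_next = y - grad(A, y) / L          # gradient step from extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

def gradient_descent(A, x0, steps):
    # Plain gradient descent with step 1/L, an O(1/k) method for comparison.
    L = np.linalg.eigvalsh(A).max()
    x = x0.copy()
    for _ in range(steps):
        x = x - grad(A, x) / L
    return x

rng = np.random.default_rng(0)
A = np.diag(np.linspace(0.01, 1.0, 50))      # ill-conditioned convex quadratic
x0 = rng.standard_normal(50)
f = lambda x: 0.5 * x @ A @ x                # minimum value is 0 at x = 0

f_nag = f(nesterov(A, x0, 200))
f_gd = f(gradient_descent(A, x0, 200))
print(f_nag < f_gd)  # the accelerated iterate reaches a lower objective
```

On this instance the accelerated method reaches a markedly lower objective value after the same number of gradient evaluations, which is the qualitative behavior the paper's convergence rates quantify.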