Fast Convex Optimization via Time Scale and Averaging of the Steepest Descent

Publication Type:
Article; Early Access
Authors:
Attouch, Hedy; Bot, Radu Ioan; Nguyen, Dang-Khoa
Affiliations:
Universite de Montpellier; Centre National de la Recherche Scientifique (CNRS); University of Vienna; Vietnam National University Ho Chi Minh City (VNUHCM) System
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2023.0186
Publication Date:
2024
Keywords:
monotone inclusions; dynamical system; inertial dynamics; Newton method; convergence; equation
Abstract:
In a Hilbert setting, we develop a gradient-based dynamic approach for the fast solution of convex optimization problems. By applying time scaling, averaging, and perturbation techniques to the continuous steepest descent (SD), we obtain high-resolution ordinary differential equations of the Nesterov and Ravine methods. These dynamics involve asymptotically vanishing viscous damping and Hessian-driven damping (in either explicit or implicit form). The mathematical analysis does not require a Lyapunov analysis for inertial systems: we simply exploit classical convergence results for SD and its externally perturbed version, together with tools of differential and integral calculus, including Jensen's inequality. The method is flexible, and, by way of illustration, we show how it applies starting from other important dynamics in optimization. We consider the case in which the initial dynamic is the regularized Newton method, then the case in which the starting dynamic is the differential inclusion associated with a convex lower semicontinuous potential, and finally we show that the technique extends naturally to monotone cocoercive operators. Our approach leads to parallel algorithmic results, which we study for fast gradient and proximal algorithms. The averaging technique reveals new links between the Nesterov and Ravine methods.
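Illustration (a minimal sketch, not taken verbatim from the paper): the following LaTeX fragment works out how time scaling and averaging can turn SD into a Nesterov-type equation; the scaling tau(t) = t^2/(2(alpha-1)) and the averaging weight (alpha-1)/t are assumed choices for illustration.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Start from the continuous steepest descent (SD).
\[
  \dot{z}(s) + \nabla f(z(s)) = 0 .
\]
% Time scaling: evaluate the trajectory on an accelerated clock.
Set $y(t) := z(\tau(t))$ with $\tau(t) = \frac{t^{2}}{2(\alpha-1)}$, so that
\[
  \dot{y}(t) + \dot{\tau}(t)\,\nabla f(y(t)) = 0, \qquad \dot{\tau}(t) = \frac{t}{\alpha-1}.
\]
% Averaging: x is a convex (weighted) average of the rescaled trajectory.
Define $x$ by $\dot{x}(t) = \frac{\alpha-1}{t}\bigl(y(t) - x(t)\bigr)$, that is,
$y(t) = x(t) + \frac{t}{\alpha-1}\,\dot{x}(t)$. Differentiating this identity and
substituting the scaled equation yields
\[
  \ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t)
  + \nabla f\Bigl(x(t) + \frac{t}{\alpha-1}\,\dot{x}(t)\Bigr) = 0,
\]
a Nesterov-type equation with asymptotically vanishing viscous damping $\alpha/t$
and Hessian-driven damping in implicit form. Since $x(t)$ is a convex average of
values of $z$, Jensen's inequality transfers the SD rate $f(z(s)) - \min f = O(1/s)$
to $f(x(t)) - \min f = O(1/t^{2})$ for suitable $\alpha$.
\end{document}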
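On the algorithmic side, the abstract mentions parallel results for fast gradient algorithms. The Python sketch below shows a generic Nesterov-type fast gradient iteration of the kind at stake; the helper name nesterov_fgm, the momentum coefficient k/(k + alpha), and the quadratic test problem are illustrative assumptions, not the paper's exact scheme.

import numpy as np

def nesterov_fgm(grad_f, x0, step, alpha=3.0, iters=200):
    # Nesterov-type fast gradient sketch (illustrative assumption).
    # grad_f: gradient of a convex, L-smooth f; step: step size, e.g. 1/L;
    # alpha: friction parameter, the discrete analogue of the alpha/t damping.
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        # Extrapolation with vanishing momentum coefficient k/(k + alpha).
        y = x + (k / (k + alpha)) * (x - x_prev)
        x_prev = x
        # Gradient step at the extrapolated point; evaluating grad_f at y
        # (rather than x) is often read as implicit Hessian-driven damping.
        x = y - step * grad_f(y)
    return x

# Usage on a simple quadratic f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)  # Lipschitz constant of grad
x_star = nesterov_fgm(grad, np.zeros(2), step=1.0 / L)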
Source URL: