Generalized Gradient Flows With Provable Fixed-Time Convergence and Fast Evasion of Non-Degenerate Saddle Points
Publication Type:
Article
Authors:
Baranwal, Mayank; Budhraja, Param; Raj, Vishal; Hota, Ashish R.
Affiliations:
Tata Sons; Tata Consultancy Services Limited (TCS); Boston University; Indian Institute of Technology System (IIT System); Indian Institute of Technology (IIT) - Kharagpur
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN/ISBN:
0018-9286
DOI:
10.1109/TAC.2023.3328510
Publication Date:
2024
Pages:
2281-2293
Keywords:
convergence
heuristic algorithms
stability criteria
dynamical systems
cost function
training
neural networks
accelerated optimization
continuous-time optimization
fixed-time convergence
minimax problem
saddle point evasion
Abstract:
Gradient-based first-order convex optimization algorithms find widespread applicability in a variety of domains, including machine learning tasks. Motivated by recent advances in the fixed-time stability theory of continuous-time dynamical systems, we introduce a generalized framework for designing accelerated optimization algorithms with the strongest convergence guarantees, which further extend to a subclass of nonconvex functions. In particular, we introduce the GenFlow algorithm and its momentum variant, which provably converge within a fixed time to the optimal solution of objective functions satisfying the Polyak-Łojasiewicz inequality. Moreover, for functions that admit nondegenerate saddle points, we show that the time required by the proposed GenFlow algorithm to evade these saddle points is uniformly bounded over all initial conditions. Finally, for strongly convex-strongly concave minimax problems whose optimal solution is a saddle point, a similar scheme is shown to reach the optimal solution, again within a fixed time. The superior convergence properties of our algorithm are validated experimentally on a variety of benchmark datasets.
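For context, a minimal sketch of how fixed-time convergence typically follows from the Polyak-Łojasiewicz (PL) inequality in this literature. The dynamics below are a representative fixed-time gradient flow built from standard fixed-time stability arguments (a Lyapunov inequality of the Polyakov type); the parameters c_1, c_2, p, q, mu are illustrative, and this is not necessarily the exact form of the GenFlow dynamics.

% PL inequality with modulus mu > 0:
\[
\tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu \,\bigl(f(x) - f^\star\bigr)
\]
% A representative fixed-time gradient flow (c_1, c_2 > 0, 0 < p < 1 < q):
\[
\dot{x} \;=\; -\,c_1\,\|\nabla f(x)\|^{2p-2}\,\nabla f(x) \;-\; c_2\,\|\nabla f(x)\|^{2q-2}\,\nabla f(x)
\]
% Along this flow, with V = f(x) - f^\star, the PL inequality gives
\[
\dot{V} \;=\; -\,c_1\,\|\nabla f\|^{2p} - c_2\,\|\nabla f\|^{2q} \;\le\; -\,c_1 (2\mu V)^{p} - c_2 (2\mu V)^{q},
\]
% which, by the standard fixed-time stability lemma, yields the uniform settling-time bound
\[
T^\star \;\le\; \frac{1}{c_1 (2\mu)^{p}\,(1-p)} \;+\; \frac{1}{c_2 (2\mu)^{q}\,(q-1)}.
\]

The bound T* holds for every initial condition x(0), which is the defining feature of fixed-time (as opposed to merely finite-time) convergence: the settling time is uniformly bounded regardless of how far the trajectory starts from the optimum.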