Fixed-Time Stable Gradient Flows: Applications to Continuous-Time Optimization

Publication type:
Article
Authors:
Garg, Kunal; Panagou, Dimitra
Affiliations:
University of Michigan System; University of Michigan
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN/ISBN:
0018-9286
DOI:
10.1109/TAC.2020.3001436
Publication year:
2021
Pages:
2002-2015
Keywords:
Optimization; Convergence; Linear programming; Convex functions; Stability analysis; Heuristic algorithms; Newton method; Finite and fixed-time stability; Nonlinear systems
Abstract:
Continuous-time optimization is currently an active field of research in optimization theory; prior work in this area has yielded useful insights and elegant methods for proving stability and convergence properties of continuous-time optimization algorithms. This article proposes novel gradient-flow schemes that yield convergence to the optimal point of a convex optimization problem within a fixed time from any given initial condition, for unconstrained optimization, constrained optimization, and min-max problems. It is shown that the solution of the modified gradient-flow dynamics exists and is unique under certain regularity conditions on the objective function, while fixed-time convergence to the optimal point is shown via Lyapunov-based analysis. The application of the modified gradient flow to unconstrained optimization problems is studied under the assumption of gradient dominance, a relaxation of strong convexity. Then, a modified Newton's method is presented that exhibits fixed-time convergence under some mild conditions on the objective function. Building upon this method, a novel technique for solving convex optimization problems with linear equality constraints is developed that yields convergence to the optimal point in fixed time. Finally, the general min-max problem is considered, and modified saddle-point dynamics are developed to obtain the optimal solution in fixed time.
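
Illustration (not from the article): the sketch below numerically integrates a fixed-time-style gradient flow on a strongly convex quadratic, to give a concrete feel for the abstract's central idea of rescaling the gradient by powers of its norm. The specific two-term vector field, the exponents a and b, the gains c1 and c2, and the forward-Euler discretization are all illustrative assumptions here, not the authors' exact scheme or parameter choices.

```python
# Minimal sketch (assumptions, not the article's scheme): forward-Euler simulation of
# a fixed-time-style gradient flow
#     x_dot = -c1 * g / ||g||^(1-a) - c2 * g / ||g||^(1-b),   g = grad f(x),
# with illustrative exponents 0 < a < 1 < b and gains c1, c2 > 0. The exponent-a term
# dominates near the optimum (finite-time behavior); the exponent-b term dominates far
# from it, which is what keeps the convergence time bounded uniformly in x(0).
import numpy as np

def grad_f(x, Q, p):
    """Gradient of the convex quadratic f(x) = 0.5 x^T Q x + p^T x."""
    return Q @ x + p

def fixed_time_flow(x0, Q, p, c1=1.0, c2=1.0, a=0.5, b=1.5, dt=1e-3, steps=20000):
    x = x0.astype(float)
    for _ in range(steps):
        g = grad_f(x, Q, p)
        n = np.linalg.norm(g)
        if n < 1e-12:          # numerically at the optimum
            break
        x -= dt * (c1 * g / n**(1.0 - a) + c2 * g / n**(1.0 - b))
    return x

if __name__ == "__main__":
    Q = np.array([[3.0, 0.5], [0.5, 2.0]])   # positive definite -> strongly convex f
    p = np.array([-1.0, 2.0])
    x_star = np.linalg.solve(Q, -p)          # analytic minimizer for comparison
    x_hat = fixed_time_flow(np.array([50.0, -40.0]), Q, p)
    print("distance to optimum:", np.linalg.norm(x_hat - x_star))
```

In this toy run the iterate reaches a small neighborhood of the minimizer from a far-away initial condition within the simulated horizon; the discretization error of forward Euler means the exact fixed-time guarantee of the continuous-time flow only holds approximately after time discretization.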