Timescale Separation in Autonomous Optimization
Publication Type:
Article
Authors:
Hauswirth, Adrian; Bolognani, Saverio; Hug, Gabriela; Dorfler, Florian
Affiliations:
Swiss Federal Institutes of Technology Domain; ETH Zurich
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN/ISBN:
0018-9286
DOI:
10.1109/TAC.2020.2989274
Publication Date:
2021
Pages:
611-624
Keywords:
Closed-loop systems
gradient methods
optimization
Abstract:
Autonomous optimization refers to the design of feedback controllers that steer a physical system to a steady state that solves a predefined, possibly constrained, optimization problem. As such, no exogenous control inputs such as set points or trajectories are required. Instead, these controllers are modeled after optimization algorithms that take the form of dynamical systems. The interconnection of such optimization dynamics with a physical system is, however, not guaranteed to be stable unless both dynamics act on sufficiently different timescales. In this paper, we quantify the required timescale separation and give prescriptions that can be used directly in the design of such feedback controllers. Using ideas from singular perturbation analysis, we derive stability bounds for different feedback laws that are based on common continuous-time optimization schemes. In particular, we consider gradient descent and its variations, including projected gradient and Newton gradient descent. We further give stability bounds for momentum methods and saddle-point flows. Finally, we discuss how optimization algorithms such as subgradient and accelerated gradient descent, while well-behaved in offline settings, are unsuitable for autonomous optimization due to their general lack of robustness.
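The idea of interconnecting optimization dynamics with a plant, and the role of timescale separation, can be illustrated with a minimal sketch. This toy example is not from the paper: it assumes a scalar linear plant x' = -a(x - u) whose steady-state map is h(u) = u, and a gradient-feedback controller u' = -eps * grad_phi(x) that uses the measured plant state in place of the steady state. The names `simulate`, `grad_phi`, `eps`, and `x_star` are illustrative choices.

```python
import numpy as np

def simulate(eps, a=10.0, x_star=3.0, T=200.0, dt=1e-3):
    """Euler-integrate the plant/controller interconnection.

    eps : controller gain (sets the optimization timescale)
    a   : plant pole (sets the physical timescale)
    """
    grad_phi = lambda x: x - x_star        # phi(x) = 0.5 * (x - x_star)^2
    x, u = 0.0, 0.0
    for _ in range(int(T / dt)):
        x += dt * (-a * (x - u))           # fast plant dynamics
        u += dt * (-eps * grad_phi(x))     # slow gradient-feedback dynamics
    return x

# With eps much smaller than a, the controller acts on a slower
# timescale than the plant, and the closed loop settles at the
# minimizer x_star of phi.
x_final = simulate(eps=0.5)
print(round(x_final, 3))
```

Shrinking the ratio eps/a increases the timescale separation; the paper's contribution is to quantify how large this separation must be for stability of the interconnection, rather than assuming it is "sufficient" as in this sketch.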