On Centralized and Distributed Mirror Descent: Convergence Analysis Using Quadratic Constraints
Type:
Article
Authors:
Sun, Youbang; Fazlyab, Mahyar; Shahrampour, Shahin
Affiliations:
Northeastern University; Johns Hopkins University
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN:
0018-9286
DOI:
10.1109/TAC.2022.3230767
Publication Date:
2023
Pages:
3139-3146
Keywords:
convergence
optimization
mirrors
Lyapunov methods
Euclidean distance
symmetric matrices
convex optimization
decentralized optimization
mirror descent
optimization algorithms
quadratic constraints
semidefinite programming
Abstract:
Mirror descent (MD) is a powerful first-order optimization technique that subsumes several optimization algorithms, including gradient descent (GD). In this work, we leverage quadratic constraints and Lyapunov functions to analyze the stability and characterize the convergence rate of the MD algorithm, as well as its distributed variant, using semidefinite programming (SDP). For both algorithms, we consider both the strongly convex and the non-strongly convex settings. For centralized MD and strongly convex problems, we construct an SDP that certifies exponential convergence rates and derive a closed-form feasible solution to the SDP that recovers the optimal rate of GD as a special case. We complement our analysis by providing an explicit $O(1/k)$ convergence rate for convex problems. Next, we analyze the convergence of distributed MD and characterize the rate numerically using an SDP whose dimensions are independent of the network size. To the best of our knowledge, the numerical rate of distributed MD has not been previously reported in the literature. We further prove an $O(1/k)$ convergence rate for distributed MD in the convex setting. Our numerical experiments on strongly convex problems indicate that our framework certifies superior convergence rates compared to the existing rates for distributed GD.
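To make the "MD subsumes GD" claim in the abstract concrete, the following is a minimal sketch of the centralized mirror descent iteration, not the paper's SDP-based analysis. The function names, step size, and the choice of mirror maps (Euclidean, which reduces MD to plain GD, and negative entropy over the probability simplex, which yields the exponentiated-gradient update) are illustrative assumptions, not taken from the article.

```python
import numpy as np

def mirror_descent(grad, x0, step, num_iters, mirror="euclidean"):
    """Illustrative mirror descent loop (hypothetical helper, not from the paper).

    mirror="euclidean": psi(x) = 0.5*||x||^2, so the MD step is exactly a GD step.
    mirror="entropy":   psi(x) = sum_i x_i*log(x_i) on the simplex, giving the
                        exponentiated-gradient (multiplicative) update.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        g = grad(x)
        if mirror == "euclidean":
            x = x - step * g            # plain gradient descent step
        elif mirror == "entropy":
            x = x * np.exp(-step * g)   # multiplicative / exponentiated-gradient step
            x = x / x.sum()             # renormalize onto the probability simplex
        else:
            raise ValueError(f"unknown mirror map: {mirror}")
    return x

# Toy usage: minimize f(x) = ||x - c||^2 over the simplex; the minimizer is c
# since c itself lies on the simplex.
c = np.array([0.2, 0.5, 0.3])
grad_f = lambda x: 2.0 * (x - c)
x_star = mirror_descent(grad_f, np.ones(3) / 3, step=0.5,
                        num_iters=200, mirror="entropy")
```

With the `"entropy"` map the iterates stay on the simplex by construction, which is the usual motivation for MD over projected GD in non-Euclidean geometries.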