A Second-Order Proximal Algorithm for Consensus Optimization
Publication Type:
Article
Authors:
Wu, Xuyang; Qu, Zhihai; Lu, Jie
Affiliation:
ShanghaiTech University
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN/ISBN:
0018-9286
DOI:
10.1109/TAC.2020.2996205
Publication Date:
2021
Pages:
1864-1871
Keywords:
convergence
cost function
Lagrangian functions
Couplings
Machine learning algorithms
Machine Learning
Consensus optimization
distributed optimization
proximal algorithm
second-order method
Abstract:
We develop a distributed second-order proximal algorithm, referred to as SoPro, to address in-network consensus optimization. The proposed SoPro algorithm converges linearly to the exact optimal solution, provided that the global cost function is locally restricted strongly convex. This relaxes the standard global strong convexity condition required by existing distributed optimization algorithms to establish linear convergence. In addition, we demonstrate that SoPro is computation- and communication-efficient in comparison with state-of-the-art distributed second-order methods. Finally, extensive simulations illustrate the competitive convergence performance of SoPro.
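For context, "in-network consensus optimization" commonly refers to the following problem over a network of $n$ agents; this is the standard formulation of the problem class, sketched here for illustration rather than quoted from the paper:

```latex
\min_{x_1,\dots,x_n \in \mathbb{R}^d} \;\; \sum_{i=1}^{n} f_i(x_i)
\quad \text{subject to} \quad
x_i = x_j \;\; \forall (i,j) \in \mathcal{E},
```

where $f_i$ is the local cost known only to agent $i$ and $\mathcal{E}$ is the edge set of the communication graph. An algorithm with exact linear convergence, such as the SoPro method described above, drives every local variable $x_i$ to the common minimizer of the global cost $\sum_{i=1}^{n} f_i$ at a geometric rate.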