Optimal gradient tracking for decentralized optimization
Publication type:
Article
Authors:
Song, Zhuoqing; Shi, Lei; Pu, Shi; Yan, Ming
Affiliations:
Fudan University; Fudan University; The Chinese University of Hong Kong, Shenzhen
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-023-01997-7
Publication date:
2024
Pages:
1-53
Keywords:
Distributed optimization
Convergence
Algorithms
EXTRA
Abstract:
In this paper, we focus on solving the decentralized optimization problem of minimizing the sum of n objective functions over a multi-agent network. The agents are embedded in an undirected graph where they can only send/receive information directly to/from their immediate neighbors. Assuming smooth and strongly convex objective functions, we propose an Optimal Gradient Tracking (OGT) method that simultaneously achieves the optimal gradient computation complexity O(√κ log(1/ε)) and the optimal communication complexity O(√(κ/θ) log(1/ε)), where κ and 1/θ denote the condition numbers related to the objective functions and the communication graph, respectively. To the best of our knowledge, OGT is the first single-loop decentralized gradient-type method that is optimal in both gradient computation and communication complexities. The development of OGT involves two building blocks that are also of independent interest. The first is a new decentralized gradient tracking method termed Snapshot Gradient Tracking (SS-GT), which achieves gradient computation and communication complexities of O(√κ log(1/ε)) and O((√κ/θ) log(1/ε)), respectively. SS-GT can potentially be extended to more general settings than OGT. The second is a technique termed Loopless Chebyshev Acceleration (LCA), which can be implemented looplessly yet achieves an effect similar to adding multiple inner loops of Chebyshev acceleration to the algorithm. Beyond SS-GT, the LCA technique can accelerate many other gradient-tracking-based methods with respect to the graph condition number 1/θ.
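To illustrate the gradient tracking framework the paper builds on, here is a minimal sketch of a *generic* decentralized gradient tracking iteration (in the spirit of DIGing/EXTRA-type methods), not the paper's OGT or SS-GT algorithms. The quadratic local losses, the ring-graph mixing matrix, the step size, and all variable names are illustrative assumptions, not taken from the paper.

```python
# Generic decentralized gradient tracking sketch (NOT the paper's OGT/SS-GT).
# Each agent i holds f_i(x) = (x - b_i)^2 / 2, so the minimizer of the
# average objective (1/n) * sum_i f_i is the mean of the b_i.
import numpy as np

n = 5
b = np.arange(n, dtype=float)            # local data; global optimum = b.mean()

# Doubly stochastic mixing matrix W for a ring graph:
# self-weight 1/2, each of the two neighbors 1/4.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

grad = lambda x: x - b                   # stacked local gradients f_i'(x_i)

x = np.zeros(n)                          # one scalar iterate per agent
y = grad(x)                              # gradient trackers, y^0 = grad(x^0)
alpha = 0.1                              # step size (assumed, not tuned)

for _ in range(500):
    x_next = W @ x - alpha * y           # mix with neighbors, step along tracker
    y = W @ y + grad(x_next) - grad(x)   # y tracks the network-average gradient
    x = x_next

# All agents should (approximately) agree on the minimizer b.mean().
print(np.allclose(x, b.mean(), atol=1e-6))
```

The key invariant is that the mean of the trackers y always equals the mean of the local gradients (since W is doubly stochastic), so the average iterate evolves like centralized gradient descent. The Chebyshev-acceleration idea the abstract mentions would, roughly, replace single multiplications by W with an (accelerated) polynomial in W to improve the dependence on the graph condition number 1/θ.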