CEDAS: A Compressed Decentralized Stochastic Gradient Method With Improved Convergence
Publication type:
Article
Authors:
Huang, Kun; Pu, Shi
Affiliation:
The Chinese University of Hong Kong, Shenzhen
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN:
0018-9286
DOI:
10.1109/TAC.2024.3471854
Publication year:
2025
Pages:
2242-2257
Keywords:
linear programming
transient analysis
convergence
gradient methods
compressors
optimization
stochastic processes
radio frequency
lead
signal-to-noise ratio
compressed gradient methods
convex optimization
distributed optimization
nonconvex optimization
stochastic gradient methods
Abstract:
In this article, we consider solving the distributed optimization problem over a multiagent network in a communication-restricted setting. We study a compressed decentralized stochastic gradient method, termed compressed exact diffusion with adaptive stepsizes (CEDAS), and show that under unbiased compression operators the method asymptotically achieves a convergence rate comparable to that of centralized stochastic gradient descent (SGD) for both smooth strongly convex and smooth nonconvex objective functions. In particular, to our knowledge, CEDAS enjoys the shortest transient time so far (with respect to the graph specifics) for achieving the convergence rate of centralized SGD, which behaves as $\mathcal{O}\left(nC^{3}/(1-\lambda_{2})^{2}\right)$ for smooth strongly convex objective functions and $\mathcal{O}\left(n^{3}C^{6}/(1-\lambda_{2})^{4}\right)$ for smooth nonconvex objective functions, where $n$ is the number of agents, $(1-\lambda_{2})$ denotes the spectral gap of the mixing matrix, and $C>0$ is the compression-related parameter. Notably, CEDAS exhibits the shortest transient times when C
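Illustrative note (not part of the original record): the "unbiased compression operators" named in the abstract admit a standard concrete instance, QSGD-style stochastic quantization with E[Q(x)] = x. The Python sketch below illustrates only this compressor class; it is not the CEDAS recursion itself, and the function name and parameters are illustrative assumptions.

import numpy as np

def unbiased_quantize(x, levels=4, rng=None):
    # QSGD-style stochastic quantization (one example of an unbiased
    # compressor, i.e., E[Q(x)] = x): each entry |x_i| * levels / ||x||
    # is rounded up or down at random.
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return np.zeros_like(x)
    scaled = np.abs(x) * levels / norm           # each entry lies in [0, levels]
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part -> unbiasedness.
    q = lower + (rng.random(x.shape) < (scaled - lower))
    return np.sign(x) * q * norm / levels

# Empirical unbiasedness check: averaging many compressions recovers x.
rng = np.random.default_rng(0)
x = rng.standard_normal(5)
avg = np.mean([unbiased_quantize(x, rng=rng) for _ in range(20000)], axis=0)
print(np.round(x, 3))
print(np.round(avg, 3))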
Source URL: