A Stochastic Gradient Tracking Algorithm for Decentralized Optimization With Inexact Communication
Type:
Article
Authors:
Shah, Suhail M.; Bollapragada, Raghu
Affiliations:
University of Texas System; University of Texas Austin
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN:
0018-9286
DOI:
10.1109/TAC.2025.3548470
Publication Date:
2025
Pages:
5864-5879
Keywords:
Noise
Optimization
Vectors
Convergence
Machine learning algorithms
Noise measurement
Machine learning
Training
Quantization (signal)
Linear programming
Distributed optimization
Network optimization
Optimization algorithms
Abstract:
Decentralized optimization is typically studied under the assumption of noise-free transmission. However, real-world scenarios often involve noise arising from factors such as additive white Gaussian noise channels or probabilistic quantization of transmitted data. These noise sources can degrade the performance of decentralized optimization algorithms if not effectively addressed. In this article, we focus on the noisy communication setting and propose an algorithm that bridges the performance gap caused by communication noise while also mitigating other challenges such as data heterogeneity. We establish theoretical results for the proposed algorithm that quantify the effect of communication noise and gradient noise on its performance. Notably, our algorithm achieves the optimal convergence rate for minimizing strongly convex, smooth functions under inexact communication and stochastic gradients. Finally, we illustrate the superior performance of the proposed algorithm over state-of-the-art counterparts on machine learning problems using the MNIST and CIFAR-10 datasets.
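To make the setting concrete, the following is a minimal NumPy sketch of plain stochastic gradient tracking over a ring network in which every transmitted vector is corrupted by additive Gaussian noise, i.e., the inexact-communication regime the abstract describes. This is an illustrative baseline only, not the paper's algorithm: the problem instance (scalar quadratics per agent), the Metropolis mixing weights, and the noise levels `sigma_c` and `sigma_g` are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5                       # number of agents on a ring
b = rng.normal(size=n)      # local data; agent i minimizes f_i(x) = 0.5*(x - b_i)^2
x_star = b.mean()           # minimizer of the global average objective

# Doubly stochastic mixing matrix for the ring (Metropolis-style weights)
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = 0.25
    W[i, (i - 1) % n] = 0.25

alpha = 0.05                # step size (assumed constant for this sketch)
sigma_c = 0.01              # std of additive communication noise (hypothetical channel)
sigma_g = 0.01              # std of stochastic-gradient noise

def stoch_grad(x):
    # Noisy gradient of each local quadratic: (x_i - b_i) plus Gaussian noise
    return (x - b) + sigma_g * rng.normal(size=n)

x = np.zeros(n)             # local iterates
g_old = stoch_grad(x)
y = g_old.copy()            # gradient trackers, initialized at local gradients

for _ in range(500):
    # Every transmission (of x and of y) is received with additive noise,
    # modeling an inexact channel; W mixes the corrupted copies.
    x = W @ (x + sigma_c * rng.normal(size=n)) - alpha * y
    g_new = stoch_grad(x)
    y = W @ (y + sigma_c * rng.normal(size=n)) + g_new - g_old
    g_old = g_new

# With small noise the iterates settle near the global minimizer, but the
# noise injected into the trackers y accumulates over time; the article's
# algorithm is designed precisely to control this degradation.
print(abs(x.mean() - x_star), np.std(x))
```

Running this with larger `sigma_c` makes the tracker drift visible: the consensus error stops shrinking, which is the performance gap the proposed method closes.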