A Variance-Reduced Aggregation Based Gradient Tracking Method for Distributed Optimization Over Directed Networks

Publication Type:
Article
Authors:
Zhao, Shengchao; Song, Siyuan; Liu, Yongchao
Affiliations:
China University of Mining & Technology; The Chinese University of Hong Kong, Shenzhen; Dalian University of Technology
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN:
0018-9286
DOI:
10.1109/TAC.2025.3527737
Publication Date:
2025
Pages:
4109-4115
Keywords:
Noise; Convergence; Linear programming; Optimization; Noise measurement; Level set; Vectors; Quantization (signal); Machine learning algorithms; Directed graphs; Directed networks; gradient tracking; noisy information-sharing; variance-reduced aggregation
Abstract:
This article studies the distributed optimization problem over directed networks with noisy information sharing. To address imperfect communication over directed networks, a series of noise-robust variants of the Push-Pull/AB method have been developed. These methods improve the robustness of the Push-Pull method against information-sharing noise by adding small factors to the weight matrices and by replacing global gradient tracking with cumulative gradient tracking. Building on these two techniques, we propose a new variant of the Push-Pull method featuring a novel mechanism of interagent information aggregation, named variance-reduced aggregation (VRA). VRA allows us to relax some of the conditions on the objective function and the networks. When the objective function is convex and the information-sharing noise has unbounded variance, the proposed method is shown to converge to the optimal solution almost surely. When the objective function is strongly convex and the information-sharing noise has bounded variance, the proposed method achieves a convergence rate of O(k^(-1+epsilon)) in the mean-square sense, where epsilon can be arbitrarily close to 0. Simulated experiments on ridge regression and logistic regression problems verify the effectiveness of the proposed method.
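For context, the baseline Push-Pull/AB gradient tracking iteration that the proposed VRA variant builds on can be sketched as below. This is an illustrative toy with scalar quadratic local objectives under noiseless communication; the ring graph, uniform weights, and step size are assumptions for the sketch, not the paper's setup, and the VRA mechanism itself is not implemented here.

```python
import numpy as np

# Sketch of the basic Push-Pull/AB gradient tracking iteration:
# each agent i on a directed network keeps a solution estimate x_i
# and a gradient tracker y_i, updated as
#   x_{k+1} = A (x_k - alpha * y_k)                 # A row-stochastic ("pull")
#   y_{k+1} = B y_k + grad(x_{k+1}) - grad(x_k)     # B column-stochastic ("push")

n = 5
b = np.arange(1.0, n + 1.0)            # local data: f_i(x) = 0.5 * (x - b_i)^2
grad = lambda x: x - b                 # stacked local gradients
x_star = b.mean()                      # minimizer of (1/n) * sum_i f_i

# Directed ring with self-loops (strongly connected); normalize rows
# for A and columns for B.
W = np.eye(n) + np.roll(np.eye(n), 1, axis=1)
A = W / W.sum(axis=1, keepdims=True)   # row-stochastic
B = W / W.sum(axis=0, keepdims=True)   # column-stochastic

alpha = 0.1
x = np.zeros(n)
y = grad(x)                            # trackers start at local gradients
for _ in range(500):
    x_new = A @ (x - alpha * y)
    y = B @ y + grad(x_new) - grad(x)
    x = x_new

print(np.allclose(x, x_star, atol=1e-6))  # all agents agree on the optimum
```

Because B is column-stochastic, the trackers satisfy sum_i y_i^k = sum_i grad f_i(x_i^k) at every iteration, which is the gradient tracking property; the noise-robust variants summarized in the abstract modify this recursion to tolerate noisy exchanged values.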
Source URL: