An Improved Distributed Nesterov Gradient Tracking Algorithm for Smooth Convex Optimization Over Directed Networks

Document type:
Article
Authors:
Lin, Yifu; Li, Wenling; Zhang, Bin; Du, Junping
Affiliations:
Beihang University; Beijing University of Posts & Telecommunications; Beijing University of Posts & Telecommunications
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN:
0018-9286
DOI:
10.1109/TAC.2024.3492329
Publication date:
2025
Pages:
2738-2745
Keywords:
Vectors; Convergence; Convex functions; Topology; Telecommunications; Distributed algorithms; Cost function; Catalysts; Symbols; Standards; Directed networks; distributed optimization; gradient tracking; Nesterov; smooth function
Abstract:
This article studies distributed optimization of smooth, non-strongly convex functions over directed networks. To address this problem, an improved distributed Nesterov gradient tracking (IDNGT) algorithm is proposed, which uses the adapt-then-combine rule and row-stochastic weights. The main novelty of the algorithm is the introduction of a scale factor into the gradient tracking scheme to suppress the consensus error. Using the estimate sequence approach, the dynamics of the error caused by the imbalance of directed networks are analyzed, and it is shown that a sublinear convergence rate can be achieved with a vanishing step size. Numerical results suggest that the performance of IDNGT is comparable to that of the centralized Nesterov gradient descent algorithm.
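For readers who want to experiment with the general idea described in the abstract, the following is a minimal Python sketch of a distributed gradient-tracking loop with a Nesterov-style momentum term and a vanishing step size over a directed ring. It is not the authors' IDNGT recursion: the local quadratic costs, the step-size and momentum schedules, and the ring weights are illustrative assumptions, and the scale factor that IDNGT inserts into the tracker to suppress the consensus error is not reproduced here.

```python
import numpy as np

# Hedged sketch, not the IDNGT algorithm itself: a generic distributed
# gradient-tracking loop with Nesterov-style momentum on a directed ring.

rng = np.random.default_rng(0)
n, d = 5, 3                                        # agents, decision dimension

# Local smooth convex costs f_i(x) = 0.5 * ||A_i x - b_i||^2 (assumed for illustration)
A = [rng.standard_normal((4, d)) for _ in range(n)]
b = [rng.standard_normal(4) for _ in range(n)]
grad = lambda i, x: A[i].T @ (A[i] @ x - b[i])

# Row-stochastic weights on a directed ring (agent i listens to agent i-1).
# This particular choice happens to also be column-stochastic, so the plain
# tracker stays consistent without the imbalance correction analyzed in the paper.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.5

x = np.zeros((n, d))                               # primal iterates, one row per agent
g = np.array([grad(i, x[i]) for i in range(n)])    # last evaluated local gradients
y = g.copy()                                       # gradient trackers
x_prev = x.copy()

for k in range(1, 2001):
    step = 1.0 / (k + 20)                          # vanishing step size (assumption)
    beta = (k - 1) / (k + 2)                       # Nesterov-style momentum weight
    v = x + beta * (x - x_prev)                    # momentum / extrapolation step
    x_prev = x.copy()
    x = W @ (v - step * y)                         # adapt-then-combine primal update
    g_new = np.array([grad(i, x[i]) for i in range(n)])
    y = W @ y + g_new - g                          # track the average gradient
    g = g_new

# The agents' iterates should approach the centralized least-squares minimizer.
x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
print("max distance to centralized solution:", np.max(np.linalg.norm(x - x_star, axis=1)))
```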