Asymptotic Properties of S-AB Method With Diminishing Step-Size
Publication type:
Article
Authors:
Zhao, Shengchao; Liu, Yongchao
Affiliation:
Dalian University of Technology
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN:
0018-9286
DOI:
10.1109/TAC.2023.3319155
Publication year:
2024
Pages:
3222-3229
Keywords:
convergence
optimization
covariance matrices
stochastic processes
linear programming
Gaussian distribution
approximation algorithms
S-AB
asymptotic normality
convergence rate
distributed stochastic optimization (DSO)
Abstract:
The popular AB/push-pull method for distributed optimization problems unifies many of the existing decentralized first-order methods based on the gradient tracking technique. More recently, a stochastic gradient variant of the AB/push-pull method (S-AB) has been proposed, which achieves linear convergence to a neighborhood of the global minimizer when the step-size is constant. This article is devoted to the asymptotic properties of S-AB with diminishing step-size. Specifically, under the condition that each local objective is smooth and the global objective is strongly convex, we first establish the boundedness of the iterates of S-AB and then show that the iterates converge to the global minimizer at the rate O(1/k) in the mean square sense. Furthermore, the asymptotic normality of Polyak-Ruppert averaged S-AB is obtained, and applications to statistical inference are discussed. Finally, numerical tests are conducted to demonstrate the theoretical results.
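To make the recursion concrete, the following is a minimal sketch of a stochastic AB/push-pull (gradient-tracking) update with a diminishing step-size on a toy one-dimensional quadratic problem. The network, the step-size schedule a/(k+b), the noise model, and the choice A = B = W (a doubly stochastic mixing matrix) are illustrative assumptions, not the paper's actual experiments.

```python
# Sketch of S-AB with diminishing step-size on a toy quadratic problem.
# Assumptions: 3 agents, f_i(x) = 0.5*c_i*(x - t_i)^2, noisy gradients,
# and A = B = W doubly stochastic; none of this is taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

n = 3                                  # number of agents
c = np.array([1.0, 2.0, 3.0])          # local curvatures
t = np.array([1.0, 2.0, 3.0])          # local minimizers
x_star = np.sum(c * t) / np.sum(c)     # global minimizer of sum_i f_i

# Doubly stochastic mixing matrix (both row- and column-stochastic).
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

def stoch_grad(x):
    """Noisy gradient of f_i(x) = 0.5*c_i*(x - t_i)^2 at each agent."""
    return c * (x - t) + 0.1 * rng.standard_normal(n)

x = np.zeros(n)
g_prev = stoch_grad(x)
y = g_prev.copy()                      # gradient tracker, y_0 = g_0

for k in range(5000):
    alpha = 1.0 / (k + 10)             # diminishing step-size a/(k+b)
    x = W @ (x - alpha * y)            # "pull" step on the decision variable
    g_new = stoch_grad(x)
    y = W @ y + g_new - g_prev         # "push" step: track the average gradient
    g_prev = g_new

print(x, x_star)                       # each agent ends up near x_star = 14/6
```

With the harmonic-type schedule above, the step-sizes are square-summable-like but not summable, which is the regime in which the O(1/k) mean-square rate in the abstract applies; with a constant step-size the iterates would instead settle in a neighborhood of x_star.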