Asynchronous Gradient Push
Publication type:
Article
Authors:
Assran, Mahmoud S.; Rabbat, Michael G.
Affiliations:
Facebook Inc; McGill University
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN:
0018-9286
DOI:
10.1109/TAC.2020.2981035
Publication date:
2021
Pages:
168-183
Keywords:
convergence
delays
optimization
computational modeling
protocols
stochastic processes
directed graphs
asynchronous iterative methods
convex optimization
directed graph
distributed optimization
摘要:
We consider a multiagent framework for distributed optimization where each agent has access to a local smooth strongly convex function, and the collective goal is to achieve consensus on the parameters that minimize the sum of the agents' local functions. We propose an algorithm wherein each agent operates asynchronously and independently of the other agents. When the local functions are strongly convex with Lipschitz-continuous gradients, we show that the iterates at each agent converge to a neighborhood of the global minimum, where the neighborhood size depends on the degree of asynchrony in the multiagent network. When the agents work at the same rate, convergence to the global minimizer is achieved. Numerical experiments demonstrate that asynchronous gradient push can minimize the global objective faster than the state-of-the-art synchronous first-order methods, is more robust to failing or stalling agents, and scales better with the network size.
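To make the gradient-push idea in the abstract concrete, the following is a minimal Python (NumPy) sketch of push-sum consensus combined with local gradient steps. It is an illustration only, not the authors' asynchronous protocol: it assumes scalar quadratic local functions f_i(x) = 0.5*(x - a_i)^2 (so the global minimizer is the mean of the a_i), a directed ring with self-loops, a diminishing step size, and synchronous rounds without message delays.

# Hedged sketch of gradient push (push-sum + gradient steps), not the paper's
# exact asynchronous algorithm. Assumptions: quadratic local objectives, a
# directed ring graph, synchronous rounds, diminishing step size.
import numpy as np

rng = np.random.default_rng(0)
n = 5                                # number of agents
a = rng.normal(size=n)               # local targets; global optimum is a.mean()

# Column-stochastic mixing matrix P: P[j, i] is the fraction of agent i's
# mass pushed to agent j (half kept locally, half sent to the next agent).
P = np.zeros((n, n))
for i in range(n):
    P[i, i] = 0.5
    P[(i + 1) % n, i] = 0.5

x = np.zeros(n)                      # push-sum numerators (parameter estimates)
w = np.ones(n)                       # push-sum weights (denominators)

for k in range(200):
    z = x / w                        # de-biased estimates at each agent
    grad = z - a                     # gradient of each local quadratic at z_i
    alpha = 1.0 / (k + 10)           # diminishing step size
    x = P @ (x - alpha * grad)       # local gradient step, then push numerators
    w = P @ w                        # push weights

print("consensus estimates:", x / w)
print("global minimizer   :", a.mean())

Running this drives every agent's ratio x_i / w_i toward the average of the a_i, i.e., toward the minimizer of the sum of the local functions; the paper's contribution is showing how such iterates behave when agents update asynchronously and at different rates over a directed network.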