STATISTICAL INFERENCE FOR DECENTRALIZED FEDERATED LEARNING

Publication Type:
Article
Authors:
Gu, Jia; Chen, Song Xi
Affiliations:
Zhejiang University; Tsinghua University
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/24-AOS2452
Publication Year:
2024
Pages:
2931-2955
Keywords:
stochastic approximation; gradient descent; optimization; convergence
Abstract:
This paper considers decentralized Federated Learning (FL) for M-estimation under heterogeneous distributions among distributed clients or data blocks. The mean squared error and the consensus error across the estimators produced by different clients via the decentralized stochastic gradient descent algorithm are derived. The asymptotic normality of the Polyak-Ruppert (PR) averaged estimator is attained in the decentralized distributed setting, which shows that its statistical efficiency comes at a cost: the permissible number of clients is more restricted than in distributed M-estimation. To overcome this restriction, a one-step estimator is proposed that permits a much larger number of clients while still achieving the same efficiency as the original PR-averaged estimator in the nondistributed setting. Confidence regions based on both the PR-averaged estimator and the proposed one-step estimator are constructed to facilitate statistical inference for decentralized FL.
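To make the setting concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm or tuning) of decentralized SGD with Polyak-Ruppert averaging on a toy M-estimation problem: each client estimates a common mean from its own data, with per-client noise variances made unequal to mimic heterogeneous distributions, and iterates are mixed over a hypothetical ring-topology gossip matrix.

```python
import numpy as np

# Illustrative sketch only: decentralized SGD + Polyak-Ruppert (PR) averaging
# for estimating a common mean theta_true across K heterogeneous clients.
# The ring mixing matrix W and the step-size schedule are assumed choices,
# not taken from the paper.

rng = np.random.default_rng(0)
K = 4              # number of clients
T = 5000           # number of iterations
theta_true = 2.0

# Doubly stochastic mixing matrix for a ring topology.
W = np.zeros((K, K))
for i in range(K):
    W[i, i] = 0.5
    W[i, (i - 1) % K] = 0.25
    W[i, (i + 1) % K] = 0.25

theta = np.zeros(K)      # local iterates, one per client
pr_sum = np.zeros(K)     # running sums for PR averaging

for t in range(1, T + 1):
    # Each client draws one sample; noise scale differs across clients
    # to mimic heterogeneous local distributions.
    x = theta_true + rng.normal(0.0, 1.0 + 0.5 * np.arange(K))
    grad = theta - x                 # gradient of the local loss 0.5*(theta - x)^2
    eta = 1.0 / t**0.6               # polynomially decaying step size (assumed)
    theta = W @ theta - eta * grad   # gossip mixing step + local SGD step
    pr_sum += theta

theta_pr = pr_sum / T                # PR-averaged estimator at each client
print(theta_pr)                      # each entry should be close to theta_true
```

Despite the heterogeneous noise, the gossip step drives the clients toward consensus, and the PR averages concentrate around the true parameter; the paper's analysis quantifies exactly these mean squared and consensus errors.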