Quasi-Newton updating for large-scale distributed learning

Publication type:
Article
Authors:
Wu, Shuyuan; Huang, Danyang; Wang, Hansheng
Affiliations:
Shanghai University of Finance & Economics; Renmin University of China; Peking University
Journal:
JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY
ISSN/ISBN:
1369-7412
DOI:
10.1093/jrsssb/qkad059
Publication date:
2023
Pages:
1326-1354
Keywords:
convergence
Abstract:
Distributed computing is critically important for modern statistical analysis. Herein, we develop a distributed quasi-Newton (DQN) framework with excellent statistical, computational, and communication efficiency. The DQN method requires neither Hessian matrix inversion nor Hessian communication, which considerably reduces its computation and communication complexity. Notably, related existing methods analyse only numerical convergence and require a diverging number of iterations to converge. In contrast, we investigate the statistical properties of the DQN method and theoretically demonstrate that, under mild conditions, the resulting estimator is statistically efficient after a small number of iterations. Extensive numerical analyses demonstrate the finite-sample performance of the proposed method.
Source URL: