First-Order Newton-Type Estimator for Distributed Estimation and Inference

Publication Type:
Article
Authors:
Chen, Xi; Liu, Weidong; Zhang, Yichen
Affiliations:
New York University; Shanghai Jiao Tong University; Purdue University System; Purdue University
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.1080/01621459.2021.1891925
Publication Date:
2022
Pages:
1858-1874
Keywords:
Abstract:
This article studies distributed estimation and inference for a general statistical problem with a convex loss that may be nondifferentiable. For computational efficiency, we restrict ourselves to stochastic first-order optimization, which enjoys low per-iteration complexity. To motivate the proposed method, we first investigate the theoretical properties of a straightforward divide-and-conquer stochastic gradient descent approach. Our theory shows that there is a restriction on the number of machines, and this restriction becomes more stringent when the dimension p is large. To overcome this limitation, we propose a new multi-round distributed estimation procedure that approximates the Newton step using only stochastic subgradients. The key component of our method is a computationally efficient estimator of Sigma^{-1} w, where Sigma is the population Hessian matrix and w is any given vector. Instead of estimating Sigma (or Sigma^{-1}), which usually requires second-order differentiability of the loss, the proposed first-order Newton-type estimator (FONE) directly estimates the vector of interest Sigma^{-1} w as a whole and is applicable to nondifferentiable losses. The estimator also facilitates inference for the empirical risk minimizer: the key term in the limiting covariance has the form Sigma^{-1} w, which can be estimated by FONE.
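To illustrate the idea behind a first-order Newton-type estimator, the sketch below estimates Sigma^{-1} w using only stochastic gradients: the Hessian-vector product Sigma v is approximated by a finite difference of mini-batch gradients, and SGD is run on the linear system Sigma v = w. This is a minimal illustration under assumptions not taken from the paper (a smooth squared loss, a synthetic Gaussian design, and hypothetical step-size and batch choices), not the authors' actual algorithm or theory.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 4000, 5
X = rng.normal(size=(n, p))
X[:, 0] *= 2.0                        # anisotropic design so Sigma != I
theta_star = rng.normal(size=p)
y = X @ theta_star + rng.normal(size=n)

theta = theta_star + 0.1              # a rough current parameter estimate (illustrative)
w = rng.normal(size=p)                # any given vector w

def grad(t, idx):
    """Mini-batch gradient of the squared loss 0.5*(x^T t - y)^2 at parameter t."""
    r = X[idx] @ t - y[idx]
    return X[idx].T @ r / len(idx)

eta, step, iters, batch = 1e-3, 0.05, 3000, 64
v = np.zeros(p)
avg, burn = np.zeros(p), iters // 2
for t in range(iters):
    idx = rng.integers(0, n, size=batch)
    # gradient difference approximates the Hessian-vector product Sigma v,
    # using first-order information only
    hv = (grad(theta + eta * v, idx) - grad(theta, idx)) / eta
    v -= step * (hv - w)              # SGD step toward the solution of Sigma v = w
    if t >= burn:
        avg += v                      # average late iterates to damp SGD noise
avg /= (iters - burn)

Sigma_hat = X.T @ X / n
target = np.linalg.solve(Sigma_hat, w)
print(np.linalg.norm(avg - target))   # distance to the plug-in solve; small if SGD converged
```

Note that no Hessian is ever formed or inverted: only gradient evaluations are used, which is what makes this style of estimator applicable when second derivatives are unavailable.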