Tuning-Free Heterogeneous Inference in Massive Networks

Publication type:
Article
Authors:
Ren, Zhao; Kang, Yongjian; Fan, Yingying; Lv, Jinchi
Affiliations:
Pennsylvania Commonwealth System of Higher Education (PCSHE); University of Pittsburgh; University of Southern California
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.1080/01621459.2018.1537920
Publication date:
2019
Pages:
1908-1925
Keywords:
inverse covariance estimation; precision matrix estimation; false discovery rate; Lasso selection; model; regression; sparsity; benefit
Abstract:
Heterogeneity is often natural in many contemporary applications involving massive data. While posing new challenges to effective learning, it can play a crucial role in powering meaningful scientific discoveries through the integration of information among subpopulations of interest. In this article, we exploit multiple networks with Gaussian graphs to encode the connectivity patterns of a large number of features on the subpopulations. To uncover the underlying sparsity structures across subpopulations, we suggest a framework of large-scale tuning-free heterogeneous inference, where the number of networks is allowed to diverge. In particular, two new tests, the chi-based and the linear functional-based tests, are introduced and their asymptotic null distributions are established. Under mild regularity conditions, we establish that both tests are optimal in achieving the testable region boundary and that the sample size requirement for the latter test is minimal. Both the theoretical guarantees and the tuning-free property stem from efficient multiple-network estimation by our newly suggested heterogeneous group square-root Lasso for high-dimensional multi-response regression with heterogeneous noises. To solve this convex program, we further introduce a scalable algorithm that enjoys provable convergence to the global optimum. Both computational and theoretical advantages are elucidated through simulation and real data examples. Supplementary materials for this article are available online.
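As a hedged sketch only (the paper's exact objective and weighting are not reproduced here), a generic group square-root Lasso criterion for multi-response regression with design X \in \mathbb{R}^{n \times p}, responses y_1, \dots, y_K, and coefficient matrix B = (\beta_1, \dots, \beta_K) can be written as

    \min_{B} \; \sum_{k=1}^{K} \frac{\| y_k - X \beta_k \|_2}{\sqrt{n}} \; + \; \lambda \sum_{j=1}^{p} \| B_{j\cdot} \|_2 ,

where the square-root (pivotal) loss makes the choice of \lambda free of the unknown noise levels and the row-wise group penalty couples the K regressions through a shared sparsity pattern. The heterogeneous group square-root Lasso proposed in the paper builds on this idea to handle heterogeneous noises across networks; its precise formulation may differ from the generic sketch above.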