DEEP NEURAL NETWORKS FOR NONPARAMETRIC INTERACTION MODELS WITH DIVERGING DIMENSION
Publication Type:
Article
Authors:
Bhattacharya, Sohom; Fan, Jianqing; Mukherjee, Debarghya
Affiliations:
State University System of Florida; University of Florida; Princeton University; Boston University
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/24-AOS2442
Publication Date:
2024
Pages:
2738-2766
Keywords:
minimax-optimal rates
optimal approximation
additive regression
least-squares
convergence
bounds
selection
smooth
Lasso
Abstract:
Deep neural networks have achieved tremendous success due to their representation power and adaptation to low-dimensional structures. Their potential for estimating structured regression functions has recently been established in the literature. However, most of these studies require the input dimension to be fixed; consequently, they ignore the effect of dimension on the rate of convergence, which limits their applicability to modern big data with high dimensionality. In this paper, we bridge this gap by analyzing a k-way nonparametric interaction model both in the growing-dimension scenario (d grows with n but at a slower rate) and in the high-dimension regime (d ≳ n). In the latter case, sparsity assumptions and associated regularization are required to obtain optimal convergence rates. A new challenge in the diverging-dimension setting arises in the calculation of the mean-squared error: the covariance terms among the estimated additive components are an order of magnitude larger than the variance terms and can deteriorate statistical properties without proper care. We introduce a critical debiasing technique to remedy this problem. We show that, under certain standard assumptions, debiased deep neural networks achieve a minimax optimal rate in terms of both n and d. Our proof techniques rely crucially on a novel debiasing technique that makes the covariances of the additive components negligible in the mean-squared-error calculation. In addition, we establish the matching lower bounds.
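For orientation, the following is a minimal LaTeX sketch of the model class and the error decomposition that the abstract alludes to; the precise component functions f_S, identifiability constraints, and norms are assumptions based on the standard k-way interaction setup, not quotations from the paper.

% Assumed k-way interaction model: the regression function is a sum of
% components, each depending on at most k of the d coordinates.
\[
  f_0(x) \;=\; \sum_{\substack{S \subseteq \{1,\dots,d\} \\ 1 \le |S| \le k}} f_S(x_S),
  \qquad x \in [0,1]^d .
\]
% With a plug-in estimator \hat{f} = \sum_S \hat{f}_S, expanding the square
% splits the risk into variance-type terms and pairwise covariance terms.
% The number of cross terms grows like the square of the number of
% components, which is why they can dominate unless debiasing makes each
% individual covariance negligible.
\[
  \mathbb{E}\,\bigl\|\hat{f} - f_0\bigr\|_2^2
  \;=\; \sum_{S} \mathbb{E}\,\bigl\|\hat{f}_S - f_S\bigr\|_2^2
  \;+\; \sum_{S \neq S'} \mathbb{E}\,\bigl\langle \hat{f}_S - f_S,\; \hat{f}_{S'} - f_{S'} \bigr\rangle .
\]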