MINIMAX-OPTIMAL NONPARAMETRIC REGRESSION IN HIGH DIMENSIONS
Publication type:
Article
Authors:
Yang, Yun; Tokdar, Surya T.
Affiliations:
University of California System; University of California Berkeley; Duke University
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/14-AOS1289
Publication year:
2015
Pages:
652-674
Keywords:
posterior distributions
sparsity recovery
Gaussian process
rates
convergence
selection
Lasso
consistency
Abstract:
Minimax L-2 risks for high-dimensional nonparametric regression are derived under two sparsity assumptions: (1) the true regression surface is a sparse function that depends only on d = O(log n) important predictors among a list of p predictors, with log p = o(n); (2) the true regression surface depends on O(n) predictors but is an additive function where each additive component is sparse but may contain two or more interacting predictors and may have a smoothness level different from other components. For either modeling assumption, a practicable extension of the widely used Bayesian Gaussian process regression method is shown to adaptively attain the optimal minimax rate (up to log n terms) asymptotically as both n, p --> infinity with log p = o(n).
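Sparsity assumption (1) says the regression surface depends on only d of the p predictors, and the Bayesian Gaussian process method fits a GP restricted to those coordinates. The following is a minimal illustrative sketch (not the paper's actual estimator) of GP regression on an assumed active subset of predictors, using a squared-exponential kernel and a hypothetical `active` index set; all names and parameter values are invented for illustration.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential (RBF) kernel between rows of A and rows of B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * length_scale ** 2))

def gp_posterior_mean(X_train, y_train, X_test, active,
                      length_scale=1.0, noise_sd=0.1):
    """GP regression posterior mean, restricted to the 'active' predictors.

    Restricting the kernel to a d-dimensional subset of the p coordinates
    mirrors the sparse-function assumption: only those predictors matter.
    """
    A = X_train[:, active]
    B = X_test[:, active]
    K = rbf_kernel(A, A, length_scale) + noise_sd ** 2 * np.eye(A.shape[0])
    K_star = rbf_kernel(B, A, length_scale)
    return K_star @ np.linalg.solve(K, y_train)

# Toy data: n = 100 observations, p = 50 predictors, but the true surface
# depends only on predictors 0 and 1 (so d = 2 << p).
rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
f_true = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2
y = f_true + 0.1 * rng.standard_normal(n)

pred = gp_posterior_mean(X, y, X, active=[0, 1])
```

In the paper the active set and kernel bandwidths are not fixed in advance but given priors, so the posterior adapts to the unknown d and smoothness; the sketch above hard-codes them only to show the mechanics of restricting a GP to a sparse coordinate subset.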
Source URL: