MINIMAX OPTIMAL RATES OF ESTIMATION IN HIGH DIMENSIONAL ADDITIVE MODELS
Item type:
Article
Authors:
Yuan, Ming; Zhou, Ding-Xuan
Affiliations:
University of Wisconsin System; University of Wisconsin Madison; City University of Hong Kong
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/15-AOS1422
Publication date:
2016
Pages:
2564-2593
Keywords:
component selection
variable selection
Dantzig selector
regression
Lasso
convergence
sparsity
Abstract:
We establish minimax optimal rates of convergence for estimation in a high dimensional additive model assuming that it is approximately sparse. Our results reveal a behavior universal to this class of high dimensional problems. In the sparse regime when the components are sufficiently smooth or the dimensionality is sufficiently large, the optimal rates are identical to those for high dimensional linear regression and, therefore, there is no additional cost to entertain a nonparametric model. Otherwise, in the so-called smooth regime, the rates coincide with the optimal rates for estimating a univariate function and, therefore, they are immune to the curse of dimensionality.
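Note (illustrative sketch, not quoted from the article): the two regimes described in the abstract can be oriented against the familiar two-term rate from the exactly sparse additive model literature. The display below uses assumed notation, with n the sample size, d the number of additive components, s the number of active components, and alpha the smoothness of each component function; the paper itself works under approximate sparsity and in greater generality, so its theorems take a different, more refined form.

% Sketch of the standard two-term minimax rate for exactly sparse additive models.
% All symbols (n, d, s, alpha, the class F(s,alpha)) are assumptions of this sketch,
% not notation taken from the article.
\[
  \inf_{\hat{f}} \; \sup_{f \in \mathcal{F}(s,\alpha)}
    \mathbb{E}\, \lVert \hat{f} - f \rVert_{2}^{2}
  \;\asymp\;
  \frac{s \log(d/s)}{n} \;+\; s\, n^{-2\alpha/(2\alpha+1)} .
\]
% The first term is the minimax rate for s-sparse high dimensional linear regression
% (the "sparse regime" of the abstract); the second is s copies of the univariate
% nonparametric rate (the "smooth regime"). Whichever term dominates sets the overall
% rate, matching the dichotomy the abstract describes.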