TIGHT CONDITIONS FOR CONSISTENCY OF VARIABLE SELECTION IN THE CONTEXT OF HIGH DIMENSIONALITY
Publication type:
Article
Authors:
Comminges, Laetitia; Dalalyan, Arnak S.
Affiliations:
Universite Gustave-Eiffel; Institut Polytechnique de Paris; Ecole Nationale des Ponts et Chaussees; Institut Polytechnique de Paris; ENSAE Paris
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/12-AOS1046
Publication date:
2012
Pages:
2667-2696
Keywords:
nonconcave penalized likelihood
nonparametric regression
asymptotic equivalence
adaptive estimation
sparse
multivariate
reduction
recovery
Abstract:
We address the problem of variable selection in the regression model with very high ambient dimension, that is, when the number of variables d is very large. The main focus is on the situation where the number of relevant variables, called the intrinsic dimension, is much smaller than the ambient dimension d. Without assuming any parametric form of the underlying regression function, we obtain tight conditions under which the set of relevant variables can be consistently estimated. These conditions relate the intrinsic dimension to the ambient dimension and to the sample size. The procedure that is provably consistent under these tight conditions is based on comparing quadratic functionals of the empirical Fourier coefficients with appropriately chosen threshold values. The asymptotic analysis reveals the presence of two quite different regimes. In the first regime the intrinsic dimension is fixed; the situation in nonparametric regression is then the same as in linear regression, that is, consistent variable selection is possible if and only if log d is small compared to the sample size n. The picture is different in the second regime, where the number of relevant variables, denoted by s, tends to infinity as n -> infinity. In that case we prove that consistent variable selection in the nonparametric setup is possible only if s + log log d is small compared to log n. We apply these results to derive minimax separation rates for the problem of variable selection.
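The two consistency regimes stated in the abstract can be summarized in display form; this is only a sketch restating the abstract's conditions in little-o notation ("small compared to"), not the paper's formal theorem statements:

```latex
% Regime 1: intrinsic dimension s fixed.
% Consistent variable selection is possible if and only if
\[
\log d \;=\; o(n), \qquad n \to \infty .
\]

% Regime 2: number of relevant variables s = s_n tends to infinity with n.
% Consistent variable selection is possible only if
\[
s \;+\; \log\log d \;=\; o(\log n), \qquad n \to \infty .
\]
```

In particular, in the second regime the sample size must grow exponentially in s, in contrast to the fixed-s regime, where d may grow nearly exponentially in n.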