PIVOTAL ESTIMATION VIA SQUARE-ROOT LASSO IN NONPARAMETRIC REGRESSION
Type:
Article
Authors:
Belloni, Alexandre; Chernozhukov, Victor; Wang, Lie
Affiliations:
Duke University; Massachusetts Institute of Technology (MIT); Massachusetts Institute of Technology (MIT)
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/14-AOS1204
Publication date:
2014
Pages:
757-788
Keywords:
model selection
least squares
recovery
sparsity
inequalities
Abstract:
We propose a self-tuning square-root Lasso method that simultaneously resolves three important practical problems in high-dimensional regression analysis: it handles unknown scale, heteroscedasticity, and (drastic) non-Gaussianity of the noise. In addition, our analysis allows for badly behaved designs, for example, perfectly collinear regressors, and generates sharp bounds even in extreme cases, such as the infinite-variance case and the noiseless case, in contrast to Lasso. We establish various nonasymptotic bounds for square-root Lasso, including the prediction-norm rate and sparsity. Our analysis is based on new impact factors that are tailored for bounding the prediction norm. In order to cover heteroscedastic non-Gaussian noise, we rely on moderate deviation theory for self-normalized sums to achieve Gaussian-like results under weak conditions. Moreover, we derive bounds on the performance of ordinary least squares (OLS) applied to the model selected by square-root Lasso, accounting for possible misspecification of the selected model. Under mild conditions, the rate of convergence of OLS post square-root Lasso is as good as square-root Lasso's rate. As an application, we consider the use of square-root Lasso and OLS post square-root Lasso as estimators of nuisance parameters in a generic semiparametric problem (nonlinear moment condition or Z-problem), resulting in a construction of √n-consistent and asymptotically normal estimators of the main parameters.
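To illustrate the self-tuning idea described in the abstract, here is a minimal sketch (not the authors' implementation) of a square-root/scaled-Lasso-style estimator: it alternates an ordinary Lasso coordinate-descent step, whose penalty is scaled by the current noise-level estimate, with an update of that noise level from the residuals. The penalty choice `lam` and the iteration count below are illustrative assumptions, not the tuning prescribed in the paper.

```python
import numpy as np


def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def sqrt_lasso(X, y, lam, n_iter=100):
    """Sketch of a square-root-Lasso-type fit via scaled-Lasso alternation.

    Alternates (i) coordinate descent on the Lasso subproblem
    0.5 * ||y - X b||^2 + lam * sigma * ||b||_1, with the penalty scaled
    by the current noise estimate sigma, and (ii) the scale update
    sigma = ||y - X b|| / sqrt(n).  Because the effective penalty is
    proportional to the estimated noise level, the procedure is
    self-tuning in the sense that it does not require sigma as an input.
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    sigma = np.linalg.norm(y) / np.sqrt(n)  # crude initial scale
    for _ in range(n_iter):
        # one full coordinate-descent sweep on the Lasso subproblem
        for j in range(p):
            partial_resid = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ partial_resid
            beta[j] = soft_threshold(z, lam * sigma) / col_norm2[j]
        # refresh the noise-scale estimate from the current residuals
        sigma = np.linalg.norm(y - X @ beta) / np.sqrt(n)
    return beta, sigma
```

On a sparse synthetic design this recovers the large coefficients up to the usual shrinkage bias while estimating the noise scale jointly, which is the practical point of dispensing with a known variance.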