ℓ1-regularized linear regression: persistence and oracle inequalities
Publication type:
Article
Authors:
Bartlett, Peter L.; Mendelson, Shahar; Neeman, Joseph
Affiliations:
University of California System; University of California Berkeley; University of California System; University of California Berkeley; Technion Israel Institute of Technology
Journal:
PROBABILITY THEORY AND RELATED FIELDS
ISSN:
0178-8051
DOI:
10.1007/s00440-011-0367-2
Publication year:
2012
Pages:
193-224
Keywords:
selection
Lasso
sparsity
representations
recovery
rates
Abstract:
We study the predictive performance of ℓ1-regularized linear regression in a model-free setting, including the case where the number of covariates is substantially larger than the sample size. We introduce a new analysis method that avoids the boundedness problems that typically arise in model-free empirical minimization. Our technique provides an answer to a conjecture of Greenshtein and Ritov (Bernoulli 10(6):971-988, 2004) regarding the persistence rate for linear regression and allows us to prove an oracle inequality for the error of the regularized minimizer. It also demonstrates that empirical risk minimization gives optimal rates (up to log factors) of convex aggregation of a set of estimators of a regression function.
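To make the abstract's terminology concrete, here is a minimal sketch of the ℓ1-constrained empirical risk minimization setup behind the persistence question of Greenshtein and Ritov; the notation (risk R, ℓ1 radius b_n, minimizer \hat\beta_n) is illustrative and need not match the paper's exact formulation.

% Illustrative notation only (R, b_n, \hat\beta_n); not necessarily the paper's exact definitions.
\[
  R(\beta) \;=\; \mathbb{E}\bigl(Y - \langle \beta, X \rangle\bigr)^{2},
  \qquad
  \hat{\beta}_{n} \;\in\; \operatorname*{arg\,min}_{\|\beta\|_{1} \le b_{n}}\,
  \frac{1}{n}\sum_{i=1}^{n}\bigl(Y_{i} - \langle \beta, X_{i} \rangle\bigr)^{2},
\]
\[
  \text{persistence:}\quad
  R\bigl(\hat{\beta}_{n}\bigr) \;-\; \inf_{\|\beta\|_{1} \le b_{n}} R(\beta)
  \;\xrightarrow{\;\mathbb{P}\;}\; 0 .
\]

In words, the estimator's risk approaches that of the best ℓ1-bounded linear predictor, and the persistence rate concerns how quickly b_n may grow with the sample size (and the number of covariates) while this convergence still holds; the oracle inequality discussed in the abstract bounds the corresponding excess risk.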