SLOPE IS ADAPTIVE TO UNKNOWN SPARSITY AND ASYMPTOTICALLY MINIMAX

Type:
Article
Authors:
Su, Weijie; Candes, Emmanuel
Affiliations:
Stanford University; Stanford University
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/15-AOS1397
Publication year:
2016
Pages:
1038-1068
Keywords:
false discovery rate; variable selection; model selection; regression shrinkage; inflation criterion; Dantzig selector; Lasso; recovery; risk; polytopes
Abstract:
We consider high-dimensional sparse regression problems in which we observe y = Xβ + z, where X is an n × p design matrix and z is an n-dimensional vector of independent Gaussian errors, each with variance σ². Our focus is on the recently introduced SLOPE estimator [Ann. Appl. Stat. 9 (2015) 1103–1140], which regularizes the least-squares estimate with the rank-dependent penalty Σ_{1 ≤ i ≤ p} λ_i |β̂|_(i), where |β̂|_(i) is the ith largest magnitude of the fitted coefficients. Under Gaussian designs, where the entries of X are i.i.d. N(0, 1/n), we show that SLOPE with weights λ_i just about equal to σ · Φ⁻¹(1 − iq/(2p)) [Φ⁻¹(α) is the αth quantile of a standard normal and q is a fixed number in (0, 1)] achieves a squared estimation error obeying

sup_{‖β‖₀ ≤ k} P(‖β̂_SLOPE − β‖² > (1 + ε) 2σ² k log(p/k)) → 0

as the dimension p increases to infinity, where ε > 0 is an arbitrarily small constant. This holds under a weak assumption on the ℓ₀-sparsity level, namely, k/p → 0 and (k log p)/n → 0, and is sharp in the sense that this is the best possible error any estimator can achieve. A remarkable feature is that SLOPE does not require any knowledge of the degree of sparsity, yet automatically adapts to yield optimal total squared errors over a wide range of ℓ₀-sparsity classes. We are not aware of any other estimator with this property.
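The abstract specifies the SLOPE weight sequence λ_i = σ · Φ⁻¹(1 − iq/(2p)) and the rank-dependent (sorted-ℓ1) penalty. The following is a minimal, hedged Python sketch of those two ingredients, not the authors' code; the function names `slope_weights` and `sorted_l1_penalty` are illustrative choices, and it only evaluates the penalty, it does not solve the SLOPE optimization problem.

```python
# Sketch of the SLOPE ingredients from the abstract (assumed names, not the
# authors' implementation): the BH-type weights lambda_i = sigma * Phi^{-1}(1 - i*q/(2p))
# and the sorted-L1 penalty sum_i lambda_i * |beta|_(i).
from statistics import NormalDist


def slope_weights(p, q=0.05, sigma=1.0):
    """Decreasing weights lambda_1 >= ... >= lambda_p, with
    lambda_i = sigma * Phi^{-1}(1 - i*q/(2p))."""
    nd = NormalDist()  # standard normal; inv_cdf is the quantile function Phi^{-1}
    return [sigma * nd.inv_cdf(1 - i * q / (2 * p)) for i in range(1, p + 1)]


def sorted_l1_penalty(beta, lam):
    """Rank-dependent penalty: pair the ith largest |beta_j| with lambda_i."""
    mags = sorted((abs(b) for b in beta), reverse=True)
    return sum(l * m for l, m in zip(lam, mags))


# Usage: weights decrease with rank, so the largest fitted coefficients
# receive the heaviest penalization -- the mechanism behind SLOPE's
# connection to Benjamini-Hochberg-style FDR control.
lam = slope_weights(p=5, q=0.1)
print([round(x, 3) for x in lam])
print(round(sorted_l1_penalty([3.0, -1.0, 0.0, 0.5, 2.0], lam), 3))
```

Because the penalty depends on coefficients only through their sorted magnitudes, it is invariant to sign flips and permutations of β, which is what makes the regularizer adapt to the unknown sparsity level.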
Source URL: