REGULARIZATION AND THE SMALL-BALL METHOD I: SPARSE RECOVERY

Publication type:
Article
Authors:
Lecue, Guillaume; Mendelson, Shahar
Affiliations:
Institut Polytechnique de Paris; ENSAE Paris; Centre National de la Recherche Scientifique (CNRS); Universite Paris Saclay; Technion Israel Institute of Technology; Australian National University
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/17-AOS1562
Publication year:
2018
Pages:
611-641
Keywords:
variable selection; Lasso; bounds; SLOPE
Abstract:
We obtain bounds on estimation error rates for regularization procedures of the form $\hat{f} \in \operatorname{argmin}_{f \in F}\bigl(\frac{1}{N}\sum_{i=1}^{N}(Y_i - f(X_i))^2 + \lambda\Psi(f)\bigr)$ when $\Psi$ is a norm and $F$ is convex. Our approach gives a common framework that may be used in the analysis of learning problems and regularization problems alike. In particular, it sheds some light on the role various notions of sparsity play in regularization and on their connection with the size of the subdifferential of $\Psi$ in a neighborhood of the true minimizer. As a proof of concept, we extend the known estimates for LASSO, SLOPE and trace-norm regularization.
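The objective in the abstract is empirical squared loss plus a norm penalty. A minimal sketch of that objective for a linear class, with the $\ell_1$ norm (LASSO) and the sorted-$\ell_1$ norm (SLOPE) as two choices of $\Psi$ (this is only an illustration of the functional being minimized, not the authors' analysis or a solver; all names here are illustrative):

```python
import numpy as np

def penalized_objective(beta, X, Y, lam, penalty):
    """(1/N) * sum_i (Y_i - <X_i, beta>)^2 + lam * Psi(beta)."""
    N = len(Y)
    residual = Y - X @ beta
    return residual @ residual / N + lam * penalty(beta)

def l1_norm(beta):
    """LASSO penalty: Psi(beta) = ||beta||_1."""
    return np.abs(beta).sum()

def slope_norm(beta, weights):
    """SLOPE penalty: sorted-l1 norm, sum_i w_i * |beta|_(i),
    where |beta|_(1) >= |beta|_(2) >= ... and w is nonincreasing."""
    return np.sort(np.abs(beta))[::-1] @ weights
```

For example, with `X = np.eye(2)`, `Y = np.zeros(2)`, `beta = [1, -2]` and `lam = 1`, the LASSO objective evaluates to `5/2 + 3 = 5.5`.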