Subset Selection with Shrinkage: Sparse Linear Modeling When the SNR Is Low

Publication type:
Article
Authors:
Mazumder, Rahul; Radchenko, Peter; Dedieu, Antoine
Affiliations:
Massachusetts Institute of Technology (MIT); University of Sydney
Journal:
OPERATIONS RESEARCH
ISSN:
0030-364X
DOI:
10.1287/opre.2022.2276
Publication year:
2023
Pages:
129-147
Keywords:
variable selection; optimal rates; regression; Lasso; persistence; regularization; optimization; prediction; recovery; bounds
Abstract:
We study a seemingly unexpected and relatively less understood overfitting aspect of a fundamental tool in sparse linear modeling: best subset selection, which minimizes the residual sum of squares subject to a constraint on the number of nonzero coefficients. Whereas best subset selection is often perceived as the gold standard in sparse learning when the signal-to-noise ratio (SNR) is high, its predictive performance deteriorates when the SNR is low. In particular, it is outperformed by continuous shrinkage methods, such as ridge regression and the Lasso. We investigate the behavior of best subset selection in high-noise regimes and propose an alternative approach based on a regularized version of the least-squares criterion. Our proposed estimators (a) mitigate, to a large extent, the poor predictive performance of best subset selection in high-noise regimes; and (b) perform favorably, while generally delivering substantially sparser models, relative to the best predictive models available via ridge regression and the Lasso. We conduct an extensive theoretical analysis of the predictive properties of the proposed approach and provide justification for its superior predictive performance relative to best subset selection when the noise level is high. Our estimators can be expressed as solutions to mixed-integer second-order conic optimization problems and, hence, are amenable to modern computational tools from mathematical optimization.
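To make the criterion in the abstract concrete, the following is a minimal brute-force sketch of subset selection combined with ridge-style shrinkage: it minimizes the residual sum of squares plus an L2 penalty over all supports of size k. This is an illustrative toy (the function name `best_subset_ridge`, the penalty weight `lam`, and the synthetic data are all my assumptions), not the authors' mixed-integer second-order conic formulation, and enumeration is feasible only for small numbers of predictors.

```python
import itertools
import numpy as np

def best_subset_ridge(X, y, k, lam):
    """Toy solver for: min ||y - X b||^2 + lam * ||b||^2  s.t. ||b||_0 <= k.

    Enumerates every size-k support and solves the ridge-regularized
    least-squares problem restricted to that support. Exponential in p,
    so only usable for small problems; shown purely for illustration.
    """
    n, p = X.shape
    best_obj, best_beta = np.inf, None
    for S in itertools.combinations(range(p), k):
        idx = list(S)
        XS = X[:, idx]
        # closed-form ridge solution on the restricted design
        b = np.linalg.solve(XS.T @ XS + lam * np.eye(k), XS.T @ y)
        obj = np.sum((y - XS @ b) ** 2) + lam * np.sum(b ** 2)
        if obj < best_obj:
            best_obj = obj
            best_beta = np.zeros(p)
            best_beta[idx] = b
    return best_beta, best_obj

# Hypothetical low-SNR example: 3 true signals among 8 predictors, heavy noise.
rng = np.random.default_rng(0)
n, p, k = 50, 8, 3
beta_true = np.zeros(p)
beta_true[:k] = 1.0
X = rng.standard_normal((n, p))
y = X @ beta_true + 2.0 * rng.standard_normal(n)  # large noise level

beta_hat, obj = best_subset_ridge(X, y, k, lam=1.0)
print("selected support:", np.flatnonzero(beta_hat))
```

The shrinkage term is what distinguishes this from plain best subset selection: on each candidate support the coefficients are pulled toward zero, which is the mechanism the abstract credits for better prediction when the noise level is high.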
Source URL: