Slowly Varying Regression Under Sparsity
Publication type:
Article
Authors:
Bertsimas, Dimitris; Digalakis Jr, Vassilis; Li, Michael Lingzhi; Lami, Omar Skali
Affiliations:
Massachusetts Institute of Technology (MIT); Hautes Etudes Commerciales (HEC) Paris; Harvard University; McKinsey & Company
刊物名称:
OPERATIONS RESEARCH
ISSN/ISBN:
0030-364X
DOI:
10.1287/opre.2022.0330
Publication date:
2025
Keywords:
models
algorithm
selection
subset
Abstract:
We introduce the framework of slowly varying regression under sparsity, which allows sparse regression models to vary slowly and sparsely. We formulate the problem of parameter estimation as a mixed-integer optimization problem and demonstrate that it can be reformulated exactly as a binary convex optimization problem through a novel relaxation. The relaxation utilizes a new equality involving Moore-Penrose inverses that convexifies the nonconvex objective function while coinciding with the original objective on all feasible binary points. This allows us to solve the problem significantly more efficiently and to provable optimality using a cutting plane-type algorithm. We develop a highly optimized implementation of this algorithm, which substantially improves upon the asymptotic computational complexity of a straightforward implementation. We further develop a fast heuristic method that is guaranteed to produce a feasible solution and, as we empirically illustrate, generates high-quality warm-start solutions for the binary optimization problem. To tune the framework's hyperparameters, we propose a practical procedure relying on binary search that, under certain assumptions, is guaranteed to recover the true model parameters. We show, on both synthetic and real-world data sets, that the resulting algorithm outperforms competing formulations in comparable times across a variety of metrics, including estimation accuracy, predictive power, and computational time, and is highly scalable, enabling us to train models with tens of thousands of parameters.
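To make the abstract's setup concrete, here is a minimal sketch of one stylized instance of slowly varying regression under sparsity. The notation is illustrative, not the paper's exact model: T is the number of periods, λ weights the slow-variation penalty, and k and k_Δ are assumed sparsity and support-change budgets.

```latex
% Stylized formulation (illustrative notation, not the paper's exact model):
% one coefficient vector per period, a penalty on changes between consecutive
% periods (slow variation), and bounds on both the per-period support size
% and on how much the support may change (sparse variation).
\begin{aligned}
\min_{\beta_1,\dots,\beta_T}\quad
  & \sum_{t=1}^{T} \lVert y_t - X_t \beta_t \rVert_2^2
    + \lambda \sum_{t=2}^{T} \lVert \beta_t - \beta_{t-1} \rVert_2^2 \\
\text{s.t.}\quad
  & \lVert \beta_t \rVert_0 \le k,
    \qquad t = 1,\dots,T, \\
  & \bigl\lvert \operatorname{supp}(\beta_t) \,\triangle\, \operatorname{supp}(\beta_{t-1}) \bigr\rvert \le k_\Delta,
    \qquad t = 2,\dots,T.
\end{aligned}
```

The binary support indicators implicit in the ℓ0 constraints are what make this a mixed-integer optimization problem of the kind the abstract describes.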
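The cutting plane-type algorithm the abstract mentions can be illustrated on the simpler static sparse ridge regression problem, where the well-known ridge-dual identity c(s) = ½ yᵀ(I + γ X diag(s) Xᵀ)⁻¹ y gives the inner least-squares value as a convex function of the binary support vector s, so the master problem can be solved by outer approximation with lazy first-order cuts. The sketch below shows only this generic scheme, not the paper's slowly varying formulation or its optimized implementation; gurobipy and the parameters γ and k are assumptions.

```python
# Minimal sketch of a cutting-plane (outer-approximation) scheme for static
# sparse ridge regression; NOT the paper's slowly varying formulation.
import numpy as np
import gurobipy as gp
from gurobipy import GRB

def c_and_grad(s, X, y, gamma):
    """Value and gradient of c(s) = 0.5 * y' (I + gamma * X diag(s) X')^{-1} y,
    the optimal ridge-regularized least-squares value restricted to support s."""
    m = X.shape[0]
    K = np.eye(m) + gamma * (X * s) @ X.T       # I + gamma * X diag(s) X'
    alpha = np.linalg.solve(K, y)
    val = 0.5 * y @ alpha
    grad = -0.5 * gamma * (X.T @ alpha) ** 2    # d c / d s_j = -(gamma/2)(x_j' alpha)^2
    return val, grad

def sparse_regression_cutting_plane(X, y, k, gamma=1.0, tol=1e-6):
    n = X.shape[1]
    model = gp.Model()
    model.Params.LazyConstraints = 1
    s = model.addVars(n, vtype=GRB.BINARY, name="s")
    eta = model.addVar(lb=0.0, name="eta")       # c(s) >= 0, so 0 is a valid lower bound
    model.addConstr(gp.quicksum(s[j] for j in range(n)) <= k)  # sparsity budget
    model.setObjective(eta, GRB.MINIMIZE)

    def callback(mod, where):
        # At each integer incumbent, add a first-order cut eta >= c(s0) + grad'(s - s0);
        # the cut is globally valid because c is convex on [0, 1]^n.
        if where == GRB.Callback.MIPSOL:
            s0 = np.array([mod.cbGetSolution(s[j]) for j in range(n)])
            val, grad = c_and_grad(s0, X, y, gamma)
            if mod.cbGetSolution(eta) < val - tol:
                mod.cbLazy(eta >= val + gp.quicksum(grad[j] * (s[j] - s0[j])
                                                    for j in range(n)))

    model.optimize(callback)
    return np.array([s[j].X for j in range(n)])
```

Each integer incumbent the solver finds triggers a lazy cut that tightens the piecewise-linear underestimator of c, so the loop converges to the exact binary optimum; the abstract's warm-start heuristic and optimized implementation speed up exactly this kind of scheme.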