A Computational Framework for Multivariate Convex Regression and Its Variants
Publication Type:
Article
Authors:
Mazumder, Rahul; Choudhury, Arkopal; Iyengar, Garud; Sen, Bodhisattva
Affiliations:
Massachusetts Institute of Technology (MIT); University of North Carolina; University of North Carolina Chapel Hill; Columbia University
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.1080/01621459.2017.1407771
Publication Date:
2019
Pages:
318-331
Keywords:
nonparametric-estimation
estimators
Abstract:
We study the nonparametric least squares estimator (LSE) of a multivariate convex regression function. The LSE, given as the solution to a quadratic program with O(n²) linear constraints (n being the sample size), is difficult to compute for large problems. Exploiting problem-specific structure, we propose a scalable algorithmic framework based on the augmented Lagrangian method to compute the LSE. We develop a novel approach to obtain smooth convex approximations to the fitted (piecewise affine) convex LSE and provide formal bounds on the quality of approximation. When the number of samples is not too large compared to the dimension of the predictor, we propose a regularization scheme, Lipschitz convex regression, in which we constrain the norm of the subgradients, and we study the rates of convergence of the resulting LSE. Our algorithmic framework is simple and flexible and can be easily adapted to handle variants: estimation of a nondecreasing/nonincreasing convex/concave function, with or without a Lipschitz bound. We perform numerical studies illustrating the scalability of the proposed algorithm; on some instances, our proposal leads to more than a 10,000-fold improvement in runtime compared to off-the-shelf interior point solvers for problems with n = 500.
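Illustrative sketch (not from the paper): to make the quadratic program described in the abstract concrete, the following Python snippet formulates the convex regression LSE, together with the optional Lipschitz subgradient-norm constraint, as an optimization problem with O(n²) linear constraints using the cvxpy modeling library. This is a minimal sketch solved with a generic solver, not the authors' augmented Lagrangian implementation; the function name and synthetic data are hypothetical.

    # Minimal sketch of the convex regression LSE as a QP (hypothetical helper,
    # not the authors' augmented Lagrangian solver).
    import numpy as np
    import cvxpy as cp

    def convex_lse(X, y, lipschitz_bound=None):
        """theta_i estimates f(x_i); xi_i is a subgradient of the fit at x_i."""
        n, d = X.shape
        theta = cp.Variable(n)
        xi = cp.Variable((n, d))  # one subgradient per sample point
        constraints = []
        for i in range(n):
            for j in range(n):
                if i != j:
                    # Convexity: theta_j >= theta_i + xi_i^T (x_j - x_i)
                    constraints.append(theta[j] >= theta[i] + xi[i] @ (X[j] - X[i]))
        if lipschitz_bound is not None:
            # Lipschitz variant: bound the norm of each subgradient
            constraints += [cp.norm(xi[i], 2) <= lipschitz_bound for i in range(n)]
        prob = cp.Problem(cp.Minimize(cp.sum_squares(y - theta)), constraints)
        prob.solve()
        return theta.value, xi.value

    # Example usage on a small synthetic problem
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 2))
    y = np.sum(X**2, axis=1) + 0.1 * rng.normal(size=50)
    theta_hat, xi_hat = convex_lse(X, y)

With roughly n(n-1) constraints, generic solvers become slow as n grows, which is the motivation for the scalable augmented Lagrangian framework proposed in the paper.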