Risk-Based Robust Statistical Learning by Stochastic Difference-of-Convex Value-Function Optimization
Publication Type:
Article
Authors:
Liu, Junyi; Pang, Jong-Shi
Affiliations:
Tsinghua University; University of Southern California
Journal:
OPERATIONS RESEARCH
ISSN/ISBN:
0030-364X
DOI:
10.1287/opre.2021.2248
Publication Year:
2023
Pages:
397-414
Keywords:
value-at-risk
regression
Abstract:
This paper proposes the use of a variant of the conditional value-at-risk (CVaR) risk measure, called the interval conditional value-at-risk (In-CVaR), for the treatment of outliers in statistical learning by excluding the risks associated with the left and right tails of the loss. The risk-based robust learning task is to minimize the In-CVaR risk measure of a random functional that is the composite of a piecewise affine loss function with a potentially nonsmooth difference-of-convex statistical learning model. With the optimization formula of CVaR, the objective function of the minimization problem is the difference of two convex functions, each being the optimal objective value of a univariate convex stochastic program. An algorithm that combines sequential sampling and convexification is developed, and its subsequential almost-sure convergence to a critical point is established. Numerical experiments demonstrate the effectiveness of the In-CVaR-based estimator computed by the sampling-based algorithm for robust regression and classification. Overall, this research extends the traditional approaches for treating outliers by allowing nonsmooth and nonconvex statistical learning models, employing a population risk-based objective, and applying a sampling-based algorithm with a stationarity guarantee for solving the resulting nonconvex and nonsmooth stochastic program.
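Illustrative note: the following Python sketch shows, on an empirical sample, the difference-of-CVaR structure referred to in the abstract, with each CVaR term evaluated through the Rockafellar-Uryasev minimization formula. The function names, quantile levels, and scaling constants are illustrative assumptions, not the paper's exact In-CVaR formulation or algorithm.

import numpy as np

def empirical_cvar(losses, alpha):
    # Rockafellar-Uryasev formula: CVaR_alpha(Z) = min_eta { eta + E[(Z - eta)_+] / (1 - alpha) }.
    # For the empirical distribution the objective is piecewise linear and convex in eta
    # with breakpoints at the sample values, so scanning the sample gives the exact minimum.
    objective = lambda eta: eta + np.mean(np.maximum(losses - eta, 0.0)) / (1.0 - alpha)
    return min(objective(eta) for eta in np.unique(losses))

def empirical_in_cvar(losses, alpha1, alpha2):
    # Illustrative interval CVaR on the quantile band (alpha1, alpha2): the average loss with
    # both the left tail (below the alpha1-quantile) and the right tail (above the
    # alpha2-quantile) excluded, written as a scaled difference of two CVaR value functions.
    # The scaling below is an assumption for illustration, not necessarily the paper's formula.
    assert 0.0 <= alpha1 < alpha2 < 1.0
    return ((1.0 - alpha1) * empirical_cvar(losses, alpha1)
            - (1.0 - alpha2) * empirical_cvar(losses, alpha2)) / (alpha2 - alpha1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Well-behaved losses contaminated by a small cluster of outliers.
    losses = np.concatenate([rng.normal(1.0, 0.5, 1000), rng.normal(20.0, 1.0, 20)])
    print("mean loss        :", round(losses.mean(), 3))
    print("In-CVaR(5%, 95%) :", round(empirical_in_cvar(losses, 0.05, 0.95), 3))

In this toy example the trimmed (interval) risk stays near the loss level of the clean data, whereas the plain mean is pulled up by the outliers, which is the robustness effect the abstract describes.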
Source URL: