Finite-Sample Guarantees for Wasserstein Distributionally Robust Optimization: Breaking the Curse of Dimensionality
Publication type:
Article
Authors:
Gao, Rui
Affiliations:
University of Texas System; University of Texas Austin
Journal:
OPERATIONS RESEARCH
ISSN/ISBN:
0030-364X
DOI:
10.1287/opre.2022.2326
Publication year:
2023
Keywords:
distance
inequalities
Abstract:
Wasserstein distributionally robust optimization (DRO) aims to find robust and generalizable solutions by hedging against data perturbations in Wasserstein distance. Despite its recent empirical success in operations research and machine learning, existing performance guarantees for generic loss functions are either overly conservative because of the curse of dimensionality or plausible only in large-sample asymptotics. In this paper, we develop a nonasymptotic framework for analyzing the out-of-sample performance for Wasserstein robust learning and the generalization bound for its related Lipschitz and gradient regularization problems. To the best of our knowledge, this gives the first finite-sample guarantee for generic Wasserstein DRO problems without suffering from the curse of dimensionality. Our results highlight that Wasserstein DRO, with a properly chosen radius, balances between the empirical mean of the loss and the variation of the loss, measured by the Lipschitz norm or the gradient norm of the loss. Our analysis is based on two novel methodological developments that are of independent interest: (1) a new concentration inequality controlling the decay rate of large deviation probabilities by the variation of the loss and (2) a localized Rademacher complexity theory based on the variation of the loss.
Source URL:
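The abstract's central point, that Wasserstein DRO with a well-chosen radius trades off the empirical mean of the loss against its variation, can be illustrated with a small numerical sketch. The code below is not the paper's method; it is a hedged illustration, assuming a squared loss on linear-regression data, where the "variation" term is taken to be the average norm of the loss gradient with respect to each data point (Wasserstein DRO perturbs the data, so that gradient measures sensitivity to such perturbations). All function names are hypothetical.

```python
import numpy as np

def sq_loss_and_data_grads(theta, X, y):
    """Per-sample squared loss and its gradient w.r.t. each data point x_i.
    Since Wasserstein DRO hedges against perturbations of the data, the
    relevant variation is the gradient in the data argument, not in theta."""
    r = X @ theta - y                             # residuals
    losses = r ** 2
    grads = 2.0 * r[:, None] * theta[None, :]     # d/dx (x' theta - y)^2
    return losses, grads

def dro_surrogate(theta, X, y, radius):
    """Illustrative surrogate objective: empirical mean of the loss plus the
    Wasserstein radius times the average data-gradient norm. This mirrors the
    mean/variation balance highlighted in the abstract."""
    losses, grads = sq_loss_and_data_grads(theta, X, y)
    mean_loss = losses.mean()
    variation = np.linalg.norm(grads, axis=1).mean()
    return mean_loss + radius * variation

# Usage: a larger radius places more weight on the variation penalty,
# favoring parameters whose loss is less sensitive to data perturbations.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
theta = np.ones(3)
y = X @ theta + 0.1 * rng.normal(size=50)
obj_plain = dro_surrogate(theta, X, y, radius=0.0)   # empirical mean only
obj_robust = dro_surrogate(theta, X, y, radius=0.5)  # mean + variation
```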