Recovering Best Statistical Guarantees via the Empirical Divergence-Based Distributionally Robust Optimization
Publication type:
Article
Author:
Lam, Henry
Affiliation:
Columbia University
Journal:
OPERATIONS RESEARCH
ISSN/ISBN:
0030-364X
DOI:
10.1287/opre.2018.1786
Publication year:
2019
Pages:
1090-1105
Keywords:
likelihood
approximation
sensitivity
uncertainty
simulation
Abstract:
We investigate the use of distributionally robust optimization (DRO) as a tractable tool to recover the asymptotic statistical guarantees provided by the central limit theorem for maintaining the feasibility of an expected-value constraint under ambiguous probability distributions. We show that DRO formulations built from empirically defined Burg-entropy divergence balls attain such guarantees. These balls, however, cannot be justified by the standard data-driven DRO framework because, by themselves, they can have low or even zero probability of covering the true distribution. Rather, their superior statistical performance is endowed by linking the resulting DRO with empirical likelihood and empirical processes. We show that the sizes of these balls can be optimally calibrated using χ²-process excursion. We conduct numerical experiments to support our theoretical findings.
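Illustrative sketch (not from the paper): a minimal Python example of the worst-case expectation over an empirically defined Burg-entropy divergence ball. The function name worst_case_expectation, the use of cvxpy and scipy, and the simulated data are all assumptions for illustration. The ball radius here uses the single-constraint empirical-likelihood calibration χ²_{1,1-α}/(2n); the paper's general calibration, based on χ²-process excursion, reduces to a quantile of this form only in the simplest scalar case.

```python
# Hypothetical sketch: worst-case expectation of g(xi) over a
# Burg-entropy divergence ball centered at the empirical distribution.
import numpy as np
import cvxpy as cp
from scipy.stats import chi2

def worst_case_expectation(g, alpha=0.05):
    """Upper bound on E[g(xi)] over the empirical Burg-entropy ball.

    g     : array of g(xi_i) evaluated at the n data points
    alpha : 1 - alpha is the target confidence level
    """
    n = len(g)
    # Ball radius eta = chi^2_{1,1-alpha} / (2n): the empirical-likelihood
    # calibration for a single expected-value constraint (simplified case).
    eta = chi2.ppf(1 - alpha, df=1) / (2 * n)

    p = cp.Variable(n, nonneg=True)  # reweighted distribution on the data
    constraints = [
        cp.sum(p) == 1,
        # Burg-entropy divergence from the uniform empirical weights 1/n
        -cp.sum(cp.log(n * p)) / n <= eta,
    ]
    problem = cp.Problem(cp.Maximize(p @ g), constraints)
    problem.solve()
    return problem.value

# Assumed usage on simulated data.
rng = np.random.default_rng(0)
data = rng.normal(size=200)
print(worst_case_expectation(data))
```

The inner maximization is convex (a linear objective with a concave-log constraint), so off-the-shelf conic solvers handle it directly; in a full DRO formulation this worst-case value would appear inside the expected-value constraint being enforced.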