Robust Validation: Confident Predictions Even When Distributions Shift
Publication type:
Article
Authors:
Cauchois, Maxime; Gupta, Suyash; Ali, Alnur; Duchi, John C.
Affiliations:
Stanford University
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN/ISBN:
0162-1459
DOI:
10.1080/01621459.2023.2298037
Publication date:
2024
Pages:
3033-3044
Keywords:
sensitivity
Abstract:
While the traditional viewpoint in machine learning and statistics assumes that training and testing samples come from the same population, practice belies this fiction. One strategy, coming from robust statistics and optimization, is to build a model robust to distributional perturbations. In this article, we take a different approach and describe procedures for robust predictive inference, where a model provides uncertainty estimates on its predictions rather than point predictions. We present a method that produces prediction sets (almost exactly) giving the right coverage level for any test distribution in an f-divergence ball around the training population. The method, based on conformal inference, achieves (nearly) valid coverage in finite samples, under only the condition that the training data be exchangeable. An essential component of our methodology is to estimate the amount of expected future data shift and to build robustness to it; we develop estimators of this shift and prove their consistency, which ensures the protection and validity of the uncertainty estimates under shifts. Through experiments on several large-scale benchmark datasets, including Recht et al.'s CIFAR-v4 and ImageNet-V2 datasets, we provide complementary empirical results that highlight the importance of robust predictive validity. Supplementary materials for this article are available online.
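
To make the quantile-inflation idea behind such robust conformal procedures concrete, the following is a minimal Python sketch, not the paper's exact construction. It assumes split conformal prediction with an absolute-residual score, and it specializes to the chi-square divergence f(t) = (t - 1)^2, for which the worst-case coverage of an event over the divergence ball has a simple closed form; the function names (worst_case_coverage, robust_quantile_level, robust_conformal_threshold) are illustrative.

    # Sketch of robust split conformal prediction under a chi-square
    # divergence ball of radius rho around the training population.
    # Illustrative only; assumes exchangeable calibration data.
    import numpy as np

    def worst_case_coverage(beta, rho):
        # For f(t) = (t - 1)^2, the smallest probability that a
        # distribution P with D_f(P || Q) <= rho can assign to an
        # event of Q-probability beta is beta - sqrt(rho*beta*(1-beta)),
        # clipped at zero (a standard binary reduction for f-divergences).
        return max(0.0, beta - np.sqrt(rho * beta * (1.0 - beta)))

    def robust_quantile_level(alpha, rho, tol=1e-8):
        # Invert the worst-case coverage map by bisection: find the
        # smallest nominal level beta whose worst-case coverage under
        # any shift in the ball is still at least 1 - alpha.
        lo, hi = 1.0 - alpha, 1.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if worst_case_coverage(mid, rho) >= 1.0 - alpha:
                hi = mid
            else:
                lo = mid
        return hi

    def robust_conformal_threshold(scores, alpha, rho):
        # Empirical quantile of the calibration scores at the inflated
        # level, with the usual (n + 1) split-conformal correction.
        n = len(scores)
        beta = robust_quantile_level(alpha, rho)
        k = min(n - 1, int(np.ceil((n + 1) * beta)) - 1)
        return np.sort(scores)[k]

    # Toy usage: scores are absolute residuals |y - mu(x)| on a held-out
    # calibration set; the prediction set at a new x is then
    # {y : |y - mu(x)| <= tau}, with coverage >= 1 - alpha for any test
    # distribution in the chi-square ball of radius rho.
    rng = np.random.default_rng(0)
    cal_scores = np.abs(rng.normal(size=1000))
    tau = robust_conformal_threshold(cal_scores, alpha=0.1, rho=0.05)
    print(f"robust threshold: {tau:.3f}")

Setting rho = 0 recovers the standard split-conformal quantile at level 1 - alpha; larger rho inflates the quantile level, trading wider prediction sets for validity under larger anticipated shifts, which is the role the abstract assigns to estimating the expected amount of future shift.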