Off-Policy Confidence Interval Estimation with Confounded Markov Decision Process

Publication Type:
Article
Authors:
Shi, Chengchun; Zhu, Jin; Ye, Shen; Luo, Shikai; Zhu, Hongtu; Song, Rui
Affiliations:
University of London; London School of Economics & Political Science; Sun Yat-sen University; North Carolina State University; University of North Carolina; University of North Carolina Chapel Hill
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.1080/01621459.2022.2110878
Publication Date:
2024
Pages:
273-284
Keywords:
dynamic treatment regimes
摘要:
This article is concerned with constructing a confidence interval for a target policy's value offline, based on pre-collected observational data in infinite-horizon settings. Most existing works assume that no unmeasured variables confound the observed actions. This assumption, however, is likely to be violated in real applications such as healthcare and the technology industry. In this article, we show that with some auxiliary variables that mediate the effect of actions on the system dynamics, the target policy's value is identifiable in a confounded Markov decision process. Based on this result, we develop an efficient off-policy value estimator that is robust to potential model misspecification, and provide rigorous uncertainty quantification. Our method is justified by theoretical results, simulated data, and real datasets obtained from ridesharing companies. A Python implementation of the proposed procedure is available at https://github.com/Mamba413/cope. Supplementary materials for this article are available online.
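The abstract centers on interval estimation for an off-policy value. As a minimal, hypothetical sketch (not the authors' estimator and not the API of the linked cope package), the snippet below illustrates the final step only: forming a Wald-type confidence interval from per-trajectory pseudo-value estimates that some off-policy estimator is assumed to have already produced.

```python
# Illustrative sketch only; assumes an upstream off-policy estimator has
# produced (approximately) i.i.d. per-trajectory pseudo-values for the
# target policy's value. This is NOT the cope package's interface.
import numpy as np
from scipy import stats


def wald_ci(pseudo_values, alpha=0.05):
    """Point estimate and (1 - alpha) normal-approximation CI."""
    pv = np.asarray(pseudo_values, dtype=float)
    n = pv.size
    est = pv.mean()                       # value estimate
    se = pv.std(ddof=1) / np.sqrt(n)      # standard error of the mean
    z = stats.norm.ppf(1 - alpha / 2)     # normal critical value
    return est, (est - z * se, est + z * se)


# Usage with synthetic pseudo-values (stand-ins for estimator output)
rng = np.random.default_rng(0)
est, (lo, hi) = wald_ci(rng.normal(loc=1.2, scale=0.5, size=200))
print(f"value estimate {est:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```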