THOMPSON SAMPLING FOR ZERO-INFLATED COUNT OUTCOMES WITH AN APPLICATION TO THE DRINK LESS MOBILE HEALTH STUDY

Publication type:
Article
Authors:
Liu, Xueqing; Deliu, Nina; Chakraborty, Tanujit; Bell, Lauren; Chakraborty, Bibhas
Affiliations:
National University of Singapore; Sapienza University Rome; University of London; King's College London
Journal:
ANNALS OF APPLIED STATISTICS
ISSN/ISBN:
1932-6157
DOI:
10.1214/25-AOAS2030
Publication date:
2025
Pages:
1403-1425
Keywords:
identification; test; audit; likelihood; regression
Abstract:
Mobile health (mHealth) interventions often aim to improve distal outcomes, such as clinical conditions, by optimizing proximal outcomes through just-in-time adaptive interventions. Contextual bandits provide a suitable framework for customizing such interventions according to individual time-varying contexts. However, unique challenges, such as modeling count outcomes within bandit frameworks, have hindered the widespread application of contextual bandits to mHealth studies. The current work addresses this challenge by integrating count data models into online decision-making approaches. Specifically, we combine four common offline count data models (Poisson, negative binomial, zero-inflated Poisson, and zero-inflated negative binomial regressions) with Thompson sampling, a popular contextual bandit algorithm. The proposed algorithms are motivated by and evaluated on a real dataset from the Drink Less trial, where they are shown to improve user engagement with the mHealth platform. The proposed methods are further evaluated on simulated data, achieving higher cumulative proximal outcomes than existing algorithms. Theoretical results on regret bounds are also derived. The countts R package provides an implementation of our approach.
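To make the idea in the abstract concrete, below is a minimal base-R sketch of Thompson sampling when each arm's reward follows a zero-inflated Poisson (ZIP) distribution. It is an illustrative assumption, not the paper's regression-based (contextual) algorithms and not the countts package API: the Beta/Gamma priors, the data-augmentation Gibbs step, the helper name draw_zip_posterior, and the two-arm simulated environment are all hypothetical choices made for this sketch.

## Illustrative sketch only (not the countts API): Thompson sampling for a
## two-arm bandit with zero-inflated Poisson (ZIP) rewards.
## Assumed per-arm Bayesian model:
##   y ~ ZIP(pi, lambda), pi ~ Beta(a0, b0), lambda ~ Gamma(s0, r0).
## Posterior draws come from a short data-augmentation Gibbs run that
## imputes whether each observed zero is a structural zero.

set.seed(2025)

# One posterior draw of (pi, lambda) for a single arm given its rewards y.
draw_zip_posterior <- function(y, a0 = 1, b0 = 1, s0 = 1, r0 = 1,
                               n_gibbs = 50) {
  pi_cur  <- rbeta(1, a0, b0)
  lam_cur <- rgamma(1, s0, rate = r0)
  if (length(y) == 0) return(c(pi = pi_cur, lambda = lam_cur))
  for (g in seq_len(n_gibbs)) {
    # Latent indicator z = 1 if a zero comes from the structural-zero component.
    p_struct <- pi_cur / (pi_cur + (1 - pi_cur) * exp(-lam_cur))
    z <- ifelse(y == 0, rbinom(length(y), 1, p_struct), 0L)
    # Conjugate updates given the imputed indicators.
    pi_cur  <- rbeta(1, a0 + sum(z), b0 + sum(1 - z))
    lam_cur <- rgamma(1, s0 + sum(y[z == 0]), rate = r0 + sum(1 - z))
  }
  c(pi = pi_cur, lambda = lam_cur)
}

# Simulated environment: true ZIP parameters for two hypothetical arms.
true_pi     <- c(0.6, 0.3)   # structural-zero probabilities
true_lambda <- c(4.0, 3.0)   # Poisson means of the count component
n_arms  <- 2
horizon <- 500
rewards <- vector("list", n_arms)
chosen  <- integer(horizon)

for (t in seq_len(horizon)) {
  # Thompson sampling: draw parameters for each arm, rank by posterior mean reward.
  means <- sapply(seq_len(n_arms), function(k) {
    theta <- draw_zip_posterior(rewards[[k]])
    (1 - theta["pi"]) * theta["lambda"]   # E[Y] under a ZIP model
  })
  a <- which.max(means)
  chosen[t] <- a
  # Generate a ZIP reward from the chosen arm and record it.
  y <- if (runif(1) < true_pi[a]) 0L else rpois(1, true_lambda[a])
  rewards[[a]] <- c(rewards[[a]], y)
}

# Proportion of pulls allocated to the arm with the higher mean reward.
mean(chosen == which.max((1 - true_pi) * true_lambda))

Under these assumed parameters the second arm has the larger mean reward (0.7 * 3.0 = 2.1 versus 0.4 * 4.0 = 1.6), so the final line should approach 1 as the posterior concentrates; the paper's contextual versions additionally regress the count outcome on time-varying covariates.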
Source URL: