DISCREPANCY BOUNDS FOR UNIFORMLY ERGODIC MARKOV CHAIN QUASI-MONTE CARLO

Publication type:
Article
Authors:
Dick, Josef; Rudolf, Daniel; Zhu, Houying
Affiliations:
University of New South Wales Sydney; Friedrich Schiller University Jena
Journal:
ANNALS OF APPLIED PROBABILITY
ISSN:
1050-5164
DOI:
10.1214/16-AAP1173
Publication date:
2016
Pages:
3178-3205
Keywords:
convergence; sphere; inequality; numbers; points; rates
Abstract:
Markov chains can be used to generate samples whose distribution approximates a given target distribution. The quality of such samples can be measured by the discrepancy between the empirical distribution of the samples and the target distribution. We prove upper bounds on this discrepancy under the assumption that the Markov chain is uniformly ergodic and that the driver sequence is deterministic rather than a sequence of independent U(0, 1) random variables. In particular, we show the existence of driver sequences for which the discrepancy of the Markov chain from the target distribution with respect to certain test sets converges at (almost) the usual Monte Carlo rate of n^{-1/2}.
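The setup described in the abstract (a Markov chain driven by a deterministic sequence instead of i.i.d. U(0, 1) variables, with quality measured by the discrepancy between empirical and target distributions) can be illustrated with a minimal one-dimensional sketch. The choices below are assumptions for illustration only, not the construction from the paper: a van der Corput sequence as the driver, an independence Metropolis chain targeting Beta(2, 1) (density 2x on [0, 1]), and intervals [0, t] as the test sets, which makes the discrepancy a Kolmogorov-Smirnov-style statistic.

```python
def van_der_corput(n, base=2):
    """Radical inverse of n in the given base: a deterministic point in [0, 1)."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, r = divmod(n, base)
        q += r * bk
        bk /= base
    return q

def mcmc_with_driver(n_samples, driver, x0=0.5):
    """Independence Metropolis chain on [0, 1] targeting Beta(2, 1)
    (density 2x, chosen here purely for illustration). Each step consumes
    two driver numbers: one for the proposal, one for accept/reject."""
    def density(x):
        return 2.0 * x
    xs, x = [], x0
    for k in range(n_samples):
        y = driver(2 * k)          # proposal draw, plays the role of U(0, 1)
        u_acc = driver(2 * k + 1)  # acceptance draw
        alpha = min(1.0, density(y) / density(x))
        if u_acc < alpha:
            x = y
        xs.append(x)
    return xs

def star_discrepancy_1d(xs, cdf):
    """Discrepancy of the empirical distribution of xs from the target,
    with test sets the intervals [0, t] (sup over t of the gap between
    empirical CDF and target CDF)."""
    xs = sorted(xs)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d
```

For example, `star_discrepancy_1d(mcmc_with_driver(512, van_der_corput), lambda t: t * t)` measures how far the chain's empirical distribution is from the Beta(2, 1) target; the paper's result concerns how fast such discrepancies can decay in n for well-chosen driver sequences.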
Source URL: