Monte Carlo Approximation of Bayes Factors via Mixing With Surrogate Distributions

Publication type:
Article
Authors:
Dai, Chenguang; Liu, Jun S.
Affiliation:
Harvard University
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.1080/01621459.2020.1811100
Publication year:
2022
Pages:
765-780
Keywords:
Wang-Landau algorithm; marginal likelihood; normalizing constants; convergence; inference; efficient models
Abstract:
By mixing the target posterior distribution with a surrogate distribution whose normalizing constant is tractable, we propose a method for estimating the marginal likelihood using the Wang-Landau algorithm. We show that faster convergence of the proposed method can be achieved via momentum acceleration. Two implementation strategies are detailed: (i) facilitating global jumps between the posterior and surrogate distributions via multiple-try Metropolis (MTM); (ii) constructing the surrogate via variational approximation. When a suitable surrogate is difficult to come by, we describe a new jumping mechanism for general reversible jump Markov chain Monte Carlo algorithms, which combines MTM and a directional sampling algorithm. We illustrate the proposed methods on several statistical models, including the log-Gaussian Cox process, the Bayesian Lasso, logistic regression, and g-prior Bayesian variable selection. Supplementary materials for this article are available online.
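To make the abstract's core idea concrete, the sketch below illustrates, on a toy problem, how augmenting the parameter space with a label that switches between an unnormalized posterior and a normalized surrogate, and letting a Wang-Landau scheme adapt the relative label weights, yields a marginal-likelihood estimate. This is a minimal sketch and not the authors' implementation: the conjugate normal model, the Gaussian surrogate, the plain Metropolis moves, the step-size schedule, and all names such as log_post_unnorm and log_surrogate are illustrative assumptions (the paper uses, e.g., variational surrogates and multiple-try Metropolis jumps).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy conjugate model: y_i ~ N(theta, 1), theta ~ N(0, 1), with n observations,
# so the exact log marginal likelihood is available in closed form for comparison.
y = rng.normal(1.0, 1.0, size=20)
n, S = len(y), y.sum()

def log_post_unnorm(theta):
    # log prior + log likelihood: the unnormalized posterior density.
    return norm.logpdf(theta, 0.0, 1.0) + norm.logpdf(y, theta, 1.0).sum()

# Surrogate: a normalized Gaussian roughly matching the posterior (standing in
# for a variational or Laplace approximation), with a slightly inflated scale.
mu_s, sd_s = S / (n + 1.0), 1.5 * np.sqrt(1.0 / (n + 1.0))
def log_surrogate(theta):
    return norm.logpdf(theta, mu_s, sd_s)

log_dens = [log_post_unnorm, log_surrogate]  # label 0 = posterior, 1 = surrogate
log_w = np.zeros(2)                          # Wang-Landau log-weight estimates
theta, label = mu_s, 0

for t in range(1, 50001):
    # Within-label random-walk Metropolis move on theta.
    prop = theta + 0.5 * rng.normal()
    if np.log(rng.uniform()) < log_dens[label](prop) - log_dens[label](theta):
        theta = prop
    # Between-label jump at fixed theta, penalized by the current weight estimates.
    new = 1 - label
    log_acc = (log_dens[new](theta) - log_w[new]) - (log_dens[label](theta) - log_w[label])
    if np.log(rng.uniform()) < log_acc:
        label = new
    # Wang-Landau update: inflate the weight of the label just visited, with a
    # decaying step size (a simple stand-in for flat-histogram schedules).
    log_w[label] += 1.0 / max(1.0, t / 100.0)

# The surrogate is normalized, so the adapted weight ratio estimates log m(y).
est = log_w[0] - log_w[1]
exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1.0)
         - 0.5 * (np.sum(y**2) - S**2 / (n + 1.0)))
print(f"Wang-Landau estimate: {est:.3f}   exact log marginal: {exact:.3f}")
```

The design point the sketch tries to convey is the one the abstract highlights: because the surrogate's normalizing constant is known, balancing the chain's occupancy of the two labels turns the adapted weight ratio directly into an estimate of the posterior's normalizing constant, i.e., the marginal likelihood.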