Learn From Thy Neighbor: Parallel-Chain and Regional Adaptive MCMC

Publication type:
Article
Authors:
Craiu, Radu V.; Rosenthal, Jeffrey; Yang, Chao
Affiliation:
University of Toronto
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.1198/jasa.2009.tm08393
Publication year:
2009
Pages:
1454-1466
Keywords:
Monte Carlo; convergence; algorithm; ergodicity
Abstract:
Starting with the seminal paper of Haario, Saksman, and Tamminen (2001), a substantial amount of work has been done to validate adaptive Markov chain Monte Carlo algorithms. In this paper we focus on two practical aspects of adaptive Metropolis samplers. First, we draw attention to the deficient performance of standard adaptation when the target distribution is multimodal. We propose a parallel-chain adaptation strategy that incorporates multiple Markov chains run in parallel. Second, we note that the current adaptive MCMC paradigm implicitly assumes that the adaptation is uniformly efficient on all regions of the state space. However, in many practical instances, different optimal kernels are needed in different regions of the state space. We propose here a regional adaptation algorithm that accounts for possible errors made in defining the adaptation regions; this corresponds to the more realistic case in which one does not know exactly the optimal regions for adaptation. The methods focus on the random-walk Metropolis sampling algorithm, but their scope is much wider. We provide theoretical justification for the two adaptive approaches using the existing theory built for adaptive Markov chain Monte Carlo. We illustrate the performance of the methods using simulations, and we analyze a mixture model for real data using an algorithm that combines the two approaches.
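
Illustration: the sketch below gives a minimal, self-contained picture of the parallel-chain adaptation idea, in which several random-walk Metropolis chains run in parallel and pool their past samples when re-estimating a shared proposal covariance (in the spirit of Haario, Saksman, and Tamminen 2001). The bimodal target, number of chains, scaling constant, and adaptation schedule are illustrative assumptions, not the authors' exact algorithm.

```python
# Sketch of parallel-chain adaptive random-walk Metropolis: chains run in
# parallel and "learn from their neighbors" by pooling past samples to
# estimate a common proposal covariance. Target, chain count, scaling,
# and update schedule are placeholder assumptions for illustration only.
import numpy as np

def log_target(x):
    # Placeholder bimodal target: equal mixture of two 2-D Gaussians.
    d1 = -0.5 * np.sum((x - 3.0) ** 2)
    d2 = -0.5 * np.sum((x + 3.0) ** 2)
    return np.logaddexp(d1, d2)

def parallel_adaptive_rwm(n_iter=5000, n_chains=4, dim=2, seed=0):
    rng = np.random.default_rng(seed)
    # Overdispersed starting points, one per chain.
    states = rng.normal(scale=5.0, size=(n_chains, dim))
    log_p = np.array([log_target(s) for s in states])
    history = [states.copy()]            # pooled past samples of all chains
    cov = np.eye(dim)                    # initial proposal covariance
    scale = 2.38 ** 2 / dim              # standard RWM scaling (assumption)
    samples = []
    for t in range(n_iter):
        for k in range(n_chains):
            prop = rng.multivariate_normal(states[k], scale * cov)
            log_p_prop = log_target(prop)
            # Metropolis accept/reject step for chain k.
            if np.log(rng.uniform()) < log_p_prop - log_p[k]:
                states[k], log_p[k] = prop, log_p_prop
        history.append(states.copy())
        samples.append(states.copy())
        # Adapt: periodically re-estimate the proposal covariance from the
        # pooled samples of *all* chains, not just the current one.
        if t >= 100 and t % 50 == 0:
            pooled = np.concatenate(history, axis=0)
            cov = np.cov(pooled, rowvar=False) + 1e-6 * np.eye(dim)
    return np.concatenate(samples, axis=0)

if __name__ == "__main__":
    draws = parallel_adaptive_rwm()
    print("pooled mean over second half of run:",
          draws[len(draws) // 2:].mean(axis=0))
```

Because the overdispersed chains visit different modes, the pooled covariance estimate reflects both modes, which a single adapting chain stuck in one mode would miss; the regional adaptation proposed in the paper goes further by allowing different proposal kernels in different regions of the state space.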