Accelerating Asymptotically Exact MCMC for Computationally Intensive Models via Local Approximations
Publication Type:
Article
Authors:
Conrad, Patrick R.; Marzouk, Youssef M.; Pillai, Natesh S.; Smith, Aaron
Affiliations:
Massachusetts Institute of Technology (MIT); Harvard University; University of Ottawa
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN/ISBN:
0162-1459
DOI:
10.1080/01621459.2015.1096787
Publication Date:
2016
Pages:
1591-1607
Keywords:
Bayesian inverse problems
Monte Carlo
construction
calibration
regression
reduction
design
Abstract:
We construct a new framework for accelerating Markov chain Monte Carlo in posterior sampling problems where standard methods are limited by the computational cost of the likelihood, or of numerical models embedded therein. Our approach introduces local approximations of these models into the Metropolis-Hastings kernel, borrowing ideas from deterministic approximation theory, optimization, and experimental design. Previous efforts at integrating approximate models into inference typically sacrifice either the sampler's exactness or its efficiency; our work seeks to address these limitations by exploiting useful convergence characteristics of local approximations. We prove the ergodicity of our approximate Markov chain, showing that it samples asymptotically from the exact posterior distribution of interest. We describe variations of the algorithm that employ either local polynomial approximations or local Gaussian process regressors. Our theoretical results reinforce the key observation underlying this article: when the likelihood has some local regularity, the number of model evaluations per Markov chain Monte Carlo (MCMC) step can be greatly reduced without biasing the Monte Carlo average. Numerical experiments demonstrate multiple order-of-magnitude reductions in the number of forward model evaluations used in representative ordinary differential equation (ODE) and partial differential equation (PDE) inference problems, with both synthetic and real data. Supplementary materials for this article are available online.
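To make the abstract's central idea concrete, the Python sketch below runs a random-walk Metropolis sampler whose expensive log-posterior is replaced by a local linear surrogate fit to nearby past evaluations, with occasional true-model runs added to the design set. This is a minimal illustration of the general technique, not the authors' algorithm; every name in it (expensive_log_post, refine_prob, k_neighbors) is hypothetical, and the toy Gaussian target stands in for a costly ODE/PDE likelihood.

    # Illustrative sketch only (not the paper's exact algorithm):
    # Metropolis-Hastings where the costly log-posterior is usually
    # replaced by a local linear surrogate built from nearby past
    # evaluations, refined with occasional true-model runs.
    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_log_post(theta):
        # Stand-in for a costly forward-model likelihood; here a cheap Gaussian.
        return -0.5 * np.sum(theta ** 2)

    # Growing design set of (parameter, log-posterior) pairs.
    design_X, design_y = [], []

    def local_surrogate(theta, k_neighbors=10):
        # Fit a local linear model to the k nearest design points.
        X = np.array(design_X)
        d = np.linalg.norm(X - theta, axis=1)
        idx = np.argsort(d)[:k_neighbors]
        A = np.hstack([np.ones((len(idx), 1)), X[idx] - theta])  # shifted basis
        coef, *_ = np.linalg.lstsq(A, np.array(design_y)[idx], rcond=None)
        return coef[0]  # intercept = surrogate value at theta

    def log_post(theta, refine_prob=0.05, k_neighbors=10):
        # Refine with a true-model run when the design is too sparse,
        # or at random; a crude stand-in for the paper's refinement rules.
        if len(design_X) < k_neighbors or rng.random() < refine_prob:
            y = expensive_log_post(theta)
            design_X.append(theta.copy())
            design_y.append(y)
            return y
        return local_surrogate(theta, k_neighbors)

    # Plain random-walk Metropolis using the approximate log-posterior.
    theta = np.zeros(2)
    chain = []
    for _ in range(5000):
        prop = theta + 0.5 * rng.standard_normal(2)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        chain.append(theta.copy())

    print("posterior mean ~", np.mean(chain, axis=0))

After the design set fills in, most MCMC steps cost only a small least-squares fit rather than a forward-model run. The random refinement here is a loose proxy for the paper's combination of random and error-driven refinement, which is what drives the approximation error to zero and yields the asymptotic exactness the abstract claims.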