General Bayesian updating and the loss-likelihood bootstrap
Publication type:
Article
Authors:
Lyddon, S. P.; Holmes, C. C.; Walker, S. G.
Affiliations:
University of Oxford; University of Texas System; University of Texas Austin
Journal:
BIOMETRIKA
ISSN/ISBN:
0006-3444
DOI:
10.1093/biomet/asz006
Publication date:
2019
Pages:
465-478
Keywords:
information
classification
inference
models
robust
risk
Abstract:
In this paper we revisit the weighted likelihood bootstrap, a method that generates samples from an approximate Bayesian posterior of a parametric model. We show that the same method can be derived, without approximation, under a Bayesian nonparametric model in which the parameter of interest is defined through minimizing an expected negative log-likelihood under an unknown sampling distribution. This interpretation enables us to extend the weighted likelihood bootstrap to posterior sampling for parameters that minimize an expected loss. We call this method the loss-likelihood bootstrap, and we make a connection between it and general Bayesian updating, a way of updating prior belief distributions that does not require the construction of a global probability model but does require the calibration of two forms of loss function. The loss-likelihood bootstrap is used to calibrate the general Bayesian posterior by matching asymptotic Fisher information. We demonstrate the proposed method on a number of examples.
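The core mechanism the abstract describes can be illustrated with a minimal sketch: each posterior draw is obtained by randomly reweighting the data (Dirichlet weights, as in the weighted likelihood bootstrap) and minimizing the weighted empirical loss. The toy loss below is squared error, chosen purely for illustration because its weighted minimizer (the weighted mean) is available in closed form; the function name and data are assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: posterior samples for a location parameter under squared-error loss.
x = rng.normal(loc=2.0, scale=1.0, size=100)

def loss_likelihood_bootstrap(x, n_samples=1000, rng=rng):
    """One posterior draw = one random Dirichlet reweighting of the data,
    followed by minimization of the weighted empirical loss.
    For loss(x_i, theta) = (x_i - theta)^2 the minimizer is the
    weighted mean, so no numerical optimizer is needed here."""
    n = len(x)
    samples = np.empty(n_samples)
    for b in range(n_samples):
        w = rng.dirichlet(np.ones(n))   # random weights summing to one
        samples[b] = np.sum(w * x)      # argmin_theta sum_i w_i (x_i - theta)^2
    return samples

post = loss_likelihood_bootstrap(x)
```

For a general loss one would replace the closed-form step with a numerical minimizer of the weighted loss; the spread of `post` then approximates posterior uncertainty about the loss-minimizing parameter without specifying a parametric likelihood.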
Source URL: