Asymptotically minimax Bayes predictive densities

Publication type:
Article
Authors:
Aslan, Mihaela
Affiliations:
US Department of Veterans Affairs; Veterans Health Administration (VHA); VA Connecticut Healthcare System; Yale University
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/009053606000000885
Publication date:
2006
Pages:
2921-2938
Keywords:
DISTRIBUTIONS
Abstract:
Given a random sample from a distribution whose density function depends on an unknown parameter $\theta$, we are interested in accurately estimating the true parametric density function at a future observation from the same distribution. The asymptotic risk of Bayes predictive density estimates with the Kullback-Leibler loss function $D(f_\theta \,\|\, \hat f) = \int f_\theta \log\bigl(f_\theta / \hat f\bigr)$ is used to examine various ways of choosing prior distributions; the principal type of choice studied is minimax. We seek asymptotically least favorable predictive densities for which the corresponding asymptotic risk is minimax. A result resembling Stein's paradox for estimating normal means by maximum likelihood holds for the uniform prior in the multivariate location family case: when the dimensionality of the model is at least three, the Jeffreys prior is minimax, though inadmissible. The Jeffreys prior is both admissible and minimax for one- and two-dimensional location problems.
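As a concrete illustration of the Kullback-Leibler risk described in the abstract (a minimal sketch, not taken from the paper), the following Python code Monte Carlo-estimates the risk of the uniform-prior (Jeffreys) Bayes predictive density in a d-dimensional normal location model with known unit variance, where the predictive density for a future observation is N(Xbar, (1 + 1/n) I_d) and the exact risk is (d/2) log(1 + 1/n). The specific model, the function name kl_risk_uniform_prior, and the parameter values are illustrative assumptions, not the paper's notation.

import numpy as np

def kl_risk_uniform_prior(d=3, n=10, n_rep=200000, seed=0):
    """Monte Carlo estimate of E[D(f_theta || f_hat)] for the uniform-prior
    (Jeffreys) Bayes predictive density in a d-dimensional N(theta, I_d)
    location model with n observations and one future observation."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(d)  # the risk does not depend on theta in a location family
    # Sampling distribution of the sample mean: Xbar ~ N(theta, I_d / n)
    xbar = theta + rng.standard_normal((n_rep, d)) / np.sqrt(n)
    s2 = 1.0 + 1.0 / n  # per-coordinate variance of the predictive density N(Xbar, s2 I_d)
    # Closed-form KL divergence between N(theta, I_d) and N(xbar, s2 I_d)
    quad = np.sum((xbar - theta) ** 2, axis=1)
    kl = 0.5 * (d / s2 + quad / s2 - d + d * np.log(s2))
    return kl.mean()

if __name__ == "__main__":
    d, n = 3, 10
    print("Monte Carlo risk:", kl_risk_uniform_prior(d, n))
    print("Exact risk      :", 0.5 * d * np.log(1.0 + 1.0 / n))

This rule has constant risk in theta; the abstract's parallel with Stein's paradox is that for d at least three it is nevertheless inadmissible, i.e., some other predictive density has asymptotic risk no larger for every theta and strictly smaller for some.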