Improved minimax predictive densities under Kullback-Leibler loss

Publication type:
Article
Authors:
George, Edward I.; Liang, Feng; Xu, Xinyi
Affiliations:
University of Pennsylvania; Duke University; University System of Ohio; Ohio State University
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/009053606000000155
Publication year:
2006
Pages:
78-91
Keywords:
estimators fit
摘要:
Let $X \mid \mu \sim N_p(\mu, v_x I)$ and $Y \mid \mu \sim N_p(\mu, v_y I)$ be independent $p$-dimensional multivariate normal vectors with common unknown mean $\mu$. Based on only observing $X = x$, we consider the problem of obtaining a predictive density $\hat{p}(y \mid x)$ for $Y$ that is close to $p(y \mid \mu)$ as measured by expected Kullback-Leibler loss. A natural procedure for this problem is the (formal) Bayes predictive density $\hat{p}_U(y \mid x)$ under the uniform prior $\pi_U(\mu) \equiv 1$, which is best invariant and minimax. We show that any Bayes predictive density will be minimax if it is obtained by a prior yielding a marginal that is superharmonic or whose square root is superharmonic. This yields wide classes of minimax procedures that dominate $\hat{p}_U(y \mid x)$, including Bayes predictive densities under superharmonic priors. Fundamental similarities and differences with the parallel theory of estimating a multivariate normal mean under quadratic loss are described.
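As a concrete illustration (not part of the article itself): under the uniform prior $\pi_U(\mu) \equiv 1$, the posterior is $\mu \mid x \sim N_p(x, v_x I)$, so the predictive distribution of $Y$ has the closed form $N_p(x, (v_x + v_y) I)$. The sketch below evaluates this formal Bayes predictive density with NumPy; all function and variable names are chosen here for illustration.

```python
import numpy as np

def normal_pdf(z, mean, var):
    """Density of N_p(mean, var * I) evaluated at z (broadcasts over rows)."""
    p = z.shape[-1]
    sq = ((z - mean) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * var)) / (2.0 * np.pi * var) ** (p / 2.0)

def predictive_density_uniform(y, x, vx, vy):
    """Formal Bayes predictive density p_hat_U(y | x) under pi_U(mu) = 1.

    With posterior mu | x ~ N_p(x, vx * I) and Y | mu ~ N_p(mu, vy * I),
    integrating out mu gives Y | x ~ N_p(x, (vx + vy) * I).
    """
    return normal_pdf(y, x, vx + vy)
```

A quick sanity check of the closed form is to average $N_p(y; \mu, v_y I)$ over posterior draws $\mu \sim N_p(x, v_x I)$ by Monte Carlo; the average should approach `predictive_density_uniform(y, x, vx, vy)`.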