From robust tests to Bayes-like posterior distributions
Publication type:
Article
Author(s):
Baraud, Yannick
Affiliation(s):
University of Luxembourg
Journal:
Probability Theory and Related Fields
ISSN/ISBN:
0178-8051
DOI:
10.1007/s00440-023-01222-8
Publication date:
2024
Pages:
159-234
Keywords:
model selection
convergence
estimators
bounds
rates
Abstract:
In the Bayes paradigm, given a loss function and an n-sample, we present the construction of a new type of posterior distribution that extends the classical Bayes one. The loss functions we have in mind are either those derived from the total variation and Hellinger distances or some L_j losses for j > 1. We prove that, with probability close to one, this new posterior distribution concentrates its mass in a neighbourhood (for the chosen loss function) of the law of the data, provided that this law belongs to the support of the prior or, at least, lies close enough to it. We thereby establish that the new posterior distribution enjoys robustness properties with respect to a possible misspecification of the prior or, more precisely, of its support. We also show that the posterior distribution is stable with respect to departures from the equidistribution assumption we started from. Moreover, when the model is regular and well specified and the squared Hellinger loss is used, we show that our credible regions possess, at least for n sufficiently large, the same ellipsoidal shapes and approximately the same sizes as those we would derive from the classical Bayesian posterior distribution via the Bernstein-von Mises theorem. We then use our Bayes-like approach to solve the following problems. We first consider the estimation of a location parameter, or of both the location and scale parameters, of a density in a nonparametric framework. We then tackle the problem of estimating a density, with the squared Hellinger loss, in a high-dimensional parametric model under sparsity conditions on the parameter. Importantly, the results established in this paper are nonasymptotic and provide, as much as possible, bounds with explicit constants.
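Note: for reference, the two distance-based losses named in the abstract are standard; the usual definitions, for probability measures P and Q with densities p and q with respect to a common dominating measure mu, are recalled below (normalization conventions may differ slightly from those adopted in the paper).
% Total variation distance and squared Hellinger distance between P and Q,
% with densities p and q with respect to a common dominating measure \mu.
\[
  \|P - Q\|_{\mathrm{TV}} \;=\; \frac{1}{2}\int |p - q|\, d\mu,
  \qquad
  h^{2}(P,Q) \;=\; \frac{1}{2}\int \bigl(\sqrt{p} - \sqrt{q}\bigr)^{2}\, d\mu .
\]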