α-VARIATIONAL INFERENCE WITH STATISTICAL GUARANTEES

Publication type:
Article
Authors:
Yang, Yun; Pati, Debdeep; Bhattacharya, Anirban
Affiliations:
University of Illinois System; University of Illinois Urbana-Champaign; Texas A&M University System; Texas A&M University College Station
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/19-AOS1827
Publication date:
2020
Pages:
886-905
Keywords:
variable selection; convergence rates; posterior; regression; framework
Abstract:
We provide statistical guarantees for a family of variational approximations to Bayesian posterior distributions, called α-VB, which has close connections with variational approximations of tempered posteriors in the literature. The standard variational approximation is a special case of α-VB with α = 1. When α ∈ (0, 1], a novel class of variational inequalities is developed that links the Bayes risk under the variational approximation to the objective function of the variational optimization problem, implying that maximizing the evidence lower bound in variational inference has the effect of minimizing the Bayes risk within the variational density family. Operating in a frequentist setup, the variational inequalities imply that point estimates constructed from the α-VB procedure converge at an optimal rate to the true parameter in a wide range of problems. We illustrate our general theory with a number of examples, including the mean-field variational approximation to (low) high-dimensional Bayesian linear regression with spike and slab priors, Gaussian mixture models and latent Dirichlet allocation.
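
For orientation, the α-VB objective referenced in the abstract can be sketched as follows; this is a minimal rendering based on the standard fractional (tempered) posterior formulation and is not taken verbatim from the paper, so the notation is assumed rather than quoted.

```latex
% Sketch of the alpha-VB objective (assumed notation; the paper's exact
% definitions may differ). For a likelihood p_\theta(X^{(n)}), prior \pi,
% variational family \Gamma, and \alpha \in (0, 1], the alpha-fractional
% posterior and the corresponding variational optimization are:
\[
  \pi_{n,\alpha}\big(\theta \mid X^{(n)}\big)
  \;\propto\;
  p_\theta\big(X^{(n)}\big)^{\alpha}\, \pi(\theta),
\]
\[
  \widehat{q}_{\alpha}
  \;=\;
  \arg\max_{q \in \Gamma}
  \Big\{ \alpha \int \log p_\theta\big(X^{(n)}\big)\, q(d\theta)
  \;-\; D_{\mathrm{KL}}\big(q \,\Vert\, \pi\big) \Big\}.
\]
% Setting alpha = 1 recovers the usual evidence lower bound (ELBO), i.e.,
% the standard variational approximation described in the abstract as the
% special case of alpha-VB.
```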