INFINITE-DIMENSIONAL GRADIENT-BASED DESCENT FOR ALPHA-DIVERGENCE MINIMISATION
Publication type:
Article
Authors:
Daudel, Kamelia; Douc, Randal; Portier, Francois
Affiliations:
IMT - Institut Mines-Telecom; Institut Polytechnique de Paris; Telecom Paris; Telecom SudParis
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/20-AOS2035
Publication date:
2021
Pages:
2250-2270
Keywords:
convergence; beta
Abstract:
This paper introduces the (alpha, Gamma)-descent, an iterative algorithm that operates on measures and performs alpha-divergence minimisation in a Bayesian framework. This gradient-based procedure extends the commonly used variational approximation by adding a prior on the variational parameters in the form of a measure. We prove that for a rich family of functions Gamma, this algorithm leads at each step to a systematic decrease in the alpha-divergence, and we derive convergence results. Our framework recovers the Entropic Mirror Descent algorithm and provides an alternative algorithm that we call the Power Descent. Moreover, in its stochastic formulation, the (alpha, Gamma)-descent allows one to optimise the mixture weights of any given mixture model without any information on the underlying distribution of the variational parameters. This renders our method compatible with many choices of parameter updates and applicable to a wide range of Machine Learning tasks. We demonstrate empirically, on both toy and real-world examples, the benefit of using the Power Descent and of going beyond the Entropic Mirror Descent framework, which fails as the dimension grows.
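For illustration, below is a minimal Python sketch of a Power-Descent-style update of mixture weights, in the spirit of the stochastic formulation described in the abstract. The abstract gives no explicit formulas, so the specifics here are assumptions: the derivative f_alpha'(u) = u^(alpha-1)/(alpha-1) of the alpha-divergence generator, the Power Descent mapping Gamma(v) = ((alpha-1)v + 1)^(eta/(1-alpha)), the importance-sampling estimator of the per-component quantities b_j, and all function and parameter names (power_descent_weights, eta, kappa, and so on) are hypothetical and are not the authors' code.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def power_descent_weights(log_p, comp_logpdf, comp_sample, lam,
                          alpha=0.5, eta=0.1, kappa=0.0,
                          n_samples=1000, n_iter=50, seed=None):
    """Sketch: iteratively reweight a fixed-component mixture
    sum_j lam_j k(theta_j, .) so as to reduce its alpha-divergence to an
    (unnormalised) target density p. Estimator details are illustrative.

    log_p       -- vectorised log-density of the target, y -> log p(y)
    comp_logpdf -- list of callables, y -> log k(theta_j, y)
    comp_sample -- list of callables, m -> m i.i.d. samples from k(theta_j, .)
    lam         -- initial mixture weights, summing to one
    """
    rng = np.random.default_rng(seed)
    lam = np.asarray(lam, dtype=float)
    J = len(lam)
    for _ in range(n_iter):
        # Draw y_1..y_M from the current mixture (mu_n k).
        counts = rng.multinomial(n_samples, lam)
        ys = np.concatenate([comp_sample[j](int(c))
                             for j, c in enumerate(counts) if c > 0])
        log_k = np.stack([comp_logpdf[j](ys) for j in range(J)])  # (J, M)
        log_mix = logsumexp(log_k, axis=0, b=lam[:, None])        # log (mu_n k)(y)
        log_ratio = log_mix - log_p(ys)                           # log of (mu_n k)/p
        # Assumed form: f_alpha'(u) = u^(alpha-1)/(alpha-1), alpha not in {0, 1}.
        f_prime = np.exp((alpha - 1.0) * log_ratio) / (alpha - 1.0)
        # b_j ~= E_{y ~ k(theta_j, .)}[ f_alpha'((mu_n k)(y) / p(y)) ],
        # estimated by importance-reweighting the mixture samples.
        w = np.exp(log_k - log_mix)                               # k_j(y)/(mu_n k)(y)
        b_est = (w * f_prime).mean(axis=1)
        # Assumed Power Descent mapping: Gamma(v) = ((alpha-1) v + 1)^(eta/(1-alpha)).
        # For alpha < 1 and kappa = 0 the base stays positive since f_alpha' < 0.
        gamma = ((alpha - 1.0) * (b_est + kappa) + 1.0) ** (eta / (1.0 - alpha))
        lam = lam * gamma
        lam /= lam.sum()
    return lam

# Illustrative use (hypothetical setup): reweight two fixed Gaussian
# components towards a standard Gaussian target.
comps = [norm(-2.0, 1.0), norm(2.0, 1.0)]
lam = power_descent_weights(log_p=norm(0.0, 1.0).logpdf,
                            comp_logpdf=[c.logpdf for c in comps],
                            comp_sample=[c.rvs for c in comps],
                            lam=[0.5, 0.5])
print(lam)
```

Replacing the `gamma` line with `np.exp(-eta * (b_est + kappa))` would instead give an Entropic-Mirror-Descent-style update, the other member of the Gamma family mentioned in the abstract; the update operates only through samples and density evaluations, which reflects the abstract's claim that no information on the underlying distribution of the variational parameters is needed.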