Asymptotics of Alpha-Divergence Variational Inference Algorithms with Exponential Families
Abstract
Recent works in Variational Inference have examined alternative criteria to the commonly used exclusive Kullback-Leibler divergence. Encouraging empirical results have been obtained with the family of alpha-divergences, but few works have focused on the asymptotic properties of the proposed algorithms, especially as the number of iterations goes to infinity. In this paper, we study a procedure that ensures a monotonic decrease in the alpha-divergence. We provide sufficient conditions to guarantee its convergence to a local minimizer of the alpha-divergence at a geometric rate when the variational family belongs to the class of exponential models. The sample-based version of this ideal procedure involves biased gradient estimators, thus hindering any theoretical study. We propose an alternative unbiased algorithm, for which we prove almost sure convergence to a local minimizer of the alpha-divergence as well as a law of the iterated logarithm. Our results are exemplified with toy and real-data experiments.
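For context, the sketch below recalls one common parameterization of the alpha-divergence and the natural-parameter form of an exponential family, the two objects the abstract refers to. Conventions for alpha-divergences differ across the literature, and the symbols α, θ, T, h, and A are generic notation chosen here for illustration rather than the paper's own.

```latex
% One common parameterization of the alpha-divergence between a target density p
% and a variational density q (other conventions rescale or reparameterize it):
\[
  D_\alpha(p \,\|\, q)
  = \frac{1}{\alpha(1-\alpha)}
    \Bigl( 1 - \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}x \Bigr),
  \qquad \alpha \in \mathbb{R} \setminus \{0, 1\},
\]
% which recovers the exclusive KL divergence KL(q || p) as alpha -> 0 and the
% inclusive KL divergence KL(p || q) as alpha -> 1.

% Natural-parameter form of an exponential family, with natural parameter theta,
% sufficient statistic T, base measure h, and log-partition function A:
\[
  q_\theta(x) = h(x) \exp\bigl( \langle \theta, T(x) \rangle - A(\theta) \bigr).
\]
```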
Cite
Text
Bertholom et al. "Asymptotics of Alpha-Divergence Variational Inference Algorithms with Exponential Families." Neural Information Processing Systems, 2024. doi:10.52202/079017-3370
Markdown
[Bertholom et al. "Asymptotics of Alpha-Divergence Variational Inference Algorithms with Exponential Families." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/bertholom2024neurips-asymptotics/) doi:10.52202/079017-3370
BibTeX
@inproceedings{bertholom2024neurips-asymptotics,
title = {{Asymptotics of Alpha-Divergence Variational Inference Algorithms with Exponential Families}},
author = {Bertholom, François and Douc, Randal and Roueff, François},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-3370},
url = {https://mlanthology.org/neurips/2024/bertholom2024neurips-asymptotics/}
}