Forward-Backward Gaussian Variational Inference via JKO in the Bures-Wasserstein Space
Abstract
Variational inference (VI) seeks to approximate a target distribution $\pi$ by an element of a tractable family of distributions. Of key interest in statistics and machine learning is Gaussian VI, which approximates $\pi$ by minimizing the Kullback-Leibler (KL) divergence to $\pi$ over the space of Gaussians. In this work, we develop the (Stochastic) Forward-Backward Gaussian Variational Inference (FB-GVI) algorithm to solve Gaussian VI. Our approach exploits the composite structure of the KL divergence, which can be written as the sum of a smooth term (the potential) and a non-smooth term (the entropy) over the Bures-Wasserstein (BW) space of Gaussians endowed with the Wasserstein distance. For our proposed algorithm, we obtain state-of-the-art convergence guarantees when $\pi$ is log-smooth and log-concave, as well as the first convergence guarantees to first-order stationary solutions when $\pi$ is only log-smooth.
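To make the composite splitting concrete, the sketch below implements one (stochastic) forward-backward step over Gaussians $N(m, \Sigma)$: a Bures-Wasserstein gradient step on the potential $V = -\log \pi$ (the forward step), followed by a proximal/JKO step for the entropy (the backward step), which admits a closed form over Gaussians. This is a minimal illustration, not the paper's reference implementation: the names (`fb_gvi_step`, `grad_V`, `hess_V`), the Monte Carlo estimation of $\mathbb{E}[\nabla V]$ and $\mathbb{E}[\nabla^2 V]$, and the particular way the entropy proximal step is written (derived here from the JKO first-order optimality condition) are assumptions and may differ in detail from the algorithm in the paper.

```python
import numpy as np
from scipy.linalg import sqrtm

def fb_gvi_step(m, Sigma, grad_V, hess_V, h, n_samples=100, rng=None):
    """One illustrative (stochastic) forward-backward step for Gaussian VI.

    Forward: Bures-Wasserstein gradient step on the potential E[V],
             using Monte Carlo estimates of E[grad V] and E[hess V].
    Backward: proximal (JKO) step of the entropy, which leaves the mean
              unchanged and updates only the covariance in closed form.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = m.shape[0]

    # Monte Carlo estimates of E_p[grad V] and E_p[hess V] under p = N(m, Sigma).
    X = rng.multivariate_normal(m, Sigma, size=n_samples)
    g = np.mean([grad_V(x) for x in X], axis=0)   # estimate of E[grad V]
    H = np.mean([hess_V(x) for x in X], axis=0)   # estimate of E[hess V]

    # Forward step: push N(m, Sigma) through the affine map
    # x -> x - h * (g + H (x - m)), which keeps the iterate Gaussian.
    m_half = m - h * g
    M = np.eye(d) - h * H
    Sigma_half = M @ Sigma @ M.T

    # Backward step: entropy proximal step in the Bures-Wasserstein geometry.
    # From the JKO optimality condition one obtains the closed form
    #   Sigma_new = h I + S/2 + (S^2/4 + h S)^{1/2},   S = Sigma_half.
    S = Sigma_half
    Sigma_new = h * np.eye(d) + S / 2 + np.real(sqrtm(S @ S / 4 + h * S))
    Sigma_new = (Sigma_new + Sigma_new.T) / 2  # symmetrize for numerical stability

    return m_half, Sigma_new
```

As a rough sanity check, iterating this step on a log-smooth, log-concave target such as $V(x) = \tfrac12 x^\top A x$ (so $\pi = N(0, A^{-1})$) should drive $(m, \Sigma)$ toward $(0, A^{-1})$; replacing the Monte Carlo estimates with exact expectations corresponds to the deterministic variant of the scheme.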
Cite
Text
Diao et al. "Forward-Backward Gaussian Variational Inference via JKO in the Bures-Wasserstein Space." International Conference on Machine Learning, 2023.
Markdown
[Diao et al. "Forward-Backward Gaussian Variational Inference via JKO in the Bures-Wasserstein Space." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/diao2023icml-forwardbackward/)
BibTeX
@inproceedings{diao2023icml-forwardbackward,
title = {{Forward-Backward Gaussian Variational Inference via JKO in the Bures-Wasserstein Space}},
author = {Diao, Michael Ziyang and Balasubramanian, Krishna and Chewi, Sinho and Salim, Adil},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {7960--7991},
volume = {202},
url = {https://mlanthology.org/icml/2023/diao2023icml-forwardbackward/}
}