Semi-Implicit Variational Inference

Abstract

Semi-implicit variational inference (SIVI) is introduced to expand the commonly used analytic variational distribution family by mixing the variational parameter with a flexible distribution. This mixing distribution can have any density function, explicit or not, as long as independent random samples can be generated via reparameterization. Not only does SIVI expand the variational family to incorporate highly flexible variational distributions, including implicit ones with no analytic density function, but it also sandwiches the evidence lower bound (ELBO) between a lower bound and an upper bound, and further yields an asymptotically exact surrogate ELBO that is amenable to optimization via stochastic gradient ascent. With a substantially expanded variational family and a novel optimization algorithm, SIVI closely matches the accuracy of MCMC in inferring the posterior in a variety of Bayesian inference tasks.
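The semi-implicit construction in the abstract can be sketched in a few lines: draw a mixing variable psi from an implicit distribution (here, for illustration, a nonlinear push-forward of Gaussian noise), draw z from an explicit conditional layer q(z | psi), and estimate the surrogate lower bound by replacing the intractable marginal q(z) with a mixture over K + 1 samples of psi. This is a minimal numpy sketch, not the authors' code; the sinh push-forward, the value of sigma, and the toy log p target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5  # std of the explicit conditional layer q(z | psi); illustrative choice

def sample_psi(shape):
    # Implicit mixing distribution: a nonlinear push-forward of Gaussian
    # noise. Its density has no analytic form, but sampling is trivial.
    return np.sinh(rng.standard_normal(shape))

def log_q_cond(z, psi):
    # Explicit conditional layer: log N(z; psi, sigma^2).
    return -0.5 * np.log(2 * np.pi * sigma**2) - (z - psi) ** 2 / (2 * sigma**2)

def log_p(z):
    # Toy stand-in for the joint log p(x, z): an N(0, 2^2) target.
    return -0.5 * np.log(2 * np.pi * 4.0) - z**2 / 8.0

def surrogate_elbo(K, n=5000):
    # Monte Carlo estimate of the SIVI lower bound: the marginal density
    # q(z) is replaced by a mixture over K + 1 psi samples, including the
    # sample psi0 that generated z. The bound tightens as K grows.
    psi0 = sample_psi((n,))
    z = psi0 + sigma * rng.standard_normal(n)          # reparameterized draw
    all_psi = np.concatenate([psi0[:, None], sample_psi((n, K))], axis=1)
    log_qs = log_q_cond(z[:, None], all_psi)           # shape (n, K + 1)
    m = log_qs.max(axis=1, keepdims=True)              # stable log-mean-exp
    log_mix = (m + np.log(np.mean(np.exp(log_qs - m), axis=1, keepdims=True))).squeeze(1)
    return float(np.mean(log_p(z) - log_mix))
```

Calling `surrogate_elbo(K)` for increasing K (e.g. 0, 1, 10, 100) illustrates the asymptotic-exactness property: the estimated bound is non-decreasing in K in expectation.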

Cite

Text

Yin and Zhou. "Semi-Implicit Variational Inference." International Conference on Machine Learning, 2018.

Markdown

[Yin and Zhou. "Semi-Implicit Variational Inference." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/yin2018icml-semiimplicit/)

BibTeX

@inproceedings{yin2018icml-semiimplicit,
  title     = {{Semi-Implicit Variational Inference}},
  author    = {Yin, Mingzhang and Zhou, Mingyuan},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {5660--5669},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/yin2018icml-semiimplicit/}
}