Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

Abstract

We propose a general purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization. Our method iteratively transports a set of particles to match the target distribution by applying a form of functional gradient descent that minimizes the KL divergence. Empirical studies are performed on various real-world models and datasets, on which our method is competitive with existing state-of-the-art methods. The derivation of our method is based on a new theoretical result that connects the derivative of KL divergence under smooth transforms with Stein’s identity and a recently proposed kernelized Stein discrepancy, which is of independent interest.
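For concreteness, the particle update the abstract describes fits in a few lines. Below is a minimal NumPy sketch of one SVGD iteration, using the paper's RBF kernel with the median-heuristic bandwidth; the function names, step size, and the toy Gaussian target are illustrative choices, not part of the paper.

import numpy as np

def rbf_kernel(X, h=-1.0):
    # Pairwise squared distances between the n particles (rows of X).
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(X**2, axis=1)[None, :] - 2.0 * X @ X.T)
    if h < 0:  # median-heuristic bandwidth suggested in the paper
        h = np.sqrt(0.5 * np.median(sq_dists) / np.log(X.shape[0] + 1))
    K = np.exp(-sq_dists / (2 * h**2))
    # Closed-form sum_j grad_{x_j} k(x_j, x_i) for the RBF kernel.
    grad_K = (K.sum(axis=1)[:, None] * X - K @ X) / h**2
    return K, grad_K

def svgd_step(X, dlogp, stepsize=1e-2):
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) dlogp(x_j) + grad_{x_j} k(x_j, x_i) ]
    K, grad_K = rbf_kernel(X)
    phi = (K @ dlogp(X) + grad_K) / X.shape[0]
    return X + stepsize * phi

# Toy usage (hypothetical): transport particles toward N(2, 1).
rng = np.random.default_rng(0)
X = rng.normal(-5.0, 1.0, size=(100, 1))   # initial particles
dlogp = lambda X: -(X - 2.0)               # score function of N(2, 1)
for _ in range(1000):
    X = svgd_step(X, dlogp, stepsize=0.05)
print(X.mean(), X.std())                   # should approach 2.0 and 1.0

The first term of phi drives the particles toward high-probability regions of the target, while the kernel-gradient term acts as a repulsive force that keeps them spread out; this repulsion is what distinguishes the update from running n independent gradient ascents.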

Cite

Text

Liu and Wang. "Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm." Neural Information Processing Systems, 2016.

Markdown

[Liu and Wang. "Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/liu2016neurips-stein/)

BibTeX

@inproceedings{liu2016neurips-stein,
  title     = {{Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm}},
  author    = {Liu, Qiang and Wang, Dilin},
  booktitle = {Neural Information Processing Systems},
  year      = {2016},
  pages     = {2378--2386},
  url       = {https://mlanthology.org/neurips/2016/liu2016neurips-stein/}
}