Approximate Bayesian Inference with Stein Functional Variational Gradient Descent
Abstract
We propose a general-purpose variational algorithm that forms a natural analogue of Stein variational gradient descent (SVGD) in function space. While SVGD successively updates a set of particles to match a target density, the method introduced here, Stein functional variational gradient descent (SFVGD), updates a set of particle functions to match a target stochastic process (SP). The update step is found by minimizing the functional derivative of the Kullback-Leibler divergence between SPs. SFVGD can be used either to train Bayesian neural networks (BNNs) or for ensemble gradient boosting. We show the efficacy of training BNNs with SFVGD on various real-world datasets.
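To make the analogy concrete, below is a minimal NumPy sketch of the standard finite-dimensional SVGD particle update (Liu & Wang, 2016) that SFVGD lifts to function space. The RBF kernel, the bandwidth `h`, and the toy Gaussian target in the usage lines are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j."""
    diffs = X[:, None, :] - X[None, :, :]               # (n, n, d)
    K = np.exp(-np.sum(diffs ** 2, axis=-1) / (2 * h ** 2))
    grad_K = -diffs / h ** 2 * K[:, :, None]            # grad_K[j, i] = d k(x_j, x_i) / d x_j
    return K, grad_K

def svgd_step(X, grad_log_p, step_size=0.1, h=1.0):
    """One SVGD update:
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ].
    """
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # Kernel-weighted scores drive particles toward high density;
    # the summed kernel gradients act as a repulsive term that keeps them spread out.
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step_size * phi

# Usage: particles drift toward a standard Gaussian target (score = -x).
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=0.5, size=(50, 2))        # start far from the target
for _ in range(500):
    X = svgd_step(X, grad_log_p=lambda X: -X)
print(X.mean(axis=0), X.std(axis=0))                    # roughly (0, 0) and (1, 1)
```

Per the abstract, SFVGD replaces these finite-dimensional particles with particle functions and the target density with a stochastic process; the sketch only shows the density-space baseline.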
Cite
Text
Pielok et al. "Approximate Bayesian Inference with Stein Functional Variational Gradient Descent." International Conference on Learning Representations, 2023.
Markdown
[Pielok et al. "Approximate Bayesian Inference with Stein Functional Variational Gradient Descent." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/pielok2023iclr-approximate/)
BibTeX
@inproceedings{pielok2023iclr-approximate,
title = {{Approximate Bayesian Inference with Stein Functional Variational Gradient Descent}},
author = {Pielok, Tobias and Bischl, Bernd and Rügamer, David},
booktitle = {International Conference on Learning Representations},
year = {2023},
url = {https://mlanthology.org/iclr/2023/pielok2023iclr-approximate/}
}