Sampling-Free Variational Inference of Bayesian Neural Networks by Variance Backpropagation

Abstract

We propose a new Bayesian Neural Net formulation that affords variational inference for which the evidence lower bound is analytically tractable subject to a tight approximation. We achieve this tractability by (i) decomposing ReLU nonlinearities into the product of an identity and a Heaviside step function, and (ii) introducing a separate path that decouples the neural net expectation from its variance. We demonstrate formally that introducing separate latent binary variables for the activations allows the neural network likelihood to be represented as a chain of linear operations. Performing variational inference on this construction enables a sampling-free computation of the evidence lower bound, which is a more effective approximation than the widely applied Monte Carlo sampling and CLT-related techniques. We evaluate the model on a range of regression and classification tasks against BNN inference alternatives, showing competitive or improved performance over the current state of the art.
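
To make the decomposition described in the abstract concrete, the sketch below propagates means and variances through one Bayesian linear layer followed by the decomposed ReLU. It is a minimal NumPy/SciPy illustration, not the authors' implementation: the function names (`linear_moments`, `relu_moments`), the factorized-Gaussian weight posterior, and the approximation that treats the Bernoulli gate as independent of its pre-activation are assumptions made here for illustration.

```python
import numpy as np
from scipy.stats import norm


def linear_moments(m_h, v_h, W_mu, W_var, b_mu, b_var):
    # Bayesian linear layer a = W h + b with factorized Gaussian weights,
    # assumed independent of the (mutually independent) input units.
    # Under these assumptions the output moments are exact:
    #   E[a_i]   = sum_j mu_ij m_j + mu_b_i
    #   Var[a_i] = sum_j sigma_ij^2 (m_j^2 + v_j) + mu_ij^2 v_j + sigma_b_i^2
    m_a = W_mu @ m_h + b_mu
    v_a = W_var @ (m_h ** 2 + v_h) + (W_mu ** 2) @ v_h + b_var
    return m_a, v_a


def relu_moments(m_a, v_a):
    # ReLU(a) = a * step(a); the Heaviside factor is replaced by a Bernoulli
    # gate z with p = P(a > 0), computed under a Gaussian assumption on the
    # pre-activation. Treating z as independent of a (a decoupling
    # approximation assumed here) gives closed-form moments:
    #   E[z a]   = p E[a]
    #   Var[z a] = p (E[a]^2 + Var[a]) - p^2 E[a]^2   (since z^2 = z)
    p = norm.cdf(m_a / np.sqrt(v_a + 1e-12))
    m_out = p * m_a
    v_out = p * (m_a ** 2 + v_a) - m_out ** 2
    return m_out, v_out


# Toy usage: one hidden layer; all posterior moments are made-up numbers.
rng = np.random.default_rng(0)
m_h, v_h = rng.normal(size=3), np.full(3, 0.1)
W_mu, W_var = rng.normal(size=(4, 3)), np.full((4, 3), 0.05)
b_mu, b_var = np.zeros(4), np.full(4, 0.05)
m, v = relu_moments(*linear_moments(m_h, v_h, W_mu, W_var, b_mu, b_var))
print(m, v)
```

With the output mean and variance in hand, a Gaussian expected log-likelihood term of the ELBO follows in closed form, e.g. E[log N(y | f(x), σ²)] = −((y − E[f])² + Var[f]) / (2σ²) − ½ log(2πσ²), which is what removes the need for Monte Carlo sampling.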

Cite

Text

Haußmann et al. "Sampling-Free Variational Inference of Bayesian Neural Networks by Variance Backpropagation." Uncertainty in Artificial Intelligence, 2019.

Markdown

[Haußmann et al. "Sampling-Free Variational Inference of Bayesian Neural Networks by Variance Backpropagation." Uncertainty in Artificial Intelligence, 2019.](https://mlanthology.org/uai/2019/haumann2019uai-samplingfree/)

BibTeX

@inproceedings{haumann2019uai-samplingfree,
  title     = {{Sampling-Free Variational Inference of Bayesian Neural Networks by Variance Backpropagation}},
  author    = {Haußmann, Manuel and Hamprecht, Fred A. and Kandemir, Melih},
  booktitle = {Uncertainty in Artificial Intelligence},
  year      = {2019},
  pages     = {563--573},
  volume    = {115},
  url       = {https://mlanthology.org/uai/2019/haumann2019uai-samplingfree/}
}