Spike-and-Slab Probabilistic Backpropagation: When Smarter Approximations Make No Difference
Abstract
Probabilistic backpropagation is an approximate Bayesian inference method for deep neural networks, using a message-passing framework. These messages---which correspond to distributions arising as we propagate our input through a probabilistic neural network---are approximated as Gaussian. However, in practice, the exact distributions may be highly non-Gaussian. In this paper, we propose a more realistic approximation based on a spike-and-slab distribution. Unfortunately, in this case, better approximation of the messages does not translate to better downstream performance. We present results comparing the two schemes and discuss why we do not see a benefit from this spike-and-slab approach.
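To see why the exact messages are non-Gaussian, consider a single ReLU unit with a Gaussian pre-activation: the output places a point mass (spike) at zero plus a truncated-Gaussian continuous part (slab). The sketch below, which is an illustration of this general phenomenon and not the paper's implementation, computes the spike weight and the exact first two moments that a Gaussian moment-matching scheme would use, and checks them by Monte Carlo.

```python
import math
import random

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def relu_moments(mu, sigma):
    """Exact mean and variance of a = ReLU(z) for z ~ N(mu, sigma^2).

    The exact law of a is spike-and-slab: a point mass at 0 with
    weight Phi(-mu/sigma), plus a Gaussian truncated to (0, inf).
    A Gaussian approximation keeps only these two moments.
    """
    alpha = mu / sigma
    mean = mu * Phi(alpha) + sigma * phi(alpha)
    second = (mu**2 + sigma**2) * Phi(alpha) + mu * sigma * phi(alpha)
    return mean, second - mean**2

# Example: a unit whose pre-activation is mostly negative,
# so most of the probability mass collapses onto the spike at zero.
mu, sigma = -0.5, 1.0
spike = Phi(-mu / sigma)  # weight of the point mass at zero

# Monte Carlo check of the closed-form moments.
random.seed(0)
samples = [max(0.0, random.gauss(mu, sigma)) for _ in range(200_000)]
mc_mean = sum(samples) / len(samples)
mean, var = relu_moments(mu, sigma)
```

With these parameters roughly 69% of the mass sits in the spike, so a single Gaussian is a visibly poor fit to the exact message even though its mean and variance are matched exactly.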
Cite

Text:

Ott and Williamson. "Spike-and-Slab Probabilistic Backpropagation: When Smarter Approximations Make No Difference." NeurIPS 2022 Workshops: ICBINB, 2022.

Markdown:

[Ott and Williamson. "Spike-and-Slab Probabilistic Backpropagation: When Smarter Approximations Make No Difference." NeurIPS 2022 Workshops: ICBINB, 2022.](https://mlanthology.org/neuripsw/2022/ott2022neuripsw-spikeandslab/)

BibTeX:
@inproceedings{ott2022neuripsw-spikeandslab,
  title = {{Spike-and-Slab Probabilistic Backpropagation: When Smarter Approximations Make No Difference}},
  author = {Ott, Evan and Williamson, Sinead},
  booktitle = {NeurIPS 2022 Workshops: ICBINB},
  year = {2022},
  url = {https://mlanthology.org/neuripsw/2022/ott2022neuripsw-spikeandslab/}
}