Variational Russian Roulette for Deep Bayesian Nonparametrics
Abstract
Bayesian nonparametric models provide a principled way to automatically adapt the complexity of a model to the amount of data available, but computation in such models is difficult. Amortized variational approximations are appealing because of their computational efficiency, but current methods rely on a fixed finite truncation of the infinite model. This truncation level can be difficult to set, and it interacts poorly with amortized methods due to the over-pruning problem. Instead, we propose a new variational approximation based on a method from statistical physics called Russian roulette sampling. This allows the variational distribution to adapt its complexity during inference, without relying on a fixed truncation level, while still obtaining an unbiased estimate of the gradient of the original variational objective. We demonstrate this method on infinite-sized variational auto-encoders using a Beta-Bernoulli (Indian buffet process) prior.
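The core device named in the abstract, Russian roulette sampling, turns an infinite series into a finite amount of work while staying unbiased: truncate the series at a random point and reweight each surviving term by the inverse of its survival probability. Below is a minimal, generic sketch of that estimator for an arbitrary series, not the paper's actual gradient estimator; the geometric continuation probability `q` and the example series are illustrative assumptions.

```python
import random


def russian_roulette_sum(term, q=0.7, rng=random):
    """Unbiased estimate of sum_{k=0}^inf term(k) via randomized truncation.

    After evaluating each term, continue with probability q and stop
    otherwise. Weighting term k by 1 / q**k (the probability of having
    survived to index k) keeps the estimator unbiased in expectation,
    even though any single run only evaluates finitely many terms.
    """
    total, survival, k = 0.0, 1.0, 0
    while True:
        total += term(k) / survival
        if rng.random() >= q:  # stop with probability 1 - q
            return total
        survival *= q
        k += 1


# Sanity check: E[estimate] should match sum_{k>=0} 0.5**k = 2.
draws = [russian_roulette_sum(lambda k: 0.5 ** k) for _ in range(100_000)]
print(sum(draws) / len(draws))  # ~2.0
```

The choice of `q` trades compute for variance: a larger `q` evaluates more terms per draw but downweights them less aggressively, so the estimator's variance shrinks at the cost of longer (random) truncation lengths.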
Cite
Text
Xu et al. "Variational Russian Roulette for Deep Bayesian Nonparametrics." International Conference on Machine Learning, 2019.
Markdown
[Xu et al. "Variational Russian Roulette for Deep Bayesian Nonparametrics." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/xu2019icml-variational/)
BibTeX
@inproceedings{xu2019icml-variational,
title = {{Variational Russian Roulette for Deep Bayesian Nonparametrics}},
author = {Xu, Kai and Srivastava, Akash and Sutton, Charles},
booktitle = {International Conference on Machine Learning},
year = {2019},
pages = {6963--6972},
volume = {97},
url = {https://mlanthology.org/icml/2019/xu2019icml-variational/}
}