GumBolt: Extending Gumbel Trick to Boltzmann Priors

Abstract

Boltzmann machines (BMs) are appealing candidates for powerful priors in variational autoencoders (VAEs), as they are capable of capturing nontrivial and multi-modal distributions over discrete variables. However, the non-differentiability of the discrete units prohibits use of the reparameterization trick, which is essential for low-noise backpropagation. The Gumbel trick resolves this problem in a consistent way by relaxing the variables and distributions, but it is incompatible with BM priors. Here, we propose GumBolt, a model that extends the Gumbel trick to BM priors in VAEs. GumBolt is significantly simpler than recently proposed methods with BM priors and outperforms them by a considerable margin. It achieves state-of-the-art performance on the permutation-invariant MNIST and OMNIGLOT datasets among models with only discrete latent variables. Moreover, performance can be further improved by allowing multi-sample (importance-weighted) estimation of the log-likelihood during training, which was not possible with previous models.
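The Gumbel trick mentioned in the abstract replaces a non-differentiable categorical sample with a temperature-controlled softmax over noisy logits, so gradients can flow through the sampling step. Below is a minimal NumPy sketch of that relaxation (the Gumbel-softmax); it is an illustration of the general trick, not the paper's GumBolt implementation, and the function name and temperature value are our own choices.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Relax a categorical sample into a differentiable simplex point.

    Adds Gumbel(0, 1) noise to the logits and applies a softmax scaled by
    the temperature tau; as tau -> 0 the output approaches a one-hot sample.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via the inverse CDF: -log(-log(U)), U ~ Uniform(0, 1)
    u = rng.uniform(1e-12, 1.0, size=np.shape(logits))
    g = -np.log(-np.log(u))
    y = (np.asarray(logits, dtype=float) + g) / tau
    y = y - y.max()            # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

# Example: a relaxed "sample" from a 3-way categorical distribution
logits = np.log([0.1, 0.3, 0.6])
sample = gumbel_softmax(logits, tau=0.5, rng=np.random.default_rng(0))
# sample lies on the probability simplex: nonnegative entries summing to 1
```

At low temperatures the output concentrates near a vertex of the simplex, recovering a nearly discrete sample while remaining differentiable with respect to the logits.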

Cite

Text

Khoshaman and Amin. "GumBolt: Extending Gumbel Trick to Boltzmann Priors." Neural Information Processing Systems, 2018.

Markdown

[Khoshaman and Amin. "GumBolt: Extending Gumbel Trick to Boltzmann Priors." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/khoshaman2018neurips-gumbolt/)

BibTeX

@inproceedings{khoshaman2018neurips-gumbolt,
  title     = {{GumBolt: Extending Gumbel Trick to Boltzmann Priors}},
  author    = {Khoshaman, Amir H and Amin, Mohammad},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {4061--4070},
  url       = {https://mlanthology.org/neurips/2018/khoshaman2018neurips-gumbolt/}
}