Batch Norm with Entropic Regularization Turns Deterministic Autoencoders into Generative Models

Abstract

The variational autoencoder is a well-known deep generative model that uses an encoder-decoder framework: an encoding neural network outputs a non-deterministic code from which an input is reconstructed. Rather than producing a single deterministic code per input, the encoder samples the code from a distribution for every input. The great advantage of this process is that the network can then be used as a generative model, sampling from the data distribution beyond the samples provided for training. We show in this work that batch normalization as a source of non-determinism suffices to turn deterministic autoencoders into generative models on par with variational ones, so long as we add a suitable entropic regularization to the training objective.
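
A minimal sketch of the idea in PyTorch (the framework is an assumption; the paper does not prescribe one): the latent code of an otherwise deterministic autoencoder passes through BatchNorm, and training adds an entropy bonus on the codes to the reconstruction loss. All names, dimensions, and the weight lam below are illustrative, and code_entropy is a crude nearest-neighbour proxy rather than the paper's exact estimator.

import torch
import torch.nn as nn

class BNAutoencoder(nn.Module):
    # Deterministic autoencoder whose latent code is batch-normalized;
    # at train time the batch statistics make the code non-deterministic.
    def __init__(self, x_dim=784, z_dim=32, h_dim=256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(x_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, z_dim),
            nn.BatchNorm1d(z_dim),  # source of non-determinism during training
        )
        self.decoder = nn.Sequential(
            nn.Linear(z_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, x_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def code_entropy(z, eps=1e-12):
    # Crude nearest-neighbour entropy proxy: mean log distance to the
    # nearest other code in the batch, scaled by the code dimension.
    d = torch.cdist(z, z) + 1e9 * torch.eye(z.shape[0], device=z.device)
    nn_dist = d.min(dim=1).values
    return z.shape[1] * torch.log(nn_dist + eps).mean()

model = BNAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 0.05  # illustrative regularization weight

x = torch.rand(128, 784)  # stand-in minibatch
x_hat, z = model(x)
loss = nn.functional.mse_loss(x_hat, x) - lam * code_entropy(z)
opt.zero_grad()
loss.backward()
opt.step()

# Generation: decode samples drawn from the standard normal.
samples = model.decoder(torch.randn(64, 32))

The intuition this sketch follows: BatchNorm fixes the per-dimension mean and variance of the codes, and among all distributions with those moments the standard normal has maximum entropy, so the entropy bonus pushes the code distribution toward N(0, I); decoding standard-normal samples at test time then yields a generative model.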

Cite

Text

Ghose et al. "Batch Norm with Entropic Regularization Turns Deterministic Autoencoders into Generative Models." Uncertainty in Artificial Intelligence, 2020.

Markdown

[Ghose et al. "Batch Norm with Entropic Regularization Turns Deterministic Autoencoders into Generative Models." Uncertainty in Artificial Intelligence, 2020.](https://mlanthology.org/uai/2020/ghose2020uai-batch/)

BibTeX

@inproceedings{ghose2020uai-batch,
  title     = {{Batch Norm with Entropic Regularization Turns Deterministic Autoencoders into Generative Models}},
  author    = {Ghose, Amur and Rashwan, Abdullah and Poupart, Pascal},
  booktitle = {Uncertainty in Artificial Intelligence},
  year      = {2020},
  pages     = {1079--1088},
  volume    = {124},
  url       = {https://mlanthology.org/uai/2020/ghose2020uai-batch/}
}