Emerging Convolutions for Generative Normalizing Flows

Abstract

Generative flows are attractive because they admit exact likelihood optimization and efficient image synthesis. Recently, Kingma & Dhariwal (2018) demonstrated with Glow that generative flows are capable of generating high quality images. We generalize the 1 × 1 convolutions proposed in Glow to invertible d × d convolutions, which are more flexible since they operate on both channel and spatial axes. We propose two methods to produce invertible convolutions that have receptive fields identical to standard convolutions: Emerging convolutions are obtained by chaining specific autoregressive convolutions, and periodic convolutions are decoupled in the frequency domain. Our experiments show that the flexibility of d × d convolutions significantly improves the performance of generative flow models on galaxy images, CIFAR10 and ImageNet.
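The abstract's emerging-convolution idea can be illustrated with a minimal single-channel sketch (assumed names and the 3×3 setup are ours, not the authors' implementation): two masked autoregressive convolutions are chained so that their combined receptive field covers a full 3×3 window, while each factor has a triangular Jacobian, making inversion and the log-determinant cheap.

```python
import numpy as np

def masked_conv(x, kernel, mask):
    """'Same'-padded 2-D cross-correlation with a masked 3x3 kernel
    (single channel). With an autoregressive mask, this linear map has
    a triangular Jacobian whose diagonal is the centre weight."""
    k = kernel * mask
    H, W = x.shape
    pad = np.pad(x, 1)  # zero-pad one pixel on every side
    out = np.zeros_like(x)
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(pad[i:i + 3, j:j + 3] * k)
    return out

# Mask A: each output pixel sees only inputs at or before it in raster
# order (a 2x2 top-left support, including the centre).
mask_a = np.array([[1, 1, 0],
                   [1, 1, 0],
                   [0, 0, 0]], dtype=float)
# Mask B: the mirrored support; together their offsets sum to the
# full 3x3 window, so the chained map has a standard 3x3 receptive field.
mask_b = mask_a[::-1, ::-1]

rng = np.random.default_rng(0)
k1, k2 = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
x = rng.normal(size=(8, 8))

# Chain the two autoregressive convolutions: an "emerging" convolution.
y = masked_conv(masked_conv(x, k1, mask_a), k2, mask_b)

# Each factor is invertible whenever its centre weight is nonzero, and
# because both Jacobians are triangular the total log-determinant is just:
H, W = x.shape
logdet = H * W * (np.log(abs(k1[1, 1])) + np.log(abs(k2[1, 1])))
```

This only sketches the mechanism; the paper additionally handles multiple channels (via per-channel masking) and proposes periodic convolutions that are inverted in the frequency domain.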

Cite

Text

Hoogeboom et al. "Emerging Convolutions for Generative Normalizing Flows." International Conference on Machine Learning, 2019.

Markdown

[Hoogeboom et al. "Emerging Convolutions for Generative Normalizing Flows." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/hoogeboom2019icml-emerging/)

BibTeX

@inproceedings{hoogeboom2019icml-emerging,
  title     = {{Emerging Convolutions for Generative Normalizing Flows}},
  author    = {Hoogeboom, Emiel and van den Berg, Rianne and Welling, Max},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {2771--2780},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/hoogeboom2019icml-emerging/}
}