MaCow: Masked Convolutional Generative Flow

Abstract

Flow-based generative models, conceptually attractive due to the tractability of exact log-likelihood computation and latent-variable inference and the efficiency of both training and sampling, have led to a number of impressive empirical successes and spawned many advanced variants and theoretical investigations. Despite their computational efficiency, the density estimation performance of flow-based generative models significantly falls behind that of state-of-the-art autoregressive models. In this work, we introduce masked convolutional generative flow (MaCow), a simple yet effective architecture of generative flow using masked convolution. By restricting the local connectivity to a small kernel, MaCow enjoys fast and stable training and efficient sampling, while achieving significant improvements over Glow for density estimation on standard image benchmarks, considerably narrowing the gap to autoregressive models.
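The abstract's key idea is a convolution whose kernel weights are masked so that each output position depends only on a restricted local neighborhood. The following is a minimal sketch of such a masked convolution layer, not the authors' implementation; the specific mask pattern (keeping only positions "before" the kernel center) and the class name `MaskedConv2d` are illustrative assumptions.

```python
# Minimal sketch of a masked 2D convolution with a small kernel, in the spirit
# of "restricting the local connectivity to a small kernel". The masking
# pattern below is assumed for illustration and is not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedConv2d(nn.Conv2d):
    """Conv2d whose kernel is elementwise-multiplied by a fixed binary mask."""

    def __init__(self, in_channels, out_channels, kernel_size=3):
        super().__init__(in_channels, out_channels, kernel_size,
                         padding=kernel_size // 2)
        mask = torch.ones(kernel_size, kernel_size)
        center = kernel_size // 2
        # Hypothetical mask: zero out the center position and everything
        # after it in raster order, so each output pixel only sees a
        # restricted "past" neighborhood within the small kernel.
        mask[center, center:] = 0.0
        mask[center + 1:, :] = 0.0
        self.register_buffer("mask", mask.view(1, 1, kernel_size, kernel_size))

    def forward(self, x):
        # Apply the fixed mask to the learned weights before convolving.
        return F.conv2d(x, self.weight * self.mask, self.bias,
                        self.stride, self.padding, self.dilation, self.groups)


if __name__ == "__main__":
    conv = MaskedConv2d(3, 8, kernel_size=3)
    x = torch.randn(1, 3, 32, 32)
    print(conv(x).shape)  # torch.Size([1, 8, 32, 32])
```

Because the masked receptive field is small and fixed, such a layer keeps the Jacobian structure needed for tractable flow training while avoiding the full autoregressive dependency that makes sampling slow.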

Cite

Text

Ma et al. "MaCow: Masked Convolutional Generative Flow." Neural Information Processing Systems, 2019.

Markdown

[Ma et al. "MaCow: Masked Convolutional Generative Flow." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/ma2019neurips-macow/)

BibTeX

@inproceedings{ma2019neurips-macow,
  title     = {{MaCow: Masked Convolutional Generative Flow}},
  author    = {Ma, Xuezhe and Kong, Xiang and Zhang, Shanghang and Hovy, Eduard},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {5893-5902},
  url       = {https://mlanthology.org/neurips/2019/ma2019neurips-macow/}
}