Invertible Monotone Operators for Normalizing Flows

Abstract

Normalizing flows model probability distributions by learning invertible transformations that map a simple base distribution to a complex target distribution. Because ResNet-based normalizing flows have more flexible architectures than coupling-based models, they have been widely studied in recent years. Despite this architectural flexibility, current ResNet-based models are known to suffer from constrained Lipschitz constants. In this paper, we propose a monotone formulation that overcomes the Lipschitz constraint using monotone operators, and we provide an in-depth theoretical analysis. Furthermore, we construct an activation function called Concatenated Pila (CPila) to improve gradient flow. The resulting model, Monotone Flows, achieves excellent performance on multiple density estimation benchmarks (MNIST, CIFAR-10, ImageNet32, ImageNet64). Code is available at https://github.com/mlvlab/MonotoneFlows.
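
To make the Lipschitz issue concrete, below is a minimal sketch of the i-ResNet-style flow layer that the abstract's "ResNet-based models" refers to, not the paper's monotone formulation. It assumes the standard construction y = x + g(x) with Lip(g) < 1 enforced via spectral normalization; class and parameter names here are illustrative.

```python
# Sketch of a ResNet-based flow layer (i-ResNet style), assuming the
# standard residual construction. This is NOT the Monotone Flows layer;
# it illustrates the Lipschitz constraint the paper addresses.
import torch
import torch.nn as nn

class InvertibleResidualBlock(nn.Module):
    """y = x + g(x), invertible when Lip(g) < 1."""

    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        # Spectral normalization bounds each linear layer's Lipschitz
        # constant by ~1; with 1-Lipschitz ELU, Lip(g) <= 1 (in practice
        # the weights are further rescaled by a coefficient c < 1).
        self.g = nn.Sequential(
            nn.utils.spectral_norm(nn.Linear(dim, hidden)),
            nn.ELU(),
            nn.utils.spectral_norm(nn.Linear(hidden, dim)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.g(x)

    @torch.no_grad()
    def inverse(self, y: torch.Tensor, n_iters: int = 50) -> torch.Tensor:
        # Banach fixed-point iteration x_{k+1} = y - g(x_k) converges
        # because g is a contraction. Note the constraint this encodes:
        # Lip(x + g(x)) < 2, which is the limitation the monotone
        # operator formulation is designed to lift.
        x = y.clone()
        for _ in range(n_iters):
            x = y - self.g(x)
        return x
```

A layer like this can only expand or contract distances by a bounded factor per block; the paper's monotone formulation removes that cap while preserving invertibility.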

Cite

Text

Ahn et al. "Invertible Monotone Operators for Normalizing Flows." Neural Information Processing Systems, 2022.

Markdown

[Ahn et al. "Invertible Monotone Operators for Normalizing Flows." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/ahn2022neurips-invertible/)

BibTeX

@inproceedings{ahn2022neurips-invertible,
  title     = {{Invertible Monotone Operators for Normalizing Flows}},
  author    = {Ahn, Byeongkeun and Kim, Chiyoon and Hong, Youngjoon and Kim, Hyunwoo J},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/ahn2022neurips-invertible/}
}