On the Universality of Volume-Preserving and Coupling-Based Normalizing Flows

Abstract

We present a novel theoretical framework for understanding the expressive power of normalizing flows. Despite their prevalence in scientific applications, a comprehensive understanding of flows remains elusive due to their restricted architectures. Existing theorems fall short as they require the use of arbitrarily ill-conditioned neural networks, limiting their practical applicability. We prove a distributional universality theorem for well-conditioned coupling-based normalizing flows such as RealNVP. In addition, we show that volume-preserving normalizing flows are not universal, characterize the distribution they learn instead, and show how to restore their expressivity. Our results support the general wisdom that affine and related couplings are expressive and in general outperform volume-preserving flows, bridging a gap between empirical results and theoretical understanding.
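To make the two architectures contrasted above concrete, the following minimal sketch (not taken from the paper; the tiny conditioner and its parameters are hypothetical) implements one RealNVP-style affine coupling step and a volume-preserving variant obtained by zero-centering the predicted log-scales, which forces the Jacobian determinant to one.

import numpy as np

rng = np.random.default_rng(0)
D, H = 4, 16                       # data dimension, hidden width of the conditioner
d = D // 2                         # split point: x = (x1, x2)

# Hypothetical conditioner parameters; in practice any neural network can be used.
W1, b1 = rng.normal(size=(d, H)) * 0.1, np.zeros(H)
W2, b2 = rng.normal(size=(H, 2 * (D - d))) * 0.1, np.zeros(2 * (D - d))

def conditioner(x1):
    """Predict per-dimension log-scale s and shift t from the untouched half x1."""
    h = np.tanh(x1 @ W1 + b1)
    out = h @ W2 + b2
    return out[:, : D - d], out[:, D - d :]   # s, t

def affine_coupling(x, volume_preserving=False):
    """One affine coupling step; returns (y, log|det Jacobian|) per sample."""
    x1, x2 = x[:, :d], x[:, d:]
    s, t = conditioner(x1)
    if volume_preserving:
        # Remove the mean log-scale so the per-dimension scales multiply to one,
        # giving a unit Jacobian determinant (a volume-preserving coupling).
        s = s - s.mean(axis=1, keepdims=True)
    y2 = x2 * np.exp(s) + t
    logdet = s.sum(axis=1)                    # zero in the volume-preserving case
    return np.concatenate([x1, y2], axis=1), logdet

x = rng.normal(size=(3, D))
_, logdet_affine = affine_coupling(x)
_, logdet_vp = affine_coupling(x, volume_preserving=True)
print(logdet_affine)                  # generally nonzero
print(np.allclose(logdet_vp, 0.0))    # True: volume is preserved

The zero log-determinant in the second call is the defining constraint of volume-preserving flows, and it is this constraint whose effect on universality the paper analyzes.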

Cite

Text

Draxler et al. "On the Universality of Volume-Preserving and Coupling-Based Normalizing Flows." International Conference on Machine Learning, 2024.

Markdown

[Draxler et al. "On the Universality of Volume-Preserving and Coupling-Based Normalizing Flows." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/draxler2024icml-universality/)

BibTeX

@inproceedings{draxler2024icml-universality,
  title     = {{On the Universality of Volume-Preserving and Coupling-Based Normalizing Flows}},
  author    = {Draxler, Felix and Wahl, Stefan and Schnoerr, Christoph and Koethe, Ullrich},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {11613--11641},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/draxler2024icml-universality/}
}