Whitening Convergence Rate of Coupling-Based Normalizing Flows

Abstract

Coupling-based normalizing flows (e.g., RealNVP) are a popular family of normalizing flow architectures that work surprisingly well in practice. This calls for theoretical understanding. Existing work shows that such flows weakly converge to arbitrary data distributions. However, these results make no statement about the stricter convergence criterion used in practice, the maximum likelihood loss. For the first time, we make a quantitative statement about this kind of convergence: We prove that all coupling-based normalizing flows perform whitening of the data distribution (i.e., diagonalize the covariance matrix) and derive corresponding convergence bounds that show a linear convergence rate in the depth of the flow. Numerical experiments demonstrate the implications of our theory and point at open questions.
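The whitening claim in the abstract can be made concrete with a small diagnostic. The following is a minimal, hypothetical sketch (not taken from the paper) that measures how far an empirical covariance matrix is from diagonal; under the stated result, this quantity should shrink as samples are pushed through additional coupling blocks. The function name `off_diagonal_ratio` and the toy data are our own illustration, not the paper's metric or code.

```python
# Illustrative sketch (assumption, not the paper's implementation):
# quantify how close a sample covariance matrix is to diagonal,
# i.e. how "whitened" a set of latent codes is at some flow depth.
import numpy as np


def off_diagonal_ratio(samples: np.ndarray) -> float:
    """Relative Frobenius norm of the off-diagonal part of the covariance.

    A value of 0 means the empirical covariance is exactly diagonal,
    which is the whitening property described in the abstract.
    """
    cov = np.cov(samples, rowvar=False)       # (D, D) sample covariance
    off_diag = cov - np.diag(np.diag(cov))    # zero out the diagonal
    return np.linalg.norm(off_diag) / np.linalg.norm(cov)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Correlated toy data standing in for latent codes at some depth.
    A = rng.normal(size=(3, 3))
    x = rng.normal(size=(10_000, 3)) @ A.T
    print(f"off-diagonal ratio: {off_diagonal_ratio(x):.3f}")
```

Tracking this ratio after each coupling block of a trained flow is one simple way to observe the linear (geometric, per layer) decay of the off-diagonal covariance mass that the bounds predict.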

Cite

Text

Draxler et al. "Whitening Convergence Rate of Coupling-Based Normalizing Flows." Neural Information Processing Systems, 2022.

Markdown

[Draxler et al. "Whitening Convergence Rate of Coupling-Based Normalizing Flows." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/draxler2022neurips-whitening/)

BibTeX

@inproceedings{draxler2022neurips-whitening,
  title     = {{Whitening Convergence Rate of Coupling-Based Normalizing Flows}},
  author    = {Draxler, Felix and Schnörr, Christoph and Köthe, Ullrich},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/draxler2022neurips-whitening/}
}