On the Expressivity of Bi-Lipschitz Normalizing Flows
Abstract
An invertible function is bi-Lipschitz if both the function and its inverse have bounded Lipschitz constants. Most state-of-the-art Normalizing Flows are bi-Lipschitz, either by design or as a result of training intended, among other things, to limit numerical errors. In this paper, we discuss the expressivity of bi-Lipschitz Normalizing Flows and identify several target distributions that are difficult to approximate with such models. We then characterize the expressivity of bi-Lipschitz Normalizing Flows by deriving several lower bounds on the Total Variation distance between these particularly unfavorable distributions and their best possible approximation. Finally, we show how these bounds can be used to adjust training parameters, and we discuss potential remedies.
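As a minimal illustration of the bi-Lipschitz definition in the first sentence (our own example, not from the paper): an invertible affine map f(x) = Ax + b has Lip(f) = σ_max(A) and Lip(f⁻¹) = 1/σ_min(A), so it is bi-Lipschitz exactly when A is invertible. This can be checked numerically:

```python
import numpy as np

# Illustrative sketch (not the paper's construction): for f(x) = A x + b,
# the Lipschitz constant of f is the largest singular value of A, and the
# Lipschitz constant of f^{-1} is 1 / sigma_min(A).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
b = rng.standard_normal(3)

sigma = np.linalg.svd(A, compute_uv=False)  # singular values, descending
lip_f = sigma[0]             # Lip(f)     = sigma_max(A)
lip_f_inv = 1.0 / sigma[-1]  # Lip(f^-1)  = 1 / sigma_min(A)

# Empirically verify the two-sided bound
#   m * ||x - y|| <= ||f(x) - f(y)|| <= M * ||x - y||
# with M = Lip(f) and m = 1 / Lip(f^-1).
f = lambda x: A @ x + b
for _ in range(1000):
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    d_in = np.linalg.norm(x - y)
    d_out = np.linalg.norm(f(x) - f(y))
    assert d_out <= lip_f * d_in + 1e-9
    assert d_out >= (1.0 / lip_f_inv) * d_in - 1e-9
```

The product Lip(f) · Lip(f⁻¹) ≥ 1 always holds, and constraining both constants (as the paper studies) restricts how much the map can stretch or compress mass, which is the source of the expressivity limits discussed in the abstract.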
Cite
Text
Verine et al. "On the Expressivity of Bi-Lipschitz Normalizing Flows." Proceedings of The 14th Asian Conference on Machine Learning, 2022.

Markdown
[Verine et al. "On the Expressivity of Bi-Lipschitz Normalizing Flows." Proceedings of The 14th Asian Conference on Machine Learning, 2022.](https://mlanthology.org/acml/2022/verine2022acml-expressivity/)

BibTeX
@inproceedings{verine2022acml-expressivity,
title = {{On the Expressivity of Bi-Lipschitz Normalizing Flows}},
author = {Verine, Alexandre and Negrevergne, Benjamin and Chevaleyre, Yann and Rossi, Fabrice},
booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
year = {2022},
pages = {1054--1069},
volume = {189},
url = {https://mlanthology.org/acml/2022/verine2022acml-expressivity/}
}