Semi-Autoregressive Energy Flows: Exploring Likelihood-Free Training of Normalizing Flows
Abstract
Training normalizing flow generative models can be challenging due to the need to calculate computationally expensive determinants of Jacobians. This paper studies the likelihood-free training of flows and proposes the energy objective, an alternative sample-based loss based on proper scoring rules. The energy objective is determinant-free and supports flexible model architectures that are not easily compatible with maximum likelihood training, including semi-autoregressive energy flows, a novel model family that interpolates between fully autoregressive and non-autoregressive models. Energy flows feature competitive sample quality, posterior inference, and generation speed relative to likelihood-based flows; this performance is decorrelated from the quality of log-likelihood estimates, which are generally very poor. Our findings question the use of maximum likelihood as an objective or a metric, and contribute to a scientific study of its role in generative modeling. Code is available at https://github.com/ps789/SAEF.
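The energy objective described above is built on proper scoring rules such as the energy score. As a rough illustration only (this is a generic sample-based energy-distance estimator, not necessarily the paper's exact training loss), a minimal NumPy sketch might look like:

```python
import numpy as np

def pairwise_dists(a, b):
    # Euclidean distance between every row of a and every row of b
    return np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)

def energy_distance(x, y):
    """Sample-based energy distance: 2 E||X-Y|| - E||X-X'|| - E||Y-Y'||.

    x: batch of data samples, y: batch of model samples (both (n, d)).
    Determinant-free: it only needs samples, not log-densities.
    Note: this simple estimator keeps the zero diagonal in the
    within-batch terms, which is fine for a sketch.
    """
    return (2 * pairwise_dists(x, y).mean()
            - pairwise_dists(x, x).mean()
            - pairwise_dists(y, y).mean())
```

In a likelihood-free setup one would draw `y` from the flow and minimize this quantity with respect to the flow's parameters; it is zero in expectation exactly when the model matches the data distribution, which is what makes it a proper scoring rule.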
Cite
Text
Si et al. "Semi-Autoregressive Energy Flows: Exploring Likelihood-Free Training of Normalizing Flows." International Conference on Machine Learning, 2023.
Markdown
[Si et al. "Semi-Autoregressive Energy Flows: Exploring Likelihood-Free Training of Normalizing Flows." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/si2023icml-semiautoregressive/)
BibTeX
@inproceedings{si2023icml-semiautoregressive,
  title     = {{Semi-Autoregressive Energy Flows: Exploring Likelihood-Free Training of Normalizing Flows}},
  author    = {Si, Phillip and Chen, Zeyi and Sahoo, Subham Sekhar and Schiff, Yair and Kuleshov, Volodymyr},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {31732--31753},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/si2023icml-semiautoregressive/}
}