A Theory of Continuous Generative Flow Networks
Abstract
Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions over compositional objects. A key limitation of GFlowNets until now has been that they are restricted to discrete spaces. We present a theory for generalized GFlowNets, which encompasses both existing discrete GFlowNets and those with continuous or hybrid state spaces, and perform experiments with two goals in mind. First, we illustrate critical points of the theory and the importance of various assumptions. Second, we empirically demonstrate how observations about discrete GFlowNets transfer to the continuous case and show strong results compared to non-GFlowNet baselines on several previously studied tasks. This work greatly widens the perspectives for the application of GFlowNets in probabilistic inference and various modeling settings.
Cite
Text
Lahlou et al. "A Theory of Continuous Generative Flow Networks." International Conference on Machine Learning, 2023.
Markdown
[Lahlou et al. "A Theory of Continuous Generative Flow Networks." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/lahlou2023icml-theory/)
BibTeX
@inproceedings{lahlou2023icml-theory,
title = {{A Theory of Continuous Generative Flow Networks}},
author = {Lahlou, Salem and Deleu, Tristan and Lemos, Pablo and Zhang, Dinghuai and Volokhova, Alexandra and Hernández-García, Alex and Ezzine, Lena Nehale and Bengio, Yoshua and Malkin, Nikolay},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {18269--18300},
volume = {202},
url = {https://mlanthology.org/icml/2023/lahlou2023icml-theory/}
}