Ergodic Generative Flows
Abstract
Generative Flow Networks (GFNs) were initially introduced on directed acyclic graphs to sample from an unnormalized distribution density. Recent works have extended the theoretical framework for generative methods, allowing more flexibility and enhancing the application range. However, many challenges remain in training GFNs in continuous settings and for imitation learning (IL), including intractability of the flow-matching loss, limited tests of non-acyclic training, and the need for a separate reward model in imitation learning. The present work proposes a family of generative flows called Ergodic Generative Flows (EGFs), which are used to address the aforementioned issues. First, we leverage ergodicity to build simple generative flows with finitely many globally defined transformations (diffeomorphisms), with universality guarantees and a tractable flow-matching loss (FM loss). Second, we introduce a new loss involving cross-entropy coupled to weak flow-matching control, coined the KL-weakFM loss. It is designed for IL training without a separate reward model. We evaluate IL-EGFs on toy 2D tasks and real-world datasets from NASA on the sphere, using the KL-weakFM loss. Additionally, we conduct toy 2D reinforcement learning experiments with a target reward, using the FM loss.
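As background (not stated on this page), the flow-matching condition that the abstract's tractable FM loss builds on is, in the standard discrete GFN setup, a conservation constraint on an edge flow $F$: at every intermediate state, inflow equals outflow, and at terminal states the flow matches the reward $R$. A common squared-log form of the resulting loss is sketched below; the specific continuous, non-acyclic formulation used by EGFs differs and is given in the paper itself.

```latex
% Flow conservation at each non-initial, non-terminal state s:
\sum_{s' : (s' \to s) \in E} F(s' \to s) \;=\; \sum_{s'' : (s \to s'') \in E} F(s \to s''),
% with terminal states constrained to match the reward, F(s \to s_f) = R(s).
% A standard flow-matching training loss penalizes the squared log-ratio:
\mathcal{L}_{\mathrm{FM}}(s) \;=\;
\left( \log \frac{\sum_{s'} F_\theta(s' \to s)}{\sum_{s''} F_\theta(s \to s'')} \right)^{2}.
```

In continuous state spaces these sums become integrals, which is the intractability the abstract refers to; EGFs address it by restricting to finitely many globally defined diffeomorphisms.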
Cite
Text

Brunswic et al. "Ergodic Generative Flows." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Brunswic et al. "Ergodic Generative Flows." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/brunswic2025icml-ergodic/)

BibTeX
@inproceedings{brunswic2025icml-ergodic,
title = {{Ergodic Generative Flows}},
author = {Brunswic, Leo Maxime and Clémente, Mateo and Yang, Rui Heng and Sigal, Adam and Rasouli, Amir and Li, Yinchuan},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {5649--5668},
volume = {267},
url = {https://mlanthology.org/icml/2025/brunswic2025icml-ergodic/}
}