Improving and Generalizing Flow-Based Generative Models with Minibatch Optimal Transport

Abstract

Continuous normalizing flows (CNFs) are an attractive generative modeling technique, but they have been held back by limitations in their simulation-based maximum likelihood training. We introduce the generalized *conditional flow matching* (CFM) technique, a family of simulation-free training objectives for CNFs. CFM features a stable regression objective like that used to train the stochastic flow in diffusion models but enjoys the efficient inference of deterministic flow models. In contrast to both diffusion models and prior CNF training algorithms, CFM does not require the source distribution to be Gaussian or require evaluation of its density. A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference, as evaluated in our experiments. Furthermore, OT-CFM is the first method to compute dynamic OT in a simulation-free way. Training CNFs with CFM improves results on a variety of conditional and unconditional generation tasks, such as inferring single cell dynamics, unsupervised image translation, and Schrödinger bridge inference.
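The core idea described in the abstract — pair minibatch samples from the source and target distributions via optimal transport, then regress a vector field onto straight-line conditional paths — can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name `ot_cfm_targets` and the use of exact assignment via `scipy.optimize.linear_sum_assignment` (with squared Euclidean cost) are illustrative choices.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def ot_cfm_targets(x0, x1, rng):
    """Build OT-CFM regression targets for one minibatch.

    x0, x1: arrays of shape (batch, dim) sampled from the source
    and target distributions. Returns times t, interpolated points
    xt, and the target velocities ut for the regression loss
    || v_theta(t, xt) - ut ||^2.  (Illustrative sketch only.)
    """
    # Exact minibatch OT coupling under squared Euclidean cost.
    cost = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)
    rows, cols = linear_sum_assignment(cost)
    x0, x1 = x0[rows], x1[cols]

    # Straight-line conditional path between the paired samples.
    t = rng.uniform(size=(len(x0), 1))
    xt = (1 - t) * x0 + t * x1   # point on the conditional path
    ut = x1 - x0                 # constant target velocity field
    return t, xt, ut
```

A learned vector field `v_theta(t, xt)` would then be trained by mean-squared regression onto `ut`; at inference time, samples are generated by integrating the learned ODE from source samples, with no simulation needed during training.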

Cite

Text

Tong et al. "Improving and Generalizing Flow-Based Generative Models with Minibatch Optimal Transport." ICML 2023 Workshops: Frontiers4LCD, 2023.

Markdown

[Tong et al. "Improving and Generalizing Flow-Based Generative Models with Minibatch Optimal Transport." ICML 2023 Workshops: Frontiers4LCD, 2023.](https://mlanthology.org/icmlw/2023/tong2023icmlw-improving/)

BibTeX

@inproceedings{tong2023icmlw-improving,
  title     = {{Improving and Generalizing Flow-Based Generative Models with Minibatch Optimal Transport}},
  author    = {Tong, Alexander and Malkin, Nikolay and Huguet, Guillaume and Zhang, Yanlei and Rector-Brooks, Jarrid and Fatras, Kilian and Wolf, Guy and Bengio, Yoshua},
  booktitle = {ICML 2023 Workshops: Frontiers4LCD},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/tong2023icmlw-improving/}
}