Flow Matching Achieves Almost Minimax Optimal Convergence
Abstract
Flow matching (FM) has gained significant attention as a simulation-free generative model. Unlike diffusion models, which are based on stochastic differential equations, FM employs a simpler approach by solving an ordinary differential equation with an initial condition from a normal distribution, thus streamlining the sample generation process. This paper discusses the convergence properties of FM in terms of the $p$-Wasserstein distance, a measure of distributional discrepancy. We establish that FM can achieve an almost minimax optimal convergence rate for $1 \leq p \leq 2$, presenting the first theoretical evidence that FM can reach convergence rates comparable to those of diffusion models. Our analysis extends existing frameworks by examining a broader class of mean and variance functions for the vector fields and identifies specific conditions necessary to attain these optimal rates.
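The sampling procedure the abstract describes — solving an ODE forward in time from a normal initial condition — can be sketched as follows. This is a minimal illustration, not the paper's method: `vector_field` stands in for a learned FM vector field, and the toy constant field used in the usage example is a hypothetical choice for demonstration only.

```python
import numpy as np

def sample_flow_matching(vector_field, dim, n_steps=100, rng=None):
    """Generate one sample by Euler-integrating the ODE dx/dt = v(x, t)
    from t = 0 to t = 1, starting from x0 ~ N(0, I)."""
    rng = np.random.default_rng(rng)
    x = rng.standard_normal(dim)  # initial condition drawn from a normal distribution
    dt = 1.0 / n_steps
    for k in range(n_steps):
        t = k * dt
        x = x + dt * vector_field(x, t)  # forward Euler step
    return x

# Toy (hypothetical) vector field: a constant drift that transports
# N(0, I) to a Gaussian centered at target_mean by time t = 1.
target_mean = np.array([3.0, -1.0])
v = lambda x, t: target_mean
x1 = sample_flow_matching(v, dim=2, n_steps=100, rng=0)
```

With this constant field the flow map is simply x(1) = x(0) + target_mean; in practice the vector field is a trained neural network and a higher-order ODE solver may replace the Euler step.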
Cite

Text

Fukumizu et al. "Flow Matching Achieves Almost Minimax Optimal Convergence." International Conference on Learning Representations, 2025.

Markdown

[Fukumizu et al. "Flow Matching Achieves Almost Minimax Optimal Convergence." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/fukumizu2025iclr-flow/)

BibTeX
@inproceedings{fukumizu2025iclr-flow,
title = {{Flow Matching Achieves Almost Minimax Optimal Convergence}},
author = {Fukumizu, Kenji and Suzuki, Taiji and Isobe, Noboru and Oko, Kazusato and Koyama, Masanori},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/fukumizu2025iclr-flow/}
}