Mean Flows for One-Step Generative Modeling
Abstract
We propose a principled and effective framework for one-step generative modeling. We introduce the notion of average velocity to characterize flow fields, in contrast to instantaneous velocity modeled by Flow Matching methods. A well-defined identity between average and instantaneous velocities is derived and used to guide neural network training. Our method, termed the *MeanFlow* model, is self-contained and requires no pre-training, distillation, or curriculum learning. MeanFlow demonstrates strong empirical performance: it achieves an FID of 3.43 with a single function evaluation (1-NFE) on ImageNet 256×256 trained from scratch, significantly outperforming previous state-of-the-art one-step diffusion/flow models. Our study substantially narrows the gap between one-step diffusion/flow models and their multi-step predecessors, and we hope it will motivate future research to revisit the foundations of these powerful models.
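The quantities the abstract refers to can be sketched as follows. The notation is an assumption on our part (not taken from this page): $z_t$ is the flow state at time $t$, $v$ the instantaneous velocity modeled by standard Flow Matching, and $u$ the average velocity over an interval $[r, t]$.

```latex
% Average velocity over [r, t], defined from the instantaneous velocity v:
u(z_t, r, t) \;\triangleq\; \frac{1}{t - r} \int_r^t v(z_\tau, \tau)\, d\tau

% Differentiating (t - r)\, u(z_t, r, t) with respect to t relates the two
% velocities (a sketch of the identity the abstract mentions):
u(z_t, r, t) \;=\; v(z_t, t) \;-\; (t - r)\, \frac{d}{dt}\, u(z_t, r, t)
```

Under this reading, one-step (1-NFE) sampling amounts to a single evaluation of the learned average velocity across the full interval, e.g. $z_0 \approx z_1 - u(z_1, 0, 1)$, rather than integrating $v$ over many small steps.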
Cite
Text
Geng et al. "Mean Flows for One-Step Generative Modeling." Advances in Neural Information Processing Systems, 2025.
Markdown
[Geng et al. "Mean Flows for One-Step Generative Modeling." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/geng2025neurips-mean/)
BibTeX
@inproceedings{geng2025neurips-mean,
  title     = {{Mean Flows for One-Step Generative Modeling}},
  author    = {Geng, Zhengyang and Deng, Mingyang and Bai, Xingjian and Kolter, J Zico and He, Kaiming},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/geng2025neurips-mean/}
}