Improving Consistency Models with Generator-Augmented Flows

Abstract

Consistency models imitate the multi-step sampling of score-based diffusion in a single forward pass of a neural network. They can be learned in two ways: consistency distillation and consistency training. The former relies on the true velocity field of the corresponding differential equation, approximated by a pre-trained neural network. In contrast, the latter uses a single-sample Monte Carlo estimate of this velocity field. The resulting estimation error induces a discrepancy between consistency distillation and training that, as we show, persists in the continuous-time limit. To alleviate this issue, we propose a novel flow that transports noisy data towards the corresponding outputs derived from a consistency model. We prove that this flow reduces the previously identified discrepancy and the noise-data transport cost. Consequently, our method not only accelerates consistency training convergence but also enhances its overall performance. The code is available at https://github.com/thibautissenhuth/consistency_GC.
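To make the distinction concrete, the sketch below (illustrative Python/PyTorch, not the authors' implementation; see the linked repository for the actual code) contrasts a single-sample consistency-training pair built along a linear noise-data interpolation with a pair built along a generator-augmented flow whose data endpoint is the consistency model's own output. The linear schedule, the function names, and the `generator` callable are assumptions made for illustration only.

```python
import torch

def consistency_training_pair(x0, noise, t, dt):
    """Single-sample Monte Carlo pair along a linear noise-data interpolation
    x_t = (1 - t) * x0 + t * noise (illustrative schedule, not the paper's exact one)."""
    x_t      = (1 - t) * x0 + t * noise
    x_t_prev = (1 - (t - dt)) * x0 + (t - dt) * noise  # one step closer to the data endpoint
    return x_t, x_t_prev

def generator_augmented_pair(x0, noise, t, dt, generator):
    """Pair along a flow whose data endpoint is the consistency model's output
    for the noisy sample (hedged sketch of the generator-augmented idea)."""
    x_t = (1 - t) * x0 + t * noise
    with torch.no_grad():
        g = generator(x_t, t)  # output of the (frozen) consistency model for x_t
    # Re-interpolate between the generator output and the same noise sample,
    # so the flow transports the noisy point toward its own generated output.
    y_t      = (1 - t) * g + t * noise
    y_t_prev = (1 - (t - dt)) * g + (t - dt) * noise
    return y_t, y_t_prev
```

A consistency loss would then compare the model's predictions at the two interpolation times, as in standard consistency training; only the construction of the pair changes.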

Cite

Text

Issenhuth et al. "Improving Consistency Models with Generator-Augmented Flows." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Issenhuth et al. "Improving Consistency Models with Generator-Augmented Flows." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/issenhuth2025icml-improving/)

BibTeX

@inproceedings{issenhuth2025icml-improving,
  title     = {{Improving Consistency Models with Generator-Augmented Flows}},
  author    = {Issenhuth, Thibaut and Lee, Sangchul and Dos Santos, Ludovic and Franceschi, Jean-Yves and Kim, Chansoo and Rakotomamonjy, Alain},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {26586--26610},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/issenhuth2025icml-improving/}
}