ReDi: Rectified Discrete Flow
Abstract
Discrete Flow-based Models (DFMs) are powerful generative models for high-quality discrete data but typically suffer from slow sampling speeds due to their reliance on iterative decoding processes. This reliance on a multi-step process originates from the factorization approximation of DFMs, which is necessary for handling high-dimensional data. In this paper, we analyze the factorization approximation error using Conditional Total Correlation (TC), and reveal its dependence on the coupling. To address the challenge of efficient few-step generation, we propose Rectified Discrete Flow (ReDi), a novel iterative method that reduces the underlying factorization error (measured as Conditional TC) by rectifying the coupling between source and target distributions. We theoretically prove that each ReDi step guarantees a monotonic decrease in Conditional TC, ensuring convergence. Empirically, ReDi significantly reduces Conditional TC and enables few-step generation. Moreover, we demonstrate that the rectified couplings are well-suited for training efficient one-step models on image generation. ReDi offers a simple and theoretically grounded approach for tackling the few-step challenge, providing a new perspective on efficient discrete data synthesis. Code is available at https://github.com/Ugness/ReDi_discrete.
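For context, conditional total correlation has a standard information-theoretic definition; the form below is that generic definition (the paper may use different notation), for a target $x = (x^{1}, \dots, x^{D})$ with $D$ discrete dimensions conditioned on a source sample $x_{0}$ drawn from the coupling:

$$
\mathrm{TC}\!\left(x^{1},\dots,x^{D} \mid x_{0}\right)
= \mathbb{E}_{x_{0}}\!\left[\, D_{\mathrm{KL}}\!\Big(\, p(x^{1},\dots,x^{D}\mid x_{0}) \;\Big\|\; \prod_{d=1}^{D} p(x^{d}\mid x_{0}) \Big)\right]
= \sum_{d=1}^{D} H\!\left(x^{d}\mid x_{0}\right) - H\!\left(x^{1},\dots,x^{D}\mid x_{0}\right).
$$

It vanishes exactly when the target dimensions are conditionally independent given the source, i.e. when the per-dimension factorization used by DFMs introduces no approximation error; reducing it through the coupling is what enables generation in fewer steps.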
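The rectification step is only described at a high level in the abstract; by analogy with the reflow procedure of continuous rectified flows, one plausible reading is the loop sketched below. This is an illustrative outline, not the authors' implementation: `train_dfm` and `sample_dfm` are hypothetical callables supplied by the caller.

```python
from typing import Any, Callable, List, Tuple


def redi_rectify(
    sources: List[Any],
    initial_pairs: List[Tuple[Any, Any]],
    train_dfm: Callable[[List[Tuple[Any, Any]]], Any],  # hypothetical trainer
    sample_dfm: Callable[[Any, Any], Any],              # hypothetical sampler
    num_steps: int = 2,
):
    """Illustrative ReDi-style rectification loop (an assumption, by analogy with reflow).

    Each iteration trains a factorized discrete flow model on the current
    coupling, then regenerates targets from the same source samples to form a
    rectified coupling; the paper's theoretical result states that Conditional
    TC does not increase across such steps.
    """
    pairs = initial_pairs
    model = None
    for _ in range(num_steps):
        model = train_dfm(pairs)  # fit a DFM on the current (source, target) coupling
        pairs = [(x0, sample_dfm(model, x0)) for x0 in sources]  # rectified coupling
    return model, pairs
```

Under this reading, the final rectified pairs would be the couplings the abstract describes as well-suited for training an efficient one-step model.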
Cite
Text
Yoo et al. "ReDi: Rectified Discrete Flow." Advances in Neural Information Processing Systems, 2025.
Markdown
[Yoo et al. "ReDi: Rectified Discrete Flow." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/yoo2025neurips-redi/)
BibTeX
@inproceedings{yoo2025neurips-redi,
title = {{ReDi: Rectified Discrete Flow}},
author = {Yoo, Jaehoon and Kim, Wonjung and Hong, Seunghoon},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/yoo2025neurips-redi/}
}