Error Bounds for Flow Matching Methods
Abstract
Score-based generative models are a popular class of generative modelling techniques relying on stochastic differential equations (SDEs). From their inception, it was recognized that generation could also be performed using ordinary differential equations (ODEs) rather than SDEs, which led to the probability flow ODE approach and denoising diffusion implicit models. Flow matching methods have recently extended these ODE-based approaches further, approximating a flow between two arbitrary probability distributions. Previous work derived bounds on the approximation error of diffusion models under the stochastic sampling regime, given assumptions on the $L^2$ loss. We present error bounds for the flow matching procedure using fully deterministic sampling, assuming an $L^2$ bound on the approximation error and a certain regularity condition on the data distributions.
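To make the setting concrete, below is a minimal sketch, in PyTorch, of a conditional flow matching objective with a linear interpolation path and fully deterministic Euler sampling of the learned ODE. The network architecture, the straight-line path, and the step count are illustrative assumptions, not the specific construction or the error bound analysed in the paper.

```python
# Minimal conditional flow matching sketch (illustrative assumptions throughout):
# train a velocity field v_theta with an L^2 loss along a linear interpolation
# path, then sample deterministically by Euler-integrating dx/dt = v_theta(x, t).
import torch
import torch.nn as nn


class VelocityNet(nn.Module):
    """Small MLP approximating the velocity field v_theta(x, t)."""

    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, t[:, None]], dim=-1))


def flow_matching_loss(model: VelocityNet, x1: torch.Tensor) -> torch.Tensor:
    """L^2 loss against the conditional velocity of a straight-line path
    from a standard Gaussian source (t = 0) to a data sample (t = 1)."""
    x0 = torch.randn_like(x1)                      # source sample
    t = torch.rand(x1.shape[0], device=x1.device)  # uniform time in [0, 1]
    xt = (1 - t[:, None]) * x0 + t[:, None] * x1   # linear interpolant
    target = x1 - x0                               # conditional velocity
    return ((model(xt, t) - target) ** 2).mean()


@torch.no_grad()
def sample(model: VelocityNet, n: int, dim: int, steps: int = 100) -> torch.Tensor:
    """Fully deterministic sampling: Euler integration of dx/dt = v_theta(x, t)."""
    x = torch.randn(n, dim)
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((n,), i * dt)
        x = x + dt * model(x, t)
    return x
```

In this reading, the paper's assumptions correspond to controlling the expected squared error between the learned velocity field and the true one along the path, and the regularity condition concerns the data distributions at the endpoints; the sketch only illustrates the training and deterministic-sampling procedure those bounds apply to.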
Cite
Text

Benton et al. "Error Bounds for Flow Matching Methods." Transactions on Machine Learning Research, 2024.

Markdown

[Benton et al. "Error Bounds for Flow Matching Methods." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/benton2024tmlr-error/)

BibTeX
@article{benton2024tmlr-error,
  title   = {{Error Bounds for Flow Matching Methods}},
  author  = {Benton, Joe and Deligiannidis, George and Doucet, Arnaud},
  journal = {Transactions on Machine Learning Research},
  year    = {2024},
  url     = {https://mlanthology.org/tmlr/2024/benton2024tmlr-error/}
}