SDE Matching: Scalable and Simulation-Free Training of Latent Stochastic Differential Equations

Abstract

The Latent Stochastic Differential Equation (SDE) is a powerful tool for time series and sequence modeling. However, training Latent SDEs typically relies on adjoint sensitivity methods, which require simulating and backpropagating through approximate SDE solutions, limiting scalability. In this work, we propose SDE Matching, a new simulation-free method for training Latent SDEs. Inspired by modern Score and Flow Matching algorithms for learning generative dynamics, we extend these ideas to the domain of stochastic dynamics for time series modeling, eliminating the need for costly numerical simulations. Our results demonstrate that SDE Matching achieves performance comparable to adjoint sensitivity methods while drastically reducing computational complexity.
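For intuition on the distinction the abstract draws, the toy sketch below contrasts simulation-based training of an SDE drift (unrolling a solver and backpropagating through the whole trajectory) with a simulation-free, matching-style objective (sampling a time point and regressing the drift at that point). This is an illustrative assumption-laden example on a simple Ornstein-Uhlenbeck process, not the paper's SDE Matching algorithm or Latent SDE model; all names and the toy target process are invented for illustration.

```python
# Illustrative toy sketch (NOT the authors' algorithm): contrasts
# simulation-based training of an SDE drift with a simulation-free,
# matching-style objective on a toy Ornstein-Uhlenbeck (OU) process
# dx = -theta * x dt + sigma dW, whose time-t marginal given x_0 is
# Gaussian with a known mean and variance.
import torch
import torch.nn as nn

torch.manual_seed(0)

theta, sigma = 1.0, 0.5  # toy OU parameters (assumed for illustration)

# Small network predicting a drift from (x, t).
drift_net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(drift_net.parameters(), lr=1e-3)


def simulation_based_loss(x0, n_steps=50, T=1.0):
    """Unroll Euler-Maruyama and backpropagate through the whole
    trajectory -- the costly pattern that simulation-based training
    (e.g. adjoint sensitivity pipelines) relies on."""
    dt = T / n_steps
    x = x0
    for k in range(n_steps):
        t = torch.full_like(x, k * dt)
        drift = drift_net(torch.cat([x, t], dim=-1))
        x = x + drift * dt + sigma * (dt ** 0.5) * torch.randn_like(x)
    # Match the known OU marginal mean at time T (toy objective).
    return ((x - x0 * torch.exp(torch.tensor(-theta * T))) ** 2).mean()


def matching_style_loss(x0):
    """Simulation-free: sample a random time, draw x_t directly from the
    closed-form OU marginal, and regress the network's drift against the
    true drift at (x_t, t). No trajectory unrolling, no SDE solver."""
    t = torch.rand_like(x0)
    mean = x0 * torch.exp(-theta * t)
    var = sigma ** 2 / (2 * theta) * (1 - torch.exp(-2 * theta * t))
    x_t = mean + var.sqrt() * torch.randn_like(x0)
    target_drift = -theta * x_t  # true OU drift at x_t
    pred_drift = drift_net(torch.cat([x_t, t], dim=-1))
    return ((pred_drift - target_drift) ** 2).mean()


x0 = torch.randn(256, 1)
for step in range(200):
    opt.zero_grad()
    loss = matching_style_loss(x0)  # one network call per sample, no rollout
    loss.backward()
    opt.step()
print(f"final matching-style loss: {loss.item():.4f}")
```

The design point this sketch is meant to convey: the matching-style objective costs one network evaluation per sample regardless of the time horizon, whereas the simulation-based loss requires `n_steps` sequential evaluations and stores the whole computation graph for backpropagation.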

Cite

Text

Bartosh et al. "SDE Matching: Scalable and Simulation-Free Training of Latent Stochastic Differential Equations." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Bartosh et al. "SDE Matching: Scalable and Simulation-Free Training of Latent Stochastic Differential Equations." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/bartosh2025icml-sde/)

BibTeX

@inproceedings{bartosh2025icml-sde,
  title     = {{SDE Matching: Scalable and Simulation-Free Training of Latent Stochastic Differential Equations}},
  author    = {Bartosh, Grigory and Vetrov, Dmitry and Naesseth, Christian A.},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {3054--3070},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/bartosh2025icml-sde/}
}