Stochastic Continuous Normalizing Flows: Training SDEs as ODEs

Abstract

We provide a general theoretical framework for stochastic continuous normalizing flows, an extension of continuous normalizing flows for density estimation using stochastic differential equations (SDEs). Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling SDEs to be treated as random ordinary differential equations, which can be trained with existing techniques. For scalar loss functions, this approach naturally recovers the stochastic adjoint method of Li et al. [2020] for training neural SDEs, while supporting a more flexible class of approximations.
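To make the "SDEs as random ODEs" idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): a Brownian path is sampled on a coarse grid, replaced by its piecewise-linear interpolation, and the SDE dX_t = mu(X_t) dt + sigma(X_t) dB_t is then integrated as an ordinary ODE with a standard solver. The drift, diffusion, grid resolution, and solver choice below are illustrative assumptions.

# Illustrative sketch only: fix a sampled Brownian path (the latent variable),
# approximate it piecewise linearly, and integrate the resulting random ODE
# with a standard ODE solver. mu, sigma, grid sizes, and x0 are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

T, n_latent = 1.0, 64                          # time horizon and latent-grid resolution
dt_latent = T / n_latent
dB = rng.normal(0.0, np.sqrt(dt_latent), size=n_latent)
B = np.concatenate([[0.0], np.cumsum(dB)])     # sampled Brownian path on the coarse grid

def dB_dt(t):
    # Derivative of the piecewise-linear interpolation of the sampled path.
    i = min(int(t / dt_latent), n_latent - 1)
    return (B[i + 1] - B[i]) / dt_latent

mu = lambda x: -x                              # toy drift
sigma = lambda x: 0.5                          # toy (constant) diffusion

def random_ode(t, x):
    # dX/dt = mu(X) + sigma(X) * dB~/dt: an ordinary ODE once the path is fixed.
    return mu(x) + sigma(x) * dB_dt(t)

# max_step keeps the adaptive solver from stepping over kinks in the path.
sol = solve_ivp(random_ode, (0.0, T), [1.0], max_step=dt_latent)
print(sol.y[0, -1])

Note that piecewise-linear (Wong-Zakai-type) approximations of the driving Brownian motion converge to the Stratonovich solution of the SDE; the sketch ignores such distinctions and is intended only to convey how an SDE reduces to a random ODE once the latent Brownian path is fixed and approximated.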

Cite

Text

Hodgkinson et al. "Stochastic Continuous Normalizing Flows: Training SDEs as ODEs." Uncertainty in Artificial Intelligence, 2021.

Markdown

[Hodgkinson et al. "Stochastic Continuous Normalizing Flows: Training SDEs as ODEs." Uncertainty in Artificial Intelligence, 2021.](https://mlanthology.org/uai/2021/hodgkinson2021uai-stochastic/)

BibTeX

@inproceedings{hodgkinson2021uai-stochastic,
  title     = {{Stochastic Continuous Normalizing Flows: Training SDEs as ODEs}},
  author    = {Hodgkinson, Liam and van der Heide, Chris and Roosta, Fred and Mahoney, Michael W.},
  booktitle = {Uncertainty in Artificial Intelligence},
  year      = {2021},
  pages     = {1130--1140},
  volume    = {161},
  url       = {https://mlanthology.org/uai/2021/hodgkinson2021uai-stochastic/}
}