Conditioning Non-Linear and Infinite-Dimensional Diffusion Processes

Abstract

Generative diffusion models and many stochastic models in science and engineering naturally live in infinite dimensions before discretisation. To incorporate observed data for statistical and learning tasks, one needs to condition on observations. While recent work has treated conditioning linear processes in infinite dimensions, conditioning non-linear processes in infinite dimensions has not been explored. This paper conditions function-valued stochastic processes without prior discretisation. To do so, we use an infinite-dimensional version of Girsanov's theorem, leading to a stochastic differential equation (SDE) for the conditioned process that involves the score. We apply this technique to time series analysis of organism shapes in evolutionary biology, where we discretise via the Fourier basis and then learn the coefficients of the score function with score matching methods.
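For context, a minimal sketch of the classical finite-dimensional analogue of such a conditioned SDE is given by Doob's h-transform: conditioning an unconditioned diffusion on hitting a value at a fixed time adds a drift correction equal to the score of its transition density. The display below states only this standard textbook result, not the paper's infinite-dimensional theorem; the symbols b, \sigma, W_t, p, and v are generic notation introduced here, and the paper's contribution is the extension of this picture to function-valued processes via an infinite-dimensional Girsanov theorem.

\[
  \mathrm{d}X^{\star}_t
  = \Bigl( b(t, X^{\star}_t)
    + \sigma\sigma^{\top}(t, X^{\star}_t)\,
      \nabla_x \log p\bigl(T, v \mid t, X^{\star}_t\bigr) \Bigr)\,\mathrm{d}t
  + \sigma(t, X^{\star}_t)\,\mathrm{d}W_t ,
\]

where p(T, v | t, x) is the transition density of the unconditioned process dX_t = b(t, X_t) dt + \sigma(t, X_t) dW_t, and \nabla_x \log p is its score. Per the abstract, the score in the infinite-dimensional setting is approximated after Fourier discretisation by learning its coefficients with score matching.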

Cite

Text

Baker et al. "Conditioning Non-Linear and Infinite-Dimensional Diffusion Processes." Neural Information Processing Systems, 2024. doi:10.52202/079017-0345

Markdown

[Baker et al. "Conditioning Non-Linear and Infinite-Dimensional Diffusion Processes." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/baker2024neurips-conditioning/) doi:10.52202/079017-0345

BibTeX

@inproceedings{baker2024neurips-conditioning,
  title     = {{Conditioning Non-Linear and Infinite-Dimensional Diffusion Processes}},
  author    = {Baker, Elizabeth Louise and Yang, Gefan and Severinsen, Michael L. and Hipsley, Christy Anna and Sommer, Stefan},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0345},
  url       = {https://mlanthology.org/neurips/2024/baker2024neurips-conditioning/}
}