Sparse Gaussian Processes for Stochastic Differential Equations

Abstract

We frame the problem of learning stochastic differential equations (SDEs) from noisy observations as an inference problem and aim to maximize the marginal likelihood of the observations in a joint model of the latent paths and the noisy observations. As this problem is intractable, we derive an approximate (variational) inference algorithm and propose a novel parameterization of the approximate distribution over paths using a sparse Markovian Gaussian process. The approximation is efficient in storage and computation, allowing the use of well-established optimization algorithms such as natural gradient descent. We demonstrate the capability of the proposed method on the Ornstein-Uhlenbeck process.
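The setting described in the abstract (a latent SDE path observed through noise, demonstrated on the Ornstein-Uhlenbeck process) can be illustrated with a minimal data-generation sketch. This is not the authors' code: the drift, diffusion, step size, and observation-noise values below are illustrative assumptions, and Euler-Maruyama is used only to simulate the kind of noisy observations the inference method would consume.

```python
import numpy as np

# Assumed Ornstein-Uhlenbeck parameters (illustrative, not from the paper):
#   dx_t = -theta * (x_t - mu) dt + sigma dW_t
theta, mu, sigma = 2.0, 0.0, 0.5
obs_noise_std = 0.1          # std of Gaussian observation noise
dt, T = 0.01, 5.0            # Euler-Maruyama step size and time horizon

rng = np.random.default_rng(0)
ts = np.arange(0.0, T, dt)
x = np.zeros_like(ts)

# Euler-Maruyama simulation of the latent SDE path
for i in range(1, len(ts)):
    dw = rng.normal(scale=np.sqrt(dt))
    x[i] = x[i - 1] - theta * (x[i - 1] - mu) * dt + sigma * dw

# Sparse, noisy observations of the latent path
obs_idx = np.sort(rng.choice(len(ts), size=40, replace=False))
y = x[obs_idx] + rng.normal(scale=obs_noise_std, size=obs_idx.size)
```

Given pairs (ts[obs_idx], y), the paper's approach would fit a variational posterior over the latent path and the SDE parameters; the simulation above only provides example inputs for such a procedure.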

Cite

Text

Verma et al. "Sparse Gaussian Processes for Stochastic Differential Equations." NeurIPS 2021 Workshops: DLDE, 2021.

Markdown

[Verma et al. "Sparse Gaussian Processes for Stochastic Differential Equations." NeurIPS 2021 Workshops: DLDE, 2021.](https://mlanthology.org/neuripsw/2021/verma2021neuripsw-sparse/)

BibTeX

@inproceedings{verma2021neuripsw-sparse,
  title     = {{Sparse Gaussian Processes for Stochastic Differential Equations}},
  author    = {Verma, Prakhar and Adam, Vincent and Solin, Arno},
  booktitle = {NeurIPS 2021 Workshops: DLDE},
  year      = {2021},
  url       = {https://mlanthology.org/neuripsw/2021/verma2021neuripsw-sparse/}
}