Latent SDEs on Homogeneous Spaces

Abstract

We consider the problem of variational Bayesian inference in a latent variable model where a (possibly complex) observed stochastic process is governed by the unobserved solution of a latent stochastic differential equation (SDE). Motivated by the challenges that arise when trying to learn a latent SDE in $\mathbb{R}^n$ from large-scale data, such as efficient gradient computation, we take a step back and study a specific subclass instead. In our case, the SDE evolves inside a homogeneous latent space and is induced by stochastic dynamics of the corresponding (matrix) Lie group. In the context of learning problems, SDEs on the $n$-dimensional unit sphere are arguably the most relevant incarnation of this setup. For variational inference, the sphere not only facilitates using a uniform prior on the initial state of the SDE, but we also obtain a particularly simple and intuitive expression for the KL divergence between the approximate posterior and prior process in the evidence lower bound. We provide empirical evidence that a latent SDE of the proposed type can be learned efficiently by means of an existing one-step geometric Euler-Maruyama scheme. Despite restricting ourselves to a less diverse class of SDEs, we achieve competitive or even state-of-the-art performance on a collection of time series interpolation and classification benchmarks.
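To make the simulation scheme concrete, below is a minimal NumPy/SciPy sketch of the one-step geometric Euler-Maruyama update for an SDE on the unit sphere $S^{n-1}$, viewed as a homogeneous space of $\mathrm{SO}(n)$: drift and diffusion coefficients live in the Lie algebra $\mathfrak{so}(n)$ of skew-symmetric matrices, and one step maps $z_k \mapsto \exp\big(h K + \sqrt{h}\sum_i \xi_i V_i\big)\, z_k$ with standard normal $\xi_i$. The fixed coefficients `K` and `Vs` and the helper names are hypothetical stand-ins for the learned, time-dependent coefficients in the paper; since the matrix exponential of a skew-symmetric matrix is orthogonal, each step keeps the state exactly on the sphere, with no projection needed.

```python
import numpy as np
from scipy.linalg import expm

def random_skew(n, rng, scale=1.0):
    """Sample a random element of so(n), i.e. a skew-symmetric n x n matrix."""
    a = rng.normal(scale=scale, size=(n, n))
    return 0.5 * (a - a.T)

def geometric_em_step(z, K, Vs, h, rng):
    """One geometric Euler-Maruyama step on the unit sphere S^{n-1}.

    z  : current state, a unit vector in R^n
    K  : drift coefficient in so(n) (skew-symmetric); stand-in for a learned term
    Vs : list of diffusion coefficients in so(n)
    h  : step size

    Because K and the Vs are skew-symmetric, expm(...) is orthogonal,
    so the update preserves the norm of z exactly.
    """
    xi = rng.normal(size=len(Vs))  # one Wiener increment per diffusion term
    A = h * K + np.sqrt(h) * sum(x * V for x, V in zip(xi, Vs))
    return expm(A) @ z

rng = np.random.default_rng(0)
n = 3
z = np.array([1.0, 0.0, 0.0])                       # initial state on S^2
K = random_skew(n, rng)                             # hypothetical fixed drift
Vs = [random_skew(n, rng, 0.3) for _ in range(2)]   # hypothetical diffusion terms

for _ in range(100):
    z = geometric_em_step(z, K, Vs, h=0.01, rng=rng)

print(np.linalg.norm(z))  # remains 1.0 up to floating-point error
```

The orthogonality of `expm(A)` is what makes the scheme "geometric": the discretization error never pushes the state off the manifold, which a plain Euler-Maruyama step in $\mathbb{R}^n$ cannot guarantee.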

Cite

Text

Zeng et al. "Latent SDEs on Homogeneous Spaces." Neural Information Processing Systems, 2023.

Markdown

[Zeng et al. "Latent SDEs on Homogeneous Spaces." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/zeng2023neurips-latent/)

BibTeX

@inproceedings{zeng2023neurips-latent,
  title     = {{Latent SDEs on Homogeneous Spaces}},
  author    = {Zeng, Sebastian and Graf, Florian and Kwitt, Roland},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/zeng2023neurips-latent/}
}