Laplace Approximated Gaussian Process State-Space Models
Abstract
Gaussian process state-space models describe time series data in a probabilistic and non-parametric manner using a Gaussian process transition function. As inference is intractable, recent methods use variational inference and either rely on simplifying independence assumptions on the approximate posterior or learn the temporal states iteratively. The latter hampers optimization since the posterior over the present can only be learned once the posterior governing the past has converged. We present a novel inference scheme that applies stochastic variational inference for the Gaussian process posterior and the Laplace approximation on the temporal states. This approach respects the conditional dependencies in the model and, through the Laplace approximation, treats the temporal states jointly, thereby avoiding their sequential learning. Our method is computationally efficient and leads to better calibrated predictions compared to state-of-the-art alternatives on synthetic data and on a range of benchmark datasets.
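To make the Laplace step concrete, below is a minimal, hypothetical sketch in JAX of a Laplace approximation over the temporal states of a toy one-dimensional state-space model. It is not the paper's implementation: the transition function f is fixed rather than carrying the GP prior and the stochastic variational treatment described in the abstract, and all names, noise levels, and parameters are illustrative assumptions. The sketch only shows the joint treatment of the states: find the MAP of x_{1:T} and use the Hessian of the negative log joint as the posterior precision, with no sequential forward learning of the states.

import jax
import jax.numpy as jnp
from jax.scipy.optimize import minimize

# Hypothetical toy setup: 1-D latent states x_1..x_T with a fixed transition
# function f and Gaussian emissions y_t = x_t + noise.  In the paper f carries
# a GP prior learned by stochastic variational inference; here it is fixed so
# the sketch can focus on the Laplace step over the temporal states.
T = 50
q, r = 0.1, 0.2                              # process / observation noise variances
f = lambda x: jnp.tanh(2.0 * x)              # stand-in for the (GP) transition mean
y = jnp.sin(jnp.linspace(0.0, 3.0, T)) + 0.3 * jax.random.normal(jax.random.PRNGKey(0), (T,))

def neg_log_joint(x, y):
    """-log p(x_{1:T}, y_{1:T}) of the toy model, up to an additive constant."""
    trans = jnp.sum((x[1:] - f(x[:-1])) ** 2) / (2 * q)   # Markov transition terms
    emis = jnp.sum((y - x) ** 2) / (2 * r)                # emission terms
    prior0 = x[0] ** 2 / 2.0                              # standard-normal prior on x_1
    return trans + emis + prior0

# Laplace approximation over all temporal states jointly: find the MAP of
# x_{1:T}, then take a Gaussian whose precision is the Hessian of the
# negative log joint evaluated at the MAP.
res = minimize(neg_log_joint, jnp.zeros(T), args=(y,), method="BFGS")
x_map = res.x
precision = jax.hessian(neg_log_joint)(x_map, y)          # (T, T) precision matrix
cov = jnp.linalg.inv(precision)

print("posterior mean (first 5):", x_map[:5])
print("posterior std  (first 5):", jnp.sqrt(jnp.diag(cov))[:5])

Because the states are optimized and linearized jointly, the posterior over every time step is available in one step, which is the property the abstract contrasts with iterative, sequential learning of the states.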
Cite
Text
Lindinger et al. "Laplace Approximated Gaussian Process State-Space Models." Uncertainty in Artificial Intelligence, 2022.

Markdown
[Lindinger et al. "Laplace Approximated Gaussian Process State-Space Models." Uncertainty in Artificial Intelligence, 2022.](https://mlanthology.org/uai/2022/lindinger2022uai-laplace/)

BibTeX
@inproceedings{lindinger2022uai-laplace,
title = {{Laplace Approximated Gaussian Process State-Space Models}},
author = {Lindinger, Jakob and Rakitsch, Barbara and Lippert, Christoph},
booktitle = {Uncertainty in Artificial Intelligence},
year = {2022},
pages = {1199--1209},
volume = {180},
url = {https://mlanthology.org/uai/2022/lindinger2022uai-laplace/}
}