Gaussian Process-Based Representation Learning via Timeseries Symmetries

Abstract

Credible forecasting and representation learning of dynamical systems are of ever-increasing importance for reliable decision-making. To that end, we propose a family of Gaussian processes for dynamical systems with linear time-invariant responses, which are nonlinear only in initial conditions. This linearity allows us to tractably quantify both forecasting and representational uncertainty simultaneously, alleviating the traditional challenge of multistep uncertainty propagation in GP models and enabling a new probabilistic treatment of learning representations. Using a novel data-based symmetrization, we improve the generalization ability of Gaussian processes and obtain tractable, continuous-time posteriors without the need for multiple models or approximate uncertainty propagation.
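To illustrate the core idea of the abstract, here is a minimal sketch of why linearity in the initial condition makes both kinds of uncertainty tractable. It uses a toy scalar LTI system x(t) = exp(-a·t)·x0 with a known decay rate, so the observed response is linear in the unknown initial condition x0; a Gaussian prior on x0 then yields a closed-form posterior over the representation (x0) and over any continuous-time forecast, with no multistep uncertainty propagation. All names and values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Toy scalar LTI system: x(t) = exp(-a*t) * x0, nonlinear only in x0's role
# as the unknown "representation". The response is LINEAR in x0, so Bayesian
# inference over x0 is exact (this is Bayesian linear regression).
a = 0.5                           # assumed known decay rate of the LTI response
t_obs = np.array([0.2, 0.5, 1.0, 1.5])
phi = np.exp(-a * t_obs)          # feature map: observations are phi * x0 + noise
sigma2 = 0.05 ** 2                # observation-noise variance (assumption)
prior_var = 1.0                   # Gaussian prior x0 ~ N(0, prior_var)

rng = np.random.default_rng(0)
x0_true = 1.3
y = phi * x0_true + rng.normal(0.0, np.sqrt(sigma2), size=t_obs.shape)

# Closed-form Gaussian posterior over the representation x0.
post_var = 1.0 / (1.0 / prior_var + phi @ phi / sigma2)
post_mean = post_var * (phi @ y) / sigma2

# Forecast at an arbitrary future time: mean and variance follow by linearity,
# giving a tractable continuous-time posterior in one step.
t_star = 3.0
phi_star = np.exp(-a * t_star)
forecast_mean = phi_star * post_mean
forecast_var = phi_star ** 2 * post_var + sigma2
```

In the paper's setting the same structure is exploited with Gaussian process priors and a data-based symmetrization rather than a known response, but the tractability argument is the same: because the map from initial condition to trajectory is linear, Gaussianity is preserved and both representational uncertainty (over x0) and forecasting uncertainty (over x(t*)) come out in closed form.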

Cite

Text

Bevanda et al. "Gaussian Process-Based Representation Learning via Timeseries Symmetries." ICML 2024 Workshops: GRaM, 2024.

Markdown

[Bevanda et al. "Gaussian Process-Based Representation Learning via Timeseries Symmetries." ICML 2024 Workshops: GRaM, 2024.](https://mlanthology.org/icmlw/2024/bevanda2024icmlw-gaussian/)

BibTeX

@inproceedings{bevanda2024icmlw-gaussian,
  title     = {{Gaussian Process-Based Representation Learning via Timeseries Symmetries}},
  author    = {Bevanda, Petar and Beier, Max and Lederer, Armin and Capone, Alexandre and Sosnowski, Stefan Georg and Hirche, Sandra},
  booktitle = {ICML 2024 Workshops: GRaM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/bevanda2024icmlw-gaussian/}
}