Infinite-Horizon Gaussian Processes

Abstract

Gaussian processes provide a flexible framework for forecasting, removing noise, and interpreting long temporal datasets. State space modelling (Kalman filtering) enables these non-parametric models to be deployed on long datasets by reducing the complexity to linear in the number of data points. The complexity is still cubic in the state dimension m, which is an impediment to practical application. In certain special cases (Gaussian likelihood, regular spacing) the GP posterior will reach a steady posterior state when the data are very long. We leverage this and formulate an inference scheme for GPs with general likelihoods, where inference is based on single-sweep EP (assumed density filtering). The infinite-horizon model tackles the cubic cost in the state dimension m, reducing it to O(m^2) per data point. The model is extended to online learning of hyperparameters. We show examples for large finite-length modelling problems, and present how the method runs in real-time on a smartphone on a continuous data stream updated at 100 Hz.
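The sketch below illustrates the steady-state idea the abstract refers to, not the authors' implementation: for a Gaussian likelihood and evenly spaced data, the Kalman filter covariance of a state-space GP converges, so the gain can be computed once from a discrete algebraic Riccati equation and each subsequent update is matrix-vector only, i.e. O(m^2) per data point. The Matérn-3/2 kernel, the hyperparameter values, and the toy 100 Hz signal are illustrative assumptions.

```python
# Minimal sketch of steady-state (infinite-horizon style) GP filtering.
# Not the authors' code; Gaussian likelihood, even spacing, Matern-3/2 kernel.
import numpy as np
from scipy.linalg import expm, solve_discrete_are

def matern32_ss(lengthscale, sigma2):
    """Continuous-time state-space form (F, H, Pinf) of a Matern-3/2 GP."""
    lam = np.sqrt(3.0) / lengthscale
    F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])
    H = np.array([[1.0, 0.0]])
    Pinf = np.diag([sigma2, lam**2 * sigma2])   # stationary state covariance
    return F, H, Pinf

def discretise(F, Pinf, dt):
    """Exact discretisation on an evenly spaced grid with step dt."""
    A = expm(F * dt)
    Q = Pinf - A @ Pinf @ A.T
    return A, Q

def steady_state_gain(A, Q, H, R):
    """Solve the filtering DARE once; the resulting gain is reused forever."""
    P_pred = solve_discrete_are(A.T, H.T, Q, np.atleast_2d(R))
    S = H @ P_pred @ H.T + R                    # innovation variance (1x1)
    K = P_pred @ H.T / S                        # fixed Kalman gain
    return K

def infinite_horizon_filter(y, A, H, K):
    """One forward sweep; each step is matrix-vector only, O(m^2)."""
    m = np.zeros((A.shape[0], 1))
    means = np.empty(len(y))
    for k, yk in enumerate(y):
        m = A @ m                               # predict
        m = m + K * (yk - (H @ m).item())       # update with the fixed gain
        means[k] = (H @ m).item()
    return means

# Toy usage: noisy sinusoid sampled at 100 Hz (illustrative values).
dt, R = 0.01, 0.1
t = np.arange(0.0, 10.0, dt)
y = np.sin(2 * np.pi * 0.5 * t) + np.sqrt(R) * np.random.randn(len(t))
F, H, Pinf = matern32_ss(lengthscale=1.0, sigma2=1.0)
A, Q = discretise(F, Pinf, dt)
K = steady_state_gain(A, Q, H, R)
print(infinite_horizon_filter(y, A, H, K)[:5])
```

Because the gain and covariances are fixed after the one-off Riccati solve, the per-step work never grows with the length of the stream, which is what makes a real-time, on-device deployment feasible; the paper additionally handles general (non-Gaussian) likelihoods via single-sweep EP and online hyperparameter learning, which this toy example does not cover.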

Cite

Text

Solin et al. "Infinite-Horizon Gaussian Processes." Neural Information Processing Systems, 2018.

Markdown

[Solin et al. "Infinite-Horizon Gaussian Processes." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/solin2018neurips-infinitehorizon/)

BibTeX

@inproceedings{solin2018neurips-infinitehorizon,
  title     = {{Infinite-Horizon Gaussian Processes}},
  author    = {Solin, Arno and Hensman, James and Turner, Richard E},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {3486--3495},
  url       = {https://mlanthology.org/neurips/2018/solin2018neurips-infinitehorizon/}
}