KernelMatmul: Scaling Gaussian Processes to Large Time Series
Abstract
Time series forecasting requires reliable uncertainty estimates. Gaussian process regression provides a powerful framework for modelling such uncertainty in a probabilistic fashion. However, its application to large time series is challenging, due to its cubic time complexity and quadratic memory requirement. In this work, we present KernelMatmul, a novel method that accelerates Gaussian process inference and thus facilitates scaling Gaussian process regression to large, irregularly sampled and multi-output time series. Leveraging conjugate gradients in combination with a sparsity approximation, KernelMatmul achieves time and memory complexity linear in the number of samples. We thoroughly benchmark our new method against multiple baselines to demonstrate its benefits and limitations, both in efficiency and accuracy.
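The abstract only summarises the approach. As a loose illustration (not the authors' KernelMatmul implementation; the toy data, truncated RBF kernel, cutoff and hyperparameters below are assumptions for demonstration), the sketch pairs conjugate gradients with a sparsified kernel matrix, so each solver iteration needs only one sparse matrix-vector product, i.e. time proportional to the number of non-zeros and memory linear in the number of samples.

```python
# Illustrative sketch only: conjugate-gradient GP inference with a sparsified
# kernel matrix. NOT the KernelMatmul method from the paper; data, kernel,
# cutoff and hyperparameters are arbitrary assumptions.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)

# Irregularly sampled 1-D time series (toy data).
n = 2000
t = np.sort(rng.uniform(0.0, 100.0, size=n))
y = np.sin(0.3 * t) + 0.1 * rng.standard_normal(n)

lengthscale, noise, cutoff = 2.0, 0.1, 10.0  # assumed hyperparameters

# Sparse RBF kernel: entries for pairs further apart than `cutoff` are dropped,
# so the number of non-zeros grows linearly in n for a fixed sampling density.
rows, cols, vals = [], [], []
for i in range(n):
    # Points are sorted, so neighbours within the cutoff form a contiguous window.
    j_lo = np.searchsorted(t, t[i] - cutoff, side="left")
    j_hi = np.searchsorted(t, t[i] + cutoff, side="right")
    d = t[i] - t[j_lo:j_hi]
    rows.extend([i] * (j_hi - j_lo))
    cols.extend(range(j_lo, j_hi))
    vals.extend(np.exp(-0.5 * (d / lengthscale) ** 2))
K = sparse.csr_matrix((vals, (rows, cols)), shape=(n, n))

# Solve (K + noise^2 I) alpha = y with conjugate gradients: each iteration
# costs one sparse matrix-vector product instead of a dense O(n^3) Cholesky.
# Note: naive truncation can hurt positive definiteness; here the generous
# cutoff and the noise term keep the system safely positive definite.
A = K + noise**2 * sparse.identity(n, format="csr")
alpha, info = cg(A, y, maxiter=200)

# Posterior mean at the training inputs, as a sanity check.
mean = K @ alpha
print("CG converged:", info == 0, "max residual:", np.abs(A @ alpha - y).max())
```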
Cite
Text
Hoffbauer et al. "KernelMatmul: Scaling Gaussian Processes to Large Time Series." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I16.33893
Markdown
[Hoffbauer et al. "KernelMatmul: Scaling Gaussian Processes to Large Time Series." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/hoffbauer2025aaai-kernelmatmul/) doi:10.1609/AAAI.V39I16.33893
BibTeX
@inproceedings{hoffbauer2025aaai-kernelmatmul,
title = {{KernelMatmul: Scaling Gaussian Processes to Large Time Series}},
author = {Hoffbauer, Tilman and Hoos, Holger H. and Bossek, Jakob},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
pages = {17223--17230},
doi = {10.1609/AAAI.V39I16.33893},
url = {https://mlanthology.org/aaai/2025/hoffbauer2025aaai-kernelmatmul/}
}