Memory-Based Dual Gaussian Processes for Sequential Learning

Abstract

Sequential learning with Gaussian processes (GPs) is challenging when access to past data is limited, for example, in continual and active learning. In such cases, errors can accumulate over time due to inaccuracies in the posterior, hyperparameters, and inducing points, making accurate learning difficult. Here, we present a method to keep all such errors in check using the recently proposed dual sparse variational GP. Our method enables accurate inference for generic likelihoods and improves learning by actively building and updating a memory of past data. We demonstrate its effectiveness in several applications involving Bayesian optimization, active learning, and continual learning.
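
The abstract describes the approach only at a high level. As a rough illustration of the "actively building and updating a memory of past data" idea, here is a minimal Python (NumPy-only) sketch. It is not the paper's dual sparse variational GP: it runs exact GP regression on a fixed-size buffer and, when the buffer overflows, evicts the stored point with the lowest leave-one-out predictive variance, a hypothetical stand-in for the paper's memory-selection criterion. All names here (MemoryGP, rbf, observe) are invented for this sketch.

import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between row-wise inputs.
    sqdist = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

class MemoryGP:
    # Exact GP regression over a fixed-size memory buffer.
    # Illustrative only; not the paper's dual SVGP.
    def __init__(self, capacity=20, noise=0.1):
        self.capacity, self.noise = capacity, noise
        self.X, self.y = np.empty((0, 1)), np.empty((0,))

    def predict(self, Xs):
        # Posterior mean and variance at test inputs Xs (shape [m, 1]).
        if len(self.y) == 0:
            return np.zeros(len(Xs)), np.ones(len(Xs))  # prior (unit kernel variance)
        K = rbf(self.X, self.X) + self.noise ** 2 * np.eye(len(self.y))
        Ks = rbf(Xs, self.X)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, self.y))
        V = np.linalg.solve(L, Ks.T)
        return Ks @ alpha, rbf(Xs, Xs).diagonal() - (V ** 2).sum(0)

    def observe(self, x, y):
        # Append the new point; if over capacity, evict the stored point
        # that the rest of the memory predicts best (lowest leave-one-out
        # variance), i.e. the least informative one. This eviction rule is
        # a hypothetical heuristic, not the paper's criterion.
        self.X = np.vstack([self.X, np.atleast_2d(x)])
        self.y = np.append(self.y, y)
        if len(self.y) <= self.capacity:
            return
        loo_var = []
        for i in range(len(self.y)):
            keep = np.delete(np.arange(len(self.y)), i)
            K = rbf(self.X[keep], self.X[keep]) \
                + self.noise ** 2 * np.eye(len(keep))
            ks = rbf(self.X[i:i + 1], self.X[keep])
            V = np.linalg.solve(np.linalg.cholesky(K), ks.T)
            loo_var.append(1.0 - (V ** 2).sum())  # unit prior variance
        drop = int(np.argmin(loo_var))
        self.X = np.delete(self.X, drop, axis=0)
        self.y = np.delete(self.y, drop)

# Streaming usage: observe points one at a time, then query the posterior.
rng = np.random.default_rng(0)
gp = MemoryGP(capacity=15)
for _ in range(200):
    x = rng.uniform(-3.0, 3.0, size=(1,))
    gp.observe(x, np.sin(x[0]) + 0.1 * rng.standard_normal())
mean, var = gp.predict(np.linspace(-3.0, 3.0, 7)[:, None])

The paper's actual method additionally keeps posterior, hyperparameter, and inducing-point errors in check via the dual parameterization and supports generic (non-Gaussian) likelihoods, none of which this regression-only sketch attempts.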

Cite

Text

Chang et al. "Memory-Based Dual Gaussian Processes for Sequential Learning." International Conference on Machine Learning, 2023.

Markdown

[Chang et al. "Memory-Based Dual Gaussian Processes for Sequential Learning." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/chang2023icml-memorybased/)

BibTeX

@inproceedings{chang2023icml-memorybased,
  title     = {{Memory-Based Dual Gaussian Processes for Sequential Learning}},
  author    = {Chang, Paul Edmund and Verma, Prakhar and John, S. T. and Solin, Arno and Khan, Mohammad Emtiyaz},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {4035--4054},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/chang2023icml-memorybased/}
}