Accelerating Non-Conjugate Gaussian Processes by Trading Off Computation for Uncertainty
Abstract
Non-conjugate Gaussian processes (NCGPs) define a flexible probabilistic framework to model categorical, ordinal, and continuous data, and are widely used in practice. However, exact inference in NCGPs is prohibitively expensive for large datasets, so approximations are required. The approximation error adversely impacts the reliability of the model and is not accounted for in the uncertainty of the prediction. We introduce a family of iterative methods that explicitly model this error. They are uniquely suited to modern parallel hardware, efficiently recycle computations, and compress information to reduce both the time and memory requirements of NCGPs. As we demonstrate on large-scale classification problems, our method significantly accelerates posterior inference compared to competitive baselines by trading off reduced computation for increased uncertainty.
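The trade-off the abstract describes can be illustrated in the simpler conjugate (regression) setting. Below is a minimal NumPy sketch, not the paper's algorithm (which targets non-conjugate likelihoods): a low-rank approximate inverse built from i solver actions replaces the exact kernel solve, and the directions the solver has not yet explored remain as extra predictive variance. The random actions, the rbf kernel, and all names here are illustrative assumptions.

import numpy as np

def rbf(A, B, lengthscale=1.0):
    # Squared-exponential kernel between row-stacked inputs A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))                # training inputs
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)
Xs = np.linspace(-3, 3, 5)[:, None]                  # test inputs

K = rbf(X, X) + 0.1**2 * np.eye(len(X))              # K + sigma^2 I
ks = rbf(X, Xs)                                      # cross-covariances

for i in (5, 20, 80):                                # iterations = compute budget
    S = rng.standard_normal((len(X), i))             # i solver actions (random here)
    # Low-rank approximate inverse C_i = S (S^T K S)^{-1} S^T; C_i -> K^{-1} as i -> n.
    C = S @ np.linalg.solve(S.T @ K @ S, S.T)
    mean = ks.T @ (C @ y)                            # approximate posterior mean
    # Directions not yet explored by the solver stay uncertain, so smaller
    # budgets i leave more residual predictive variance.
    var = np.diag(rbf(Xs, Xs) - ks.T @ C @ ks)
    print(f"i={i:3d}  mean predictive variance={var.mean():.4f}")

Running the sketch, the predictive variance shrinks toward that of the exact GP as i grows: reduced computation is exchanged for increased, but honestly reported, uncertainty.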
Cite
Text
Tatzel et al. "Accelerating Non-Conjugate Gaussian Processes by Trading Off Computation for Uncertainty." Transactions on Machine Learning Research, 2025.
Markdown
[Tatzel et al. "Accelerating Non-Conjugate Gaussian Processes by Trading Off Computation for Uncertainty." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/tatzel2025tmlr-accelerating/)
BibTeX
@article{tatzel2025tmlr-accelerating,
  title   = {{Accelerating Non-Conjugate Gaussian Processes by Trading Off Computation for Uncertainty}},
  author  = {Tatzel, Lukas and Wenger, Jonathan and Schneider, Frank and Hennig, Philipp},
  journal = {Transactions on Machine Learning Research},
  year    = {2025},
  url     = {https://mlanthology.org/tmlr/2025/tatzel2025tmlr-accelerating/}
}