Additive Gaussian Processes Revisited
Abstract
Gaussian Process (GP) models are a class of flexible non-parametric models with rich representational power. By using a Gaussian process with additive structure, complex responses can be modelled whilst retaining interpretability. Previous work showed that additive Gaussian process models require high-dimensional interaction terms. We propose the orthogonal additive kernel (OAK), which imposes an orthogonality constraint on the additive functions, enabling an identifiable, low-dimensional representation of the functional relationship. We connect the OAK kernel to the functional ANOVA decomposition and show improved convergence rates for sparse computation methods. With only a small number of additive low-dimensional terms, we demonstrate that the OAK model achieves similar or better predictive performance compared to black-box models, while retaining interpretability.
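For readers who want the decomposition spelled out: the functional ANOVA view referenced in the abstract expands a function of $D$ inputs into components of increasing order. The notation below is the standard formulation under a product input density $p$, given as a sketch of the setting rather than a transcription of the paper's equations:

$$
f(\mathbf{x}) = f_0 + \sum_{i=1}^{D} f_i(x_i) + \sum_{i<j} f_{ij}(x_i, x_j) + \cdots
$$

with each non-constant component $f_u$ constrained to integrate to zero in each of its arguments,

$$
\int f_u(\mathbf{x}_u)\, p_i(x_i)\, \mathrm{d}x_i = 0 \qquad \text{for all } i \in u.
$$

Under this constraint the components are mutually orthogonal, so the decomposition is identifiable and the variance of $f$ splits additively across terms; imposing the constraint at the kernel level, as OAK does, builds this property into every GP sample.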
Cite
Text
Lu et al. "Additive Gaussian Processes Revisited." International Conference on Machine Learning, 2022.

Markdown

[Lu et al. "Additive Gaussian Processes Revisited." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/lu2022icml-additive/)

BibTeX
@inproceedings{lu2022icml-additive,
  title     = {{Additive Gaussian Processes Revisited}},
  author    = {Lu, Xiaoyu and Boukouvalas, Alexis and Hensman, James},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {14358--14383},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/lu2022icml-additive/}
}