Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition
Abstract
We introduce a new scalable variational Gaussian process method that provides a high-fidelity approximation while retaining general applicability. We propose the harmonic kernel decomposition (HKD), which uses Fourier series to decompose a kernel as a sum of orthogonal kernels. Our variational approximation exploits this orthogonality to enable a large number of inducing points at low computational cost. We demonstrate that, on a range of regression and classification problems, our approach can exploit input-space symmetries such as translations and reflections, and it significantly outperforms standard variational methods in scalability and accuracy. Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
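To make the decomposition concrete, below is a minimal sketch of the cyclic-group form of the HKD, assuming sub-kernels of the form k_j(x, x') = (1/J) Σ_t ω^(jt) k(x, Tᵗx'), where ω is a primitive J-th root of unity and T is an orthogonal map with Tᴶ = I under which the base kernel k is invariant. Since the roots of unity sum to zero except at t = 0, the sub-kernels always sum back to k. The helper names (`rbf`, `hkd_components`) and the reflection example are illustrative, not from the paper.

```python
# Minimal HKD sketch (illustrative, not the authors' code). Assumes the
# cyclic-group form: k_j(x, x') = (1/J) * sum_t omega^(j*t) * k(x, T^t x'),
# with omega a primitive J-th root of unity and T an orthogonal map, T^J = I.
import numpy as np

def rbf(x, y, lengthscale=1.0):
    """Squared-exponential kernel matrix between the rows of x and y."""
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def hkd_components(k, x, y, T, J):
    """Return the J harmonic sub-kernel matrices k_j(x, y)."""
    omega = np.exp(-2j * np.pi / J)  # primitive J-th root of unity
    # k(x, T^t y) for each element of the cyclic group generated by T.
    Ks = [k(x, (np.linalg.matrix_power(T, t) @ y.T).T) for t in range(J)]
    return [sum(omega ** (j * t) * Ks[t] for t in range(J)) / J
            for j in range(J)]

# Example: T = -I on R^2 (a reflection of order J = 2). The RBF kernel is
# invariant under T, and the two components are its even and odd parts.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 2))
K0, K1 = hkd_components(rbf, x, x, T=-np.eye(2), J=2)
assert np.allclose((K0 + K1).real, rbf(x, x))  # components sum back to k
```

For T = −I the two components are simply the even and odd parts of the kernel; the orthogonality of the corresponding RKHSs is what the variational approximation exploits, letting the inducing points attached to different harmonic components be handled independently and so cheaply scaled up.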
Cite
Text
Sun et al. "Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition." International Conference on Machine Learning, 2021.
Markdown
[Sun et al. "Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/sun2021icml-scalable/)
BibTeX
@inproceedings{sun2021icml-scalable,
title = {{Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition}},
author = {Sun, Shengyang and Shi, Jiaxin and Wilson, Andrew Gordon and Grosse, Roger B.},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {9955--9965},
volume = {139},
url = {https://mlanthology.org/icml/2021/sun2021icml-scalable/}
}