Incorporating Dependencies in Spectral Kernels for Gaussian Processes
Abstract
Gaussian processes (GPs) are an elegant Bayesian approach to modelling an unknown function. The choice of kernel characterizes one's assumptions about how the unknown function autocovaries. It is a core aspect of GP design, since the posterior distribution can vary significantly across kernels. The spectral mixture (SM) kernel is derived by modelling a spectral density (the Fourier transform of a kernel) with a linear mixture of Gaussian components. As such, the SM kernel cannot model dependencies between components. In this paper we use cross convolution to model dependencies between components and derive a new kernel called the Generalized Convolution Spectral Mixture (GCSM). Experimental analysis of GCSM on synthetic and real-life datasets indicates the benefit of modelling dependencies between components for reducing uncertainty and for improving performance in extrapolation tasks.
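As a rough illustration of the SM construction the abstract refers to (the baseline kernel, not the paper's GCSM extension), the following sketch evaluates a one-dimensional spectral mixture kernel in its standard closed form, where each Gaussian spectral component with weight w, mean μ, and variance v contributes a term w · exp(−2π²τ²v) · cos(2πτμ). The function name and parameterization here are illustrative choices, not from the paper:

```python
import numpy as np

def sm_kernel(tau, weights, means, variances):
    """Spectral mixture kernel k(tau) for 1-D inputs.

    Each spectral component is a Gaussian with the given weight,
    mean (frequency), and variance; the kernel is the inverse
    Fourier transform of the symmetrized mixture.
    """
    tau = np.asarray(tau, dtype=float)
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        # exp term: length-scale decay; cos term: periodicity at frequency mu
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * tau * mu)
    return k

# At tau = 0 the kernel equals the sum of the mixture weights,
# i.e. the prior variance of the GP.
taus = np.linspace(0.0, 5.0, 200)
values = sm_kernel(taus, weights=[1.0, 0.5], means=[0.1, 1.0], variances=[0.01, 0.1])
```

Because each component is an independent additive term here, the mixture cannot express cross-component structure; GCSM's cross-convolution terms are precisely what this sketch omits.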
Cite
Text
Chen et al. "Incorporating Dependencies in Spectral Kernels for Gaussian Processes." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2019. doi:10.1007/978-3-030-46147-8_34
Markdown
[Chen et al. "Incorporating Dependencies in Spectral Kernels for Gaussian Processes." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2019.](https://mlanthology.org/ecmlpkdd/2019/chen2019ecmlpkdd-incorporating/) doi:10.1007/978-3-030-46147-8_34
BibTeX
@inproceedings{chen2019ecmlpkdd-incorporating,
title = {{Incorporating Dependencies in Spectral Kernels for Gaussian Processes}},
author = {Chen, Kai and van Laarhoven, Twan and Chen, Jinsong and Marchiori, Elena},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2019},
pages = {565--581},
doi = {10.1007/978-3-030-46147-8_34},
url = {https://mlanthology.org/ecmlpkdd/2019/chen2019ecmlpkdd-incorporating/}
}