Extended and Unscented Kitchen Sinks
Abstract
We propose a scalable multiple-output generalization of unscented and extended Gaussian processes. These algorithms were designed to handle general likelihood models by linearizing them, using a Taylor series or the Unscented Transform, within a variational inference framework. We build upon random feature approximations of Gaussian process covariance functions and show that, on small-scale single-task problems, our methods attain performance similar to that of the original algorithms at lower computational cost. We also evaluate our methods at larger scale on MNIST and on a seismic inversion problem, which is inherently multi-task.
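The "kitchen sinks" in the title refers to the random feature approximations the method builds on. As a minimal illustrative sketch (not the paper's algorithm), the random Fourier feature construction of Rahimi and Recht maps inputs to a finite feature space whose inner products approximate an RBF covariance function; all names and parameter values below are assumptions for illustration:

```python
import numpy as np

def random_fourier_features(X, n_features=5000, lengthscale=1.0, seed=0):
    """Map inputs X of shape (n, d) to features Phi of shape (n, n_features)
    such that Phi @ Phi.T approximates the RBF kernel matrix of X."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the RBF kernel's spectral density (Gaussian)
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    # Random phase offsets
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the approximation against the exact RBF kernel on toy data
X = np.random.default_rng(1).normal(size=(50, 3))
Phi = random_fourier_features(X)
K_approx = Phi @ Phi.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)
max_err = np.abs(K_approx - K_exact).max()
```

With enough features, inference can then proceed in the finite-dimensional feature space, which is what makes the approach scalable.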
Cite
Text
Bonilla et al. "Extended and Unscented Kitchen Sinks." International Conference on Machine Learning, 2016.

Markdown

[Bonilla et al. "Extended and Unscented Kitchen Sinks." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/bonilla2016icml-extended/)

BibTeX
@inproceedings{bonilla2016icml-extended,
title = {{Extended and Unscented Kitchen Sinks}},
author = {Bonilla, Edwin and Steinberg, Daniel and Reid, Alistair},
booktitle = {International Conference on Machine Learning},
year = {2016},
pages = {1651--1659},
volume = {48},
url = {https://mlanthology.org/icml/2016/bonilla2016icml-extended/}
}