Semi-Described and Semi-Supervised Learning with Gaussian Processes
Abstract
Propagating input uncertainty through non-linear Gaussian process (GP) mappings is intractable. This hinders the task of training GPs using uncertain and partially observed inputs. In this paper we refer to this task as "semi-described learning". We then introduce a GP framework that solves both the semi-described and the semi-supervised learning problems (where missing values occur in the outputs). Auto-regressive state space simulation is also recognised as a special case of semi-described learning. To achieve our goal we develop variational methods for handling semi-described inputs in GPs, and couple them with algorithms that impute the missing values while treating the uncertainty in a principled, Bayesian manner. Extensive experiments on simulated and real-world data study the problems of iterative forecasting and regression/classification with missing values. The results suggest that the principled propagation of uncertainty stemming from our framework can significantly improve performance in these tasks.
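To make the core difficulty concrete, the sketch below illustrates why propagating input uncertainty through a GP matters, using auto-regressive forecasting (the special case mentioned above). It is not the paper's variational method: instead of the variational approximation, it approximates the predictive distribution at a Gaussian-distributed input by Monte Carlo sampling and moment-matching. All function names (e.g. predict_uncertain_input), kernel settings, and the toy sine series are illustrative assumptions, not part of the paper.

```python
# Minimal sketch (NOT the paper's variational approach): propagate a Gaussian
# input through a GP posterior by Monte Carlo and moment-matching, then use it
# for auto-regressive forecasting so predictive uncertainty is fed back in
# rather than ignored.
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix."""
    d = X1[:, None, :] - X2[None, :, :]
    return variance * np.exp(-0.5 * np.sum(d ** 2, axis=-1) / lengthscale ** 2)

def gp_posterior(X, y, Xstar, noise=1e-2, **kern):
    """Standard GP regression posterior mean/variance at deterministic inputs."""
    K = rbf(X, X, **kern) + noise * np.eye(len(X))
    Ks = rbf(X, Xstar, **kern)
    Kss = rbf(Xstar, Xstar, **kern)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha
    var = np.diag(Kss) - np.sum(v ** 2, axis=0) + noise
    return mean, var

def predict_uncertain_input(X, y, mu_x, var_x, n_samples=500, **kern):
    """Approximate p(y* | x* ~ N(mu_x, var_x)) by sampling the uncertain input
    and moment-matching the resulting Gaussian mixture."""
    xs = np.random.normal(mu_x, np.sqrt(var_x), size=(n_samples, 1))
    m, v = gp_posterior(X, y, xs, **kern)
    mean = m.mean()
    var = (v + m ** 2).mean() - mean ** 2
    return mean, var

# Toy auto-regressive setup: train on pairs (y_{t-1}, y_t), then roll forward,
# feeding each prediction (mean and variance) back in as an uncertain input.
np.random.seed(0)
series = np.sin(0.3 * np.arange(60)) + 0.05 * np.random.randn(60)
X_train, y_train = series[:-1, None], series[1:]

mu, var = series[-1], 0.0
for t in range(20):
    mu, var = predict_uncertain_input(X_train, y_train, mu, max(var, 1e-12))
    print(f"step {t + 1:2d}: mean={mu:+.3f}  std={np.sqrt(var):.3f}")
```

Run over the free-simulation loop, the predictive standard deviation grows with the horizon, which is the behaviour a naive plug-in of point predictions would miss; the paper's contribution is to handle this input uncertainty analytically within the variational framework rather than by sampling.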
Cite
Text
Damianou and Lawrence. "Semi-Described and Semi-Supervised Learning with Gaussian Processes." Conference on Uncertainty in Artificial Intelligence, 2015.
Markdown
[Damianou and Lawrence. "Semi-Described and Semi-Supervised Learning with Gaussian Processes." Conference on Uncertainty in Artificial Intelligence, 2015.](https://mlanthology.org/uai/2015/damianou2015uai-semi/)
BibTeX
@inproceedings{damianou2015uai-semi,
title = {{Semi-Described and Semi-Supervised Learning with Gaussian Processes}},
author = {Damianou, Andreas C. and Lawrence, Neil D.},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2015},
pages = {228--237},
url = {https://mlanthology.org/uai/2015/damianou2015uai-semi/}
}