Generalization Errors and Learning Curves for Regression with Multi-Task Gaussian Processes
Abstract
We provide some insights into how task correlations in multi-task Gaussian process (GP) regression affect the generalization error and the learning curve. We analyze the asymmetric two-task case, where a secondary task is meant to help the learning of a primary task. Within this setting, we give bounds on the generalization error and the learning curve of the primary task. Our analysis admits an intuitive understanding of the multi-task GP by relating it to single-task GPs. For the case of a one-dimensional input space, under optimal sampling with data only for the secondary task, the limitations of the multi-task GP can be quantified explicitly.
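To make the setting concrete, the following is a minimal numerical sketch (not taken from the paper) of the asymmetric two-task GP: the covariance between tasks i and j at inputs x and x' factorizes as K_task[i, j] · k(x, x'), and the posterior variance of the primary task at a test input serves as its generalization error. The squared-exponential kernel, unit signal variances, noise level, and sample sizes below are illustrative assumptions only.

```python
import numpy as np

def rbf(a, b, length_scale=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def primary_task_error(X, tasks, x_star, rho, noise=0.1, length_scale=1.0):
    """Posterior variance (generalization error) of the primary task (task 0)
    at test points x_star, given training inputs X with task labels `tasks`
    (0 = primary, 1 = secondary) and inter-task correlation rho."""
    K_task = np.array([[1.0, rho], [rho, 1.0]])      # task covariance (unit variances)
    Kx = rbf(X, X, length_scale)                      # input covariance
    K = K_task[np.ix_(tasks, tasks)] * Kx             # full training covariance
    K += noise ** 2 * np.eye(len(X))                  # observation noise
    # Cross-covariance between primary-task test points and the training data.
    k_star = K_task[0, tasks][None, :] * rbf(x_star, X, length_scale)
    prior_var = 1.0                                    # k(x*, x*) with unit signal variance
    return prior_var - np.einsum('ij,jk,ik->i', k_star, np.linalg.inv(K), k_star)

# Example: data only for the secondary task; error evaluated on the primary task.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=20)
tasks = np.ones(20, dtype=int)          # every observation comes from the secondary task
x_star = np.linspace(-3, 3, 5)
for rho in (0.0, 0.5, 0.9):
    print(rho, primary_task_error(X, tasks, x_star, rho).round(3))
```

With rho = 0 the secondary-task data carry no information and the error stays at the prior variance; as |rho| grows, the error at the primary task shrinks, which is the kind of dependence on task correlation the paper's bounds characterize.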
Cite
Text
Chai. "Generalization Errors and Learning Curves for Regression with Multi-Task Gaussian Processes." Neural Information Processing Systems, 2009.

Markdown
[Chai. "Generalization Errors and Learning Curves for Regression with Multi-Task Gaussian Processes." Neural Information Processing Systems, 2009.](https://mlanthology.org/neurips/2009/chai2009neurips-generalization/)

BibTeX
@inproceedings{chai2009neurips-generalization,
  title     = {{Generalization Errors and Learning Curves for Regression with Multi-Task Gaussian Processes}},
  author    = {Chai, Kian M.},
  booktitle = {Neural Information Processing Systems},
  year      = {2009},
  pages     = {279--287},
  url       = {https://mlanthology.org/neurips/2009/chai2009neurips-generalization/}
}