Lifelong Learning with Gaussian Processes
Abstract
Recent developments in lifelong machine learning have demonstrated that it is possible to learn multiple tasks consecutively, transferring knowledge between those tasks to accelerate learning and improve performance. However, these methods are limited to using linear parametric base learners, substantially restricting the predictive power of the resulting models. We present a lifelong learning algorithm that can support non-parametric models, focusing on Gaussian processes. To enable efficient online transfer between Gaussian process models, our approach assumes a factorized formulation of the covariance functions, and incrementally learns a shared sparse basis for the models’ parameterizations. We show that this lifelong learning approach is highly computationally efficient, and outperforms existing methods on a variety of data sets.
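The abstract's central idea, a shared sparse basis for the models' parameterizations, can be illustrated with a minimal numerical sketch. This is a hypothetical toy example of the general factorization idea (each task's parameter vector approximated as a shared basis times a sparse task-specific code), not the paper's actual algorithm; all names, dimensions, and the ISTA solver used here are illustrative assumptions.

```python
# Toy sketch of a shared sparse-basis factorization: each task's
# (covariance-function) parameter vector theta_t is approximated as
# L @ s_t, where L is a basis shared across tasks and s_t is a sparse
# task-specific code. Illustrative only; not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

d, k, num_tasks = 10, 4, 5           # parameter dim, basis size, tasks
L = rng.standard_normal((d, k))      # shared basis (fixed in this toy demo)

def sparse_code(theta, L, lam=0.05, iters=300):
    """Fit a sparse code s with theta ~ L @ s via ISTA
    (proximal gradient descent with soft-thresholding)."""
    s = np.zeros(L.shape[1])
    step = 1.0 / np.linalg.norm(L, 2) ** 2   # 1 / Lipschitz constant
    for _ in range(iters):
        grad = L.T @ (L @ s - theta)
        s = s - step * grad
        s = np.sign(s) * np.maximum(np.abs(s) - step * lam, 0.0)
    return s

# Simulate tasks whose true parameters lie near the shared basis.
errors = []
for t in range(num_tasks):
    s_true = np.zeros(k)
    s_true[rng.choice(k, size=2, replace=False)] = rng.standard_normal(2)
    theta_t = L @ s_true + 0.01 * rng.standard_normal(d)
    s_t = sparse_code(theta_t, L)
    rel_err = np.linalg.norm(L @ s_t - theta_t) / np.linalg.norm(theta_t)
    errors.append(rel_err)
    print(f"task {t}: relative reconstruction error {rel_err:.3f}")
```

In the lifelong setting the paper describes, the basis would also be updated incrementally as each new task arrives, so knowledge encoded in `L` transfers between tasks; this sketch only shows the per-task sparse coding step against a fixed basis.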
Cite
Text

Clingerman and Eaton. "Lifelong Learning with Gaussian Processes." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2017. doi:10.1007/978-3-319-71246-8_42

Markdown

[Clingerman and Eaton. "Lifelong Learning with Gaussian Processes." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2017.](https://mlanthology.org/ecmlpkdd/2017/clingerman2017ecmlpkdd-lifelong/) doi:10.1007/978-3-319-71246-8_42

BibTeX
@inproceedings{clingerman2017ecmlpkdd-lifelong,
title = {{Lifelong Learning with Gaussian Processes}},
author = {Clingerman, Christopher and Eaton, Eric},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2017},
pages = {690--704},
doi = {10.1007/978-3-319-71246-8_42},
url = {https://mlanthology.org/ecmlpkdd/2017/clingerman2017ecmlpkdd-lifelong/}
}