Sparse On-Line Gaussian Processes

Abstract

We develop an approach for sparse representations of gaussian process (GP) models (which are Bayesian types of kernel machines) in order to overcome their limitations for large data sets. The method combines a Bayesian on-line algorithm with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model. By using an appealing parameterization and projection techniques in a reproducing kernel Hilbert space, recursions for the effective parameters and a sparse gaussian approximation of the posterior process are obtained. This allows for both a propagation of predictions and Bayesian error measures. The significance and robustness of our approach are demonstrated on a variety of experiments.
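The core idea of the paper, processing data one point at a time while keeping only a small set of "basis vectors" that spans the GP's prediction, can be illustrated with a toy sketch. The code below is a simplified illustration, not the authors' exact recursions: it admits a new input to the basis set only when its novelty γ (squared RKHS distance to the span of the current basis) exceeds a threshold, and otherwise relies on the existing basis. The names `SparseOnlineGP`, `tol`, `noise`, and `max_bv` are illustrative choices, and the refit step naively re-solves the full basis system rather than using the paper's efficient parameter updates.

```python
import numpy as np

def rbf(X, Y, ell=1.0):
    # Squared-exponential kernel matrix between row sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

class SparseOnlineGP:
    """Toy sparse on-line GP regressor (1-D inputs).

    A new input is added to the basis-vector set BV only if its
    novelty gamma exceeds `tol`; this mirrors the spirit, not the
    exact recursions, of the sparse on-line GP algorithm.
    """
    def __init__(self, tol=1e-2, noise=0.1, max_bv=50):
        self.tol, self.noise, self.max_bv = tol, noise, max_bv
        self.BV = np.empty((0, 1))   # basis inputs
        self.y = np.empty((0,))      # targets of admitted points

    def _refit(self):
        # Naive re-solve over the basis set; the paper instead
        # updates the effective parameters recursively.
        K = rbf(self.BV, self.BV) + self.noise * np.eye(len(self.BV))
        self.alpha = np.linalg.solve(K, self.y)

    def update(self, x, t):
        x = np.atleast_2d(x)
        if len(self.BV) == 0:
            gamma = 1.0
        else:
            k = rbf(self.BV, x)[:, 0]
            Kinv_k = np.linalg.solve(
                rbf(self.BV, self.BV) + 1e-10 * np.eye(len(self.BV)), k)
            gamma = float(rbf(x, x)) - k @ Kinv_k  # novelty of x
        if gamma > self.tol and len(self.BV) < self.max_bv:
            self.BV = np.vstack([self.BV, x])
            self.y = np.append(self.y, t)
            self._refit()
        # else: x lies (approximately) in the span of BV; a faithful
        # implementation would project its contribution onto the basis.

    def predict(self, X):
        return rbf(np.atleast_2d(X), self.BV) @ self.alpha

# Usage: fit sin(x) from a 200-point stream with a small basis set.
gp = SparseOnlineGP()
rng = np.random.default_rng(0)
for xi in rng.uniform(-3, 3, 200):
    gp.update(xi, np.sin(xi))
print(len(gp.BV), "basis vectors retained out of 200 points")
```

The basis set stays far smaller than the stream, which is the point of the sparse representation: prediction cost depends on the basis size, not the total number of observations.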

Cite

Text

Csató and Opper. "Sparse On-Line Gaussian Processes." Neural Computation, 2002. doi:10.1162/089976602317250933

Markdown

[Csató and Opper. "Sparse On-Line Gaussian Processes." Neural Computation, 2002.](https://mlanthology.org/neco/2002/csato2002neco-sparse/) doi:10.1162/089976602317250933

BibTeX

@article{csato2002neco-sparse,
  title     = {{Sparse On-Line Gaussian Processes}},
  author    = {Csató, Lehel and Opper, Manfred},
  journal   = {Neural Computation},
  year      = {2002},
  pages     = {641--668},
  doi       = {10.1162/089976602317250933},
  volume    = {14},
  number    = {3},
  url       = {https://mlanthology.org/neco/2002/csato2002neco-sparse/}
}