The Generalized FITC Approximation

Abstract

We present an efficient generalization of the sparse pseudo-input Gaussian process (SPGP) model developed by Snelson and Ghahramani [1], applying it to binary classification problems. By taking advantage of the SPGP prior covariance structure, we derive a numerically stable algorithm with O(NM²) training complexity—asymptotically the same as related sparse methods such as the informative vector machine [2], but which more faithfully represents the posterior. We present experimental results for several benchmark problems showing that in many cases this allows an exceptional degree of sparsity without compromising accuracy. Following [1], we locate pseudo-inputs by gradient ascent on the marginal likelihood, but exhibit occasions when this is likely to fail, for which we suggest alternative solutions.
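The SPGP/FITC prior covariance mentioned in the abstract replaces the full kernel matrix with a rank-M Nyström term plus a diagonal correction that restores the exact marginal variances. A minimal sketch of that construction (using an illustrative squared-exponential kernel; the paper's kernel and pseudo-input placement are not reproduced here):

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # Squared-exponential kernel, an illustrative choice of covariance function.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def fitc_cov(X, Xu, jitter=1e-6):
    """FITC prior covariance over inputs X given M pseudo-inputs Xu:
    a rank-M Nystrom approximation plus a diagonal correction so that
    the marginal (diagonal) variances match the exact kernel."""
    Knm = rbf(X, Xu)                                   # N x M cross-covariance
    Kmm = rbf(Xu, Xu) + jitter * np.eye(len(Xu))       # M x M, jittered for stability
    Qnn = Knm @ np.linalg.solve(Kmm, Knm.T)            # rank-M Nystrom part
    diag_corr = rbf(X, X).diagonal() - Qnn.diagonal()  # exact marginals restored
    return Qnn + np.diag(diag_corr)
```

Because the non-diagonal part has rank M, inference against this covariance can exploit the matrix inversion lemma, which is the source of the O(NM²) training cost (the dense N×N matrix above is formed only for illustration).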

Cite

Text

Naish-Guzman and Holden. "The Generalized FITC Approximation." Neural Information Processing Systems, 2007.

Markdown

[Naish-Guzman and Holden. "The Generalized FITC Approximation." Neural Information Processing Systems, 2007.](https://mlanthology.org/neurips/2007/naishguzman2007neurips-generalized/)

BibTeX

@inproceedings{naishguzman2007neurips-generalized,
  title     = {{The Generalized FITC Approximation}},
  author    = {Naish-Guzman, Andrew and Holden, Sean},
  booktitle = {Neural Information Processing Systems},
  year      = {2007},
  pages     = {1057-1064},
  url       = {https://mlanthology.org/neurips/2007/naishguzman2007neurips-generalized/}
}