The Bayesian Backfitting Relevance Vector Machine
Abstract
Traditional non-parametric statistical learning techniques are often computationally attractive, but lack the same generalization and model selection abilities as state-of-the-art Bayesian algorithms which, however, are usually computationally prohibitive. This paper makes several important contributions that allow Bayesian learning to scale to more complex, real-world learning scenarios. Firstly, we show that backfitting -- a traditional non-parametric, yet highly efficient regression tool -- can be derived in a novel formulation within an expectation maximization (EM) framework and thus can finally be given a probabilistic interpretation. Secondly, we show that the general framework of sparse Bayesian learning, and in particular the relevance vector machine (RVM), can be derived as a highly efficient algorithm using a Bayesian version of backfitting at its core. As we demonstrate on several regression and classification benchmarks, Bayesian backfitting offers a compelling alternative to current regression methods, especially when the size and dimensionality of the data challenge computational resources.
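The classical backfitting procedure that the abstract builds on can be sketched briefly. This is a minimal illustration for an additive linear model, not the paper's probabilistic (EM) or Bayesian variant; the function name `backfit_linear` and the use of per-coordinate linear fits are our own assumptions for the sketch:

```python
import numpy as np

def backfit_linear(X, y, n_iters=200):
    """Classical backfitting for an additive linear model y ≈ sum_d w_d * x_d.

    Each coordinate's weight is refit, in turn, against the partial
    residual that excludes that coordinate's own contribution --
    a Gauss-Seidel-style sweep over the additive components.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        for j in range(d):
            # Partial residual: subtract every component except the j-th.
            r = y - X @ w + X[:, j] * w[j]
            # 1-D least-squares fit of the residual on x_j.
            w[j] = X[:, j] @ r / (X[:, j] @ X[:, j])
    return w
```

For a purely linear model these sweeps converge to the ordinary least-squares solution; the appeal of backfitting, as the abstract notes, is that each per-component fit is cheap, which is what the paper exploits inside its Bayesian formulation.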
Cite
Text
D'Souza et al. "The Bayesian Backfitting Relevance Vector Machine." International Conference on Machine Learning, 2004. doi:10.1145/1015330.1015358
Markdown
[D'Souza et al. "The Bayesian Backfitting Relevance Vector Machine." International Conference on Machine Learning, 2004.](https://mlanthology.org/icml/2004/dapossouza2004icml-bayesian/) doi:10.1145/1015330.1015358
BibTeX
@inproceedings{dapossouza2004icml-bayesian,
title = {{The Bayesian Backfitting Relevance Vector Machine}},
author = {D'Souza, Aaron and Vijayakumar, Sethu and Schaal, Stefan},
booktitle = {International Conference on Machine Learning},
year = {2004},
doi = {10.1145/1015330.1015358},
url = {https://mlanthology.org/icml/2004/dapossouza2004icml-bayesian/}
}