Exploiting Informative Priors for Bayesian Classification and Regression Trees
Abstract
A general method for defining informative priors on statistical models is presented and applied specifically to the space of classification and regression trees. A Bayesian approach to learning such models from data is taken, with the Metropolis-Hastings algorithm used to approximately sample from the posterior. By using only proposal distributions closely tied to the prior, acceptance probabilities are easily computable via marginal likelihood ratios, whatever the prior used. Our approach is empirically tested by varying (i) the data, (ii) the prior, and (iii) the proposal distribution. A comparison with related work is given.
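The abstract's computational point is that when the Metropolis-Hastings proposal is tied to the prior, the prior and proposal terms cancel in the acceptance ratio, leaving only a (marginal) likelihood ratio. The sketch below is a generic, simplified illustration of that cancellation using an independence sampler whose proposal is exactly the prior, on a toy scalar parameter rather than tree models; all names and the Gaussian example are illustrative assumptions, not the paper's implementation.

```python
import math
import random

def mh_prior_proposal(log_lik, sample_prior, n_steps, seed=0):
    """Metropolis-Hastings with the prior as an independence proposal.

    Because candidates are drawn from the prior, the prior and
    proposal densities cancel in the MH ratio, so acceptance depends
    only on the likelihood ratio -- the simplification the paper
    exploits (there, via marginal likelihood ratios over trees).
    """
    rng = random.Random(seed)
    state = sample_prior(rng)
    ll = log_lik(state)
    samples = []
    for _ in range(n_steps):
        cand = sample_prior(rng)        # propose directly from the prior
        ll_cand = log_lik(cand)
        # accept with probability min(1, L(cand) / L(state))
        if math.log(rng.random()) < ll_cand - ll:
            state, ll = cand, ll_cand
        samples.append(state)
    return samples

# Toy example (hypothetical): prior N(0, 1), one observation y = 2
# with unit-variance Gaussian likelihood; the exact posterior is
# N(1, 1/2), so the sample mean should settle near 1.
y = 2.0
samples = mh_prior_proposal(
    log_lik=lambda theta: -0.5 * (y - theta) ** 2,
    sample_prior=lambda rng: rng.gauss(0.0, 1.0),
    n_steps=20000,
)
posterior_mean = sum(samples) / len(samples)
```

Note that the acceptance test never evaluates the prior density itself, so the same sampler works for any prior that can be sampled from, which is what makes the approach prior-agnostic.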
Cite
Text
Angelopoulos and Cussens. "Exploiting Informative Priors for Bayesian Classification and Regression Trees." International Joint Conference on Artificial Intelligence, 2005.
Markdown
[Angelopoulos and Cussens. "Exploiting Informative Priors for Bayesian Classification and Regression Trees." International Joint Conference on Artificial Intelligence, 2005.](https://mlanthology.org/ijcai/2005/angelopoulos2005ijcai-exploiting/)
BibTeX
@inproceedings{angelopoulos2005ijcai-exploiting,
title = {{Exploiting Informative Priors for Bayesian Classification and Regression Trees}},
author = {Angelopoulos, Nicos and Cussens, James},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2005},
pages = {641--646},
url = {https://mlanthology.org/ijcai/2005/angelopoulos2005ijcai-exploiting/}
}