Bayesian Feature Weighting for Unsupervised Learning, with Application to Object Recognition
Abstract
We present a method for variable selection/weighting in an unsupervised learning context using Bayesian shrinkage. The basis for the model is a finite mixture of multivariate Gaussian distributions. We demonstrate how the model parameters and cluster assignments can be computed simultaneously using an efficient EM algorithm. Applying our Bayesian shrinkage model to a complex problem in object recognition (Duygulu, Barnard, de Freitas and Forsyth 2002), we obtain good experimental results.
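The abstract describes MAP-style EM for a finite Gaussian mixture in which feature relevance is controlled by Bayesian shrinkage. The sketch below is not the authors' implementation; it is a minimal illustration, assuming diagonal covariances and a per-feature Gaussian prior that shrinks each component mean toward the global feature mean, so that heavily shrunk features contribute little to the clustering. The function name, the `shrinkage` parameter, and the relevance score are illustrative assumptions.

```python
import numpy as np

def shrinkage_mixture_em(X, K, shrinkage=1.0, n_iter=100, seed=0):
    """MAP-EM for a diagonal-covariance Gaussian mixture with mean shrinkage
    toward the global feature means (an illustrative sketch, not the paper's code)."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    m = X.mean(axis=0)                            # global feature means (shrinkage target)
    mu = X[rng.choice(N, K, replace=False)]       # component means, initialised from data
    var = np.tile(X.var(axis=0) + 1e-6, (K, 1))   # diagonal covariances
    pi = np.full(K, 1.0 / K)                      # mixing proportions

    for _ in range(n_iter):
        # E-step: responsibilities, computed in log space for numerical stability
        log_r = (np.log(pi)[None, :]
                 - 0.5 * np.sum(np.log(2 * np.pi * var), axis=1)[None, :]
                 - 0.5 * np.sum((X[:, None, :] - mu[None, :, :]) ** 2 / var[None, :, :], axis=2))
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step
        Nk = r.sum(axis=0)                        # effective counts per component
        pi = Nk / N
        for k in range(K):
            # MAP update for the mean: responsibility-weighted data pulled toward the
            # global mean m by a Gaussian prior with precision `shrinkage`
            mu[k] = (r[:, k] @ X / var[k] + shrinkage * m) / (Nk[k] / var[k] + shrinkage)
            # ML update for the diagonal variances (floored for numerical safety)
            var[k] = np.maximum(r[:, k] @ (X - mu[k]) ** 2 / Nk[k], 1e-6)

    # Informal feature relevance: spread of the fitted component means per feature;
    # features whose means collapse onto the global mean score near zero
    relevance = mu.var(axis=0)
    return r.argmax(axis=1), mu, relevance
```

Larger values of the hypothetical `shrinkage` parameter pull all component means toward the global mean along every feature, effectively down-weighting features that do not separate the clusters; the paper's actual prior and inference scheme may differ.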
Cite
Text
Gustafson et al. "Bayesian Feature Weighting for Unsupervised Learning, with Application to Object Recognition." Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics, 2003.

Markdown
[Gustafson et al. "Bayesian Feature Weighting for Unsupervised Learning, with Application to Object Recognition." Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics, 2003.](https://mlanthology.org/aistats/2003/gustafson2003aistats-bayesian/)

BibTeX
@inproceedings{gustafson2003aistats-bayesian,
title = {{Bayesian Feature Weighting for Unsupervised Learning, with Application to Object Recognition}},
author = {Gustafson, Paul and Carbonetto, Peter and Thompson, Natalie and de Freitas, Nando},
booktitle = {Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics},
year = {2003},
pages = {124-131},
volume = {R4},
url = {https://mlanthology.org/aistats/2003/gustafson2003aistats-bayesian/}
}