Locally Adaptive Classification Piloted by Uncertainty
Abstract
Locally adaptive classifiers are usually superior to a single global classifier. However, designing locally adaptive classifiers raises two major problems: first, where to place the local classifiers, and second, how to combine them. In this paper, instead of placing the classifiers based on the data distribution alone, we propose a responsibility mixture model that uses the uncertainty associated with the classification of each training sample. Using this model, the local classifiers are placed near the decision boundary, where they are most effective. A set of local classifiers is then learned to form a global classifier by maximizing an estimate of the probability that the samples will be correctly classified with a nearest neighbor classifier. Experimental results on both artificial and real-world data sets demonstrate the method's superiority over traditional algorithms.
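The core idea of placing local classifiers near the decision boundary can be illustrated with a simple sketch. This is not the paper's responsibility mixture model; it is a minimal stand-in that estimates per-sample classification uncertainty as the label entropy of each point's k nearest neighbors, then places local classifiers at the most uncertain (boundary) samples. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def knn_label_entropy(X, y, k=5):
    """Illustrative uncertainty estimate (not the paper's model):
    Shannon entropy of the class labels among each point's k nearest
    neighbors. High entropy = mixed neighborhood = near the boundary."""
    n = len(X)
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # exclude each point itself
    nn = np.argsort(d2, axis=1)[:, :k]      # indices of k nearest neighbors
    ent = np.empty(n)
    for i in range(n):
        _, counts = np.unique(y[nn[i]], return_counts=True)
        p = counts / k
        ent[i] = -(p * np.log(p)).sum()     # entropy of neighbor labels
    return ent

# Two overlapping Gaussian classes in 2D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

u = knn_label_entropy(X, y, k=5)
# Place 3 local classifiers at the most uncertain samples,
# i.e. those lying closest to the decision boundary.
centers = X[np.argsort(u)[-3:]]
```

Points deep inside either Gaussian have pure neighborhoods (zero entropy), so the selected centers cluster along the overlap region between the two classes, which is where a local classifier has the most to contribute.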
Cite
Text
Dai et al. "Locally Adaptive Classification Piloted by Uncertainty." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143873
Markdown
[Dai et al. "Locally Adaptive Classification Piloted by Uncertainty." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/dai2006icml-locally/) doi:10.1145/1143844.1143873
BibTeX
@inproceedings{dai2006icml-locally,
title = {{Locally Adaptive Classification Piloted by Uncertainty}},
author = {Dai, Juan and Yan, Shuicheng and Tang, Xiaoou and Kwok, James T.},
booktitle = {International Conference on Machine Learning},
year = {2006},
pages = {225-232},
doi = {10.1145/1143844.1143873},
url = {https://mlanthology.org/icml/2006/dai2006icml-locally/}
}