Efficient Learning of Naive Bayes Classifiers Under Class-Conditional Classification Noise
Abstract
We address the problem of efficiently learning Naive Bayes classifiers under class-conditional classification noise (CCCN). Naive Bayes classifiers rely on the assumption that the distribution associated with each class is a product distribution. When data are subject to CCC-noise, these class-conditional distributions become mixtures of product distributions. We give analytical formulas that make it possible to identify them from data subject to CCCN. We then design a learning algorithm, based on these formulas, that learns Naive Bayes classifiers under CCCN. We present results on artificial datasets and on datasets from the UCI repository. These results show that CCCN can be handled efficiently and successfully.
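The CCCN noise model described in the abstract flips each example's label with a probability that depends only on its true class, independently of the features. A minimal sketch of this noise process, assuming binary labels and hypothetical per-class flip rates (the function `apply_cccn` and the toy data are illustrative, not taken from the paper):

```python
import random

def apply_cccn(samples, flip_rates, seed=0):
    """Apply class-conditional classification noise (CCCN):
    each binary label y is flipped with probability flip_rates[y],
    independently of the features x."""
    rng = random.Random(seed)
    noisy = []
    for x, y in samples:
        if rng.random() < flip_rates[y]:
            y = 1 - y  # flip the binary label
        noisy.append((x, y))
    return noisy

# Hypothetical toy data: one binary feature, labels in {0, 1}.
clean = [((i % 2,), i % 2) for i in range(1000)]
# Class 0 labels flip with probability 0.1, class 1 with probability 0.3.
noisy = apply_cccn(clean, flip_rates={0: 0.1, 1: 0.3})
```

Under this noise, the observed distribution of examples labeled 1 is a mixture of the true class-1 product distribution and the (flipped-in) class-0 product distribution, which is exactly the structure the paper's identification formulas exploit.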
Cite
Text
Denis et al. "Efficient Learning of Naive Bayes Classifiers Under Class-Conditional Classification Noise." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143878
Markdown
[Denis et al. "Efficient Learning of Naive Bayes Classifiers Under Class-Conditional Classification Noise." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/denis2006icml-efficient/) doi:10.1145/1143844.1143878
BibTeX
@inproceedings{denis2006icml-efficient,
title = {{Efficient Learning of Naive Bayes Classifiers Under Class-Conditional Classification Noise}},
author = {Denis, François and Magnan, Christophe Nicolas and Ralaivola, Liva},
booktitle = {International Conference on Machine Learning},
year = {2006},
pages = {265-272},
doi = {10.1145/1143844.1143878},
url = {https://mlanthology.org/icml/2006/denis2006icml-efficient/}
}