Learning When Negative Examples Abound
Abstract
Existing concept learning systems can fail when negative examples heavily outnumber positive examples. The paper discusses one essential difficulty caused by imbalanced training sets and presents a learning algorithm that addresses it. The experiments, on synthetic and real-world data, focus on two-class problems whose examples are described by binary and continuous attributes.
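A minimal sketch of why imbalance is troublesome: with skewed classes, plain accuracy rewards a classifier that ignores the minority (positive) class entirely. The geometric mean of the per-class accuracies, a metric associated with this line of work by Kubat and colleagues, exposes the failure. The class counts below are invented for illustration and are not from the paper.

```python
import math

# Hypothetical imbalanced test set: 10 positives, 990 negatives.
# A trivial "always predict negative" classifier:
tp, fn = 0, 10    # misses every positive example
tn, fp = 990, 0   # gets every negative example right

accuracy = (tp + tn) / (tp + tn + fp + fn)     # dominated by the majority class
sensitivity = tp / (tp + fn)                   # accuracy on positives
specificity = tn / (tn + fp)                   # accuracy on negatives

# Geometric mean of per-class accuracies: zero whenever one class is ignored.
g_mean = math.sqrt(sensitivity * specificity)

print(accuracy, sensitivity, g_mean)           # high accuracy, useless classifier
```

Accuracy comes out at 0.99 while the geometric mean is 0, which is the gap an imbalance-aware learner must close.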
Cite
Text
Kubat et al. "Learning When Negative Examples Abound." European Conference on Machine Learning, 1997. doi:10.1007/3-540-62858-4_79
Markdown
[Kubat et al. "Learning When Negative Examples Abound." European Conference on Machine Learning, 1997.](https://mlanthology.org/ecmlpkdd/1997/kubat1997ecml-learning/) doi:10.1007/3-540-62858-4_79
BibTeX
@inproceedings{kubat1997ecml-learning,
title = {{Learning When Negative Examples Abound}},
author = {Kubat, Miroslav and Holte, Robert C. and Matwin, Stan},
booktitle = {European Conference on Machine Learning},
year = {1997},
pages = {146--153},
doi = {10.1007/3-540-62858-4_79},
url = {https://mlanthology.org/ecmlpkdd/1997/kubat1997ecml-learning/}
}