Symbolic Nearest Mean Classifiers
Abstract
The minimum-distance classifier summarizes each class with a prototype and then uses a nearest neighbor approach for classification. Three drawbacks of the minimum-distance classifier are its inability to work with symbolic attributes, to weigh attributes, and to learn more than a single prototype per class. The proposed solutions to these problems include defining the mean for symbolic attributes, providing a weighting metric, and learning several possible prototypes for each class. The learning algorithm developed to tackle these problems, SNMC, increases classification accuracy by 10% over the original minimum-distance classifier and has a higher average generalization accuracy than both C4.5 and PEBLS on 20 domains from the UCI data repository.

Introduction
The instance-based (Aha, Kibler, & Albert, 1991) or nearest neighbor learning method (Duda & Hart, 1973) is a traditional statistical pattern recognition method for classifying unseen examples. These methods store the training ...
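The abstract's core idea, a nearest-mean classifier extended to symbolic attributes, can be sketched as follows. This is a minimal illustration, not the paper's exact SNMC algorithm: it represents each class "mean" as per-attribute value frequencies and omits attribute weighting and multiple prototypes per class, both of which the paper adds.

```python
from collections import Counter, defaultdict

def train_prototypes(examples, labels):
    """Build one prototype per class: for each symbolic attribute,
    the relative frequency of each value among that class's examples.
    (A stand-in for the paper's symbolic 'mean'.)"""
    by_class = defaultdict(list)
    for x, y in zip(examples, labels):
        by_class[y].append(x)
    prototypes = {}
    for y, xs in by_class.items():
        proto = []
        for a in range(len(xs[0])):
            counts = Counter(x[a] for x in xs)
            total = sum(counts.values())
            proto.append({v: c / total for v, c in counts.items()})
        prototypes[y] = proto
    return prototypes

def distance(x, proto):
    """Distance from an instance to a class prototype: an attribute
    contributes little when its value is common in that class."""
    return sum(1.0 - freqs.get(v, 0.0) for v, freqs in zip(x, proto))

def classify(x, prototypes):
    """Assign the class whose prototype is nearest (minimum-distance rule)."""
    return min(prototypes, key=lambda y: distance(x, prototypes[y]))
```

For example, training on weather-like symbolic instances and classifying an unseen combination picks the class whose attribute-value distribution best matches the instance.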
Cite
Text
Datta and Kibler. "Symbolic Nearest Mean Classifiers." AAAI Conference on Artificial Intelligence, 1997.

Markdown
[Datta and Kibler. "Symbolic Nearest Mean Classifiers." AAAI Conference on Artificial Intelligence, 1997.](https://mlanthology.org/aaai/1997/datta1997aaai-symbolic/)

BibTeX
@inproceedings{datta1997aaai-symbolic,
  title     = {{Symbolic Nearest Mean Classifiers}},
  author    = {Datta, Piew and Kibler, Dennis F.},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {1997},
  pages     = {82--87},
  url       = {https://mlanthology.org/aaai/1997/datta1997aaai-symbolic/}
}