A Hybrid Nearest-Neighbor and Nearest-Hyperrectangle Algorithm

Abstract

Algorithms based on Nested Generalized Exemplar (NGE) theory [10] classify new data points by computing their distance to the nearest "generalized exemplar" (i.e., an axis-parallel hyperrectangle). An improved version of NGE, called BNGE, was previously shown to perform comparably to the Nearest Neighbor algorithm. Advantages of the NGE approach include a compact representation of the training data and fast training and classification. A hybrid method that combines BNGE and the k-Nearest Neighbor algorithm, called KBNGE, is introduced for improved classification accuracy. Results from eleven domains show that KBNGE achieves generalization accuracies similar to the k-Nearest Neighbor algorithm with improved classification speed. KBNGE is a fast, easy-to-use inductive learning algorithm that gives very accurate predictions in a variety of domains and represents the learned knowledge in a manner that can be easily interpreted by the user.
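The core operation the abstract describes, classifying a query point by its distance to the nearest axis-parallel hyperrectangle, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names and the `(lower, upper, label)` exemplar representation are assumptions, and the published algorithms additionally use feature and exemplar weighting that is omitted here.

```python
import math

def point_to_hyperrectangle_distance(x, lower, upper):
    """Euclidean distance from point x to the axis-parallel
    hyperrectangle [lower, upper]; zero if x lies inside it."""
    total = 0.0
    for xi, lo, hi in zip(x, lower, upper):
        # Per-dimension gap: 0 when xi falls within [lo, hi],
        # otherwise the distance to the nearer face.
        gap = max(lo - xi, 0.0, xi - hi)
        total += gap * gap
    return math.sqrt(total)

def classify(x, exemplars):
    """Predict the label of the nearest generalized exemplar.

    `exemplars` is a list of (lower, upper, label) triples; a point
    exemplar is simply a degenerate rectangle with lower == upper.
    """
    nearest = min(
        exemplars,
        key=lambda e: point_to_hyperrectangle_distance(x, e[0], e[1]),
    )
    return nearest[2]
```

Because a single rectangle can generalize many training points, the exemplar list is typically much smaller than the training set, which is the source of the compact representation and fast classification mentioned above.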

Cite

Text

Wettschereck. "A Hybrid Nearest-Neighbor and Nearest-Hyperrectangle Algorithm." European Conference on Machine Learning, 1994. doi:10.1007/3-540-57868-4_67

Markdown

[Wettschereck. "A Hybrid Nearest-Neighbor and Nearest-Hyperrectangle Algorithm." European Conference on Machine Learning, 1994.](https://mlanthology.org/ecmlpkdd/1994/wettschereck1994ecml-hybrid/) doi:10.1007/3-540-57868-4_67

BibTeX

@inproceedings{wettschereck1994ecml-hybrid,
  title     = {{A Hybrid Nearest-Neighbor and Nearest-Hyperrectangle Algorithm}},
  author    = {Wettschereck, Dietrich},
  booktitle = {European Conference on Machine Learning},
  year      = {1994},
  pages     = {323--335},
  doi       = {10.1007/3-540-57868-4_67},
  url       = {https://mlanthology.org/ecmlpkdd/1994/wettschereck1994ecml-hybrid/}
}