An Adaptive Metric Machine for Pattern Classification

Abstract

Nearest neighbor classification assumes locally constant class-conditional probabilities. With finite samples in high dimensions, this assumption breaks down due to the curse of dimensionality, and the nearest neighbor rule can suffer severe bias. We propose a locally adaptive nearest neighbor classification method that aims to minimize this bias. We use a Chi-squared distance analysis to compute a flexible metric that produces neighborhoods elongated along less relevant feature dimensions and constricted along the most influential ones. As a result, the class-conditional probabilities tend to be smoother in the modified neighborhoods, whereby better classification performance can be achieved. The efficacy of our method is validated and compared against other techniques using a variety of real-world data.
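The general idea of weighting feature dimensions so that neighborhoods stretch along irrelevant axes can be sketched as follows. This is a schematic illustration only, not the paper's Chi-squared estimation procedure: the weight vector `w` here is a hypothetical hand-chosen example, whereas the paper derives such weights locally from the data.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, w, k=3):
    """Classify x by majority vote among its k nearest neighbors
    under a weighted squared Euclidean metric."""
    # Larger w_j makes dimension j count more, constricting the
    # neighborhood along influential features and elongating it
    # along less relevant ones.
    d2 = ((X_train - x) ** 2 * w).sum(axis=1)
    nn = np.argsort(d2)[:k]
    labels, counts = np.unique(y_train[nn], return_counts=True)
    return labels[np.argmax(counts)]

# Toy data: class is determined by feature 0; feature 1 is noise.
X = np.array([[0.0, 0.0], [0.1, 5.0], [1.0, 0.1], [0.9, 4.0]])
y = np.array([0, 0, 1, 1])
w = np.array([1.0, 0.01])  # hypothetical weights down-weighting the noisy axis
pred = weighted_knn_predict(X, y, np.array([0.05, 4.5]), w, k=1)  # -> 0
```

With uniform weights the noisy second feature would dominate the distance and pull in neighbors of the wrong class; down-weighting it recovers the correct label.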

Cite

Text

Domeniconi et al. "An Adaptive Metric Machine for Pattern Classification." Neural Information Processing Systems, 2000.

Markdown

[Domeniconi et al. "An Adaptive Metric Machine for Pattern Classification." Neural Information Processing Systems, 2000.](https://mlanthology.org/neurips/2000/domeniconi2000neurips-adaptive/)

BibTeX

@inproceedings{domeniconi2000neurips-adaptive,
  title     = {{An Adaptive Metric Machine for Pattern Classification}},
  author    = {Domeniconi, Carlotta and Peng, Jing and Gunopulos, Dimitrios},
  booktitle = {Neural Information Processing Systems},
  year      = {2000},
  pages     = {458--464},
  url       = {https://mlanthology.org/neurips/2000/domeniconi2000neurips-adaptive/}
}