Efficient Classification with Adaptive KNN

Abstract

In this paper, we propose an adaptive kNN method for classification, in which a different value of k is selected for each test sample. Our selection rule is easy to implement: it is fully adaptive and requires no knowledge of the underlying distribution. The convergence rate of the risk of this classifier to the Bayes risk is shown to be minimax optimal in various settings. Moreover, under some special assumptions, the convergence rate is especially fast and does not decay with increasing dimensionality.
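
To make the idea concrete, the Python sketch below shows one way an adaptive kNN classifier can choose k separately for each test point: the neighborhood is grown until the class vote becomes sufficiently one-sided. The stopping rule used here (vote margin exceeding a multiple of sqrt(k)) and the parameters `k_max` and `margin` are illustrative assumptions for this page, not the selection rule proposed and analyzed by Zhao and Lai.

```python
# Illustrative sketch of an adaptive kNN classifier for binary labels in {0, 1}.
# NOTE: the stopping rule (grow k until |vote margin| >= margin * sqrt(k)) is a
# hypothetical stand-in, not the selection rule from the paper.
import numpy as np

class AdaptiveKNN:
    def __init__(self, k_max=50, margin=1.0):
        self.k_max = k_max      # largest k considered for any test point
        self.margin = margin    # confidence threshold for stopping early

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y, dtype=int)
        return self

    def predict_one(self, x):
        # Sort training points by distance to the test point x.
        dists = np.linalg.norm(self.X - x, axis=1)
        order = np.argsort(dists)
        # Grow k until the cumulative vote is confidently one-sided.
        votes = 0
        for k in range(1, min(self.k_max, len(order)) + 1):
            votes += 1 if self.y[order[k - 1]] == 1 else -1
            if abs(votes) >= self.margin * np.sqrt(k):
                break
        return 1 if votes > 0 else 0

    def predict(self, X):
        return np.array([self.predict_one(x) for x in np.asarray(X, dtype=float)])
```

With a small `margin`, most test points stop at a small k; points near the decision boundary keep growing their neighborhood, which is the intuition behind letting k vary per sample.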

Cite

Text

Zhao and Lai. "Efficient Classification with Adaptive KNN." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I12.17314

Markdown

[Zhao and Lai. "Efficient Classification with Adaptive KNN." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/zhao2021aaai-efficient/) doi:10.1609/AAAI.V35I12.17314

BibTeX

@inproceedings{zhao2021aaai-efficient,
  title     = {{Efficient Classification with Adaptive KNN}},
  author    = {Zhao, Puning and Lai, Lifeng},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {11007--11014},
  doi       = {10.1609/AAAI.V35I12.17314},
  url       = {https://mlanthology.org/aaai/2021/zhao2021aaai-efficient/}
}