Fast Classification with Binary Prototypes
Abstract
In this work, we propose a new technique for fast k-nearest neighbor (k-NN) classification in which the original database is represented by a small set of learned binary prototypes. The training phase simultaneously learns a hash function, which maps the data points to binary codes, and a set of representative binary prototypes. In the prediction phase, we first hash the query into a binary code and then perform k-NN classification using the binary prototypes as the database. Our approach speeds up k-NN classification in two ways. First, we compress the database into a smaller set of prototypes, so that the k-NN search goes through this smaller set rather than the whole dataset. Second, we reduce the original space to a compact binary embedding, where the Hamming distance between two binary codes is very efficient to compute. We propose a formulation that learns the hash function and prototypes such that the classification error is minimized. We also provide a novel theoretical analysis of the proposed technique in terms of Bayes error consistency. Empirically, our method is much faster than state-of-the-art k-NN compression methods, with comparable accuracy.
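The prediction phase described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's method: the paper jointly learns the hash function and prototypes to minimize classification error, whereas here we simply assume a given linear projection matrix `W` (e.g. random hyperplanes as a stand-in) and a given set of binary prototypes with labels.

```python
import numpy as np

def hash_to_binary(X, W):
    """Map real-valued points to binary codes via the sign of linear projections.

    W is a (d, b) projection matrix. The paper *learns* the hash function;
    here it is assumed given (random hyperplanes serve as a placeholder).
    """
    return (X @ W > 0).astype(np.uint8)

def hamming_knn_predict(query_code, proto_codes, proto_labels, k=3):
    """Classify a query by k-NN over binary prototypes in Hamming space."""
    # Hamming distance = number of differing bits between codes.
    dists = np.count_nonzero(proto_codes != query_code, axis=1)
    nearest = np.argsort(dists)[:k]
    # Majority vote among the k nearest prototypes.
    vals, counts = np.unique(proto_labels[nearest], return_counts=True)
    return vals[np.argmax(counts)]

# Toy usage: 2-D data hashed to 8-bit codes, four labeled prototypes.
rng = np.random.default_rng(0)
W = rng.standard_normal((2, 8))
protos = np.array([[1.0, 1.0], [-1.0, -1.0], [1.0, -1.0], [-1.0, 1.0]])
labels = np.array([0, 1, 0, 1])
proto_codes = hash_to_binary(protos, W)
query_code = hash_to_binary(np.array([[0.9, 1.1]]), W)[0]
pred = hamming_knn_predict(query_code, proto_codes, labels, k=1)
```

In a real deployment the Hamming distances would be computed with bit-packed codes and XOR/popcount instructions, which is what makes the binary embedding so cheap at query time; the element-wise comparison above keeps the sketch readable.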
Cite
Text
Zhong et al. "Fast Classification with Binary Prototypes." International Conference on Artificial Intelligence and Statistics, 2017.

Markdown

[Zhong et al. "Fast Classification with Binary Prototypes." International Conference on Artificial Intelligence and Statistics, 2017.](https://mlanthology.org/aistats/2017/zhong2017aistats-fast/)

BibTeX
@inproceedings{zhong2017aistats-fast,
title = {{Fast Classification with Binary Prototypes}},
author = {Zhong, Kai and Guo, Ruiqi and Kumar, Sanjiv and Yan, Bowei and Simcha, David and Dhillon, Inderjit S.},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2017},
pages = {1255--1263},
url = {https://mlanthology.org/aistats/2017/zhong2017aistats-fast/}
}