Locally Optimized Product Quantization for Approximate Nearest Neighbor Search

Abstract

We present a simple vector quantizer that combines low distortion with fast search and apply it to approximate nearest neighbor (ANN) search in high-dimensional spaces. Leveraging the very same data structure that is used to provide non-exhaustive search, i.e., inverted lists or a multi-index, the idea is to locally optimize an individual product quantizer (PQ) per cell and use it to encode residuals. Local optimization is over rotation and space decomposition; interestingly, we apply a parametric solution that assumes a normal distribution and is extremely fast to train. With a reasonable space and time overhead that is constant in the data size, we set a new state of the art on several public datasets, including a billion-scale one.
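As a rough illustration of the idea in the abstract (not the authors' implementation), the pipeline can be sketched in NumPy: a coarse k-means quantizer defines cells, and each cell gets its own rotation and product-quantizer codebooks trained on that cell's residuals. The rotation here is a crude stand-in, taken from the eigenvectors of the residual covariance rather than the paper's parametric eigenvalue-allocation step, and all function names are illustrative:

```python
import numpy as np

def kmeans(X, k, iters=20, rng=None):
    """Plain Lloyd's k-means; returns (centroids, assignments)."""
    rng = rng or np.random.default_rng(0)
    C = X[rng.choice(len(X), size=k, replace=False)].copy()
    a = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        a = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(a == j):
                C[j] = X[a == j].mean(0)
    return C, a

def train_lopq(X, num_cells=4, m=2, k_sub=16, rng=None):
    """Coarse cells; per cell, a rotation and m PQ sub-codebooks on residuals."""
    rng = rng or np.random.default_rng(0)
    d = X.shape[1]
    cells, assign = kmeans(X, num_cells, rng=rng)
    rotations, codebooks = [], []
    for c in range(num_cells):
        res = X[assign == c] - cells[c]          # residuals of this cell
        # Per-cell rotation: eigenvectors of the residual covariance
        # (a simplified stand-in for the paper's parametric optimization).
        cov = np.cov(res, rowvar=False) + 1e-9 * np.eye(d)
        _, R = np.linalg.eigh(cov)               # orthonormal columns
        rot = res @ R
        books = []
        for s in range(m):                       # independent sub-codebooks
            sub = rot[:, s * d // m:(s + 1) * d // m]
            Cb, _ = kmeans(sub, min(k_sub, len(sub)), rng=rng)
            books.append(Cb)
        rotations.append(R)
        codebooks.append(books)
    return cells, rotations, codebooks

def encode(x, cells, rotations, codebooks):
    c = ((cells - x) ** 2).sum(1).argmin()       # nearest coarse cell
    r = (x - cells[c]) @ rotations[c]            # rotated residual
    d, m = len(x), len(codebooks[c])
    return c, [((Cb - r[s * d // m:(s + 1) * d // m]) ** 2).sum(1).argmin()
               for s, Cb in enumerate(codebooks[c])]

def decode(c, codes, cells, rotations, codebooks):
    r = np.concatenate([codebooks[c][s][j] for s, j in enumerate(codes)])
    return cells[c] + r @ rotations[c].T         # undo rotation, add centroid
```

On average, decoding an encoded vector should land closer to the original than the coarse cell centroid alone, since the per-cell PQ spends its codes on the residual distribution of that cell; the paper's local optimization of rotation and decomposition is what makes this per-cell fit cheap and effective at scale.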

Cite

Text

Kalantidis and Avrithis. "Locally Optimized Product Quantization for Approximate Nearest Neighbor Search." Conference on Computer Vision and Pattern Recognition, 2014. doi:10.1109/CVPR.2014.298

Markdown

[Kalantidis and Avrithis. "Locally Optimized Product Quantization for Approximate Nearest Neighbor Search." Conference on Computer Vision and Pattern Recognition, 2014.](https://mlanthology.org/cvpr/2014/kalantidis2014cvpr-locally/) doi:10.1109/CVPR.2014.298

BibTeX

@inproceedings{kalantidis2014cvpr-locally,
  title     = {{Locally Optimized Product Quantization for Approximate Nearest Neighbor Search}},
  author    = {Kalantidis, Yannis and Avrithis, Yannis},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2014},
  doi       = {10.1109/CVPR.2014.298},
  url       = {https://mlanthology.org/cvpr/2014/kalantidis2014cvpr-locally/}
}