Tree Quantization for Large-Scale Similarity Search and Classification
Abstract
We propose a new vector encoding scheme (tree quantization) that obtains lossy compact codes for high-dimensional vectors via tree-based dynamic programming. As in several previous schemes such as product quantization, these codes correspond to codeword numbers within multiple codebooks. We propose an integer programming-based optimization that jointly recovers the coding tree structure and the codebooks by minimizing the compression error on a training dataset. In experiments with diverse visual descriptors (SIFT, neural codes, Fisher vectors), tree quantization is shown to combine fast encoding with state-of-the-art accuracy in terms of compression error, retrieval performance, and image classification error.
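To make the multi-codebook coding idea in the abstract concrete, here is a minimal toy sketch in the style of the product quantization baseline the paper builds on: a vector is split into sub-vectors, and each is encoded as the index of its nearest codeword in a per-subspace codebook. This illustrates only the baseline coding scheme; tree quantization replaces the independent-subspace assumption with a jointly learned coding tree, which this sketch does not implement. All names and sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

D, M, K = 8, 2, 4           # vector dimension, number of codebooks, codewords per codebook
d = D // M                  # sub-vector dimension

# Stand-in for learned codebooks; in practice these are fit on training data
# (e.g., by k-means per subspace in product quantization).
codebooks = rng.normal(size=(M, K, d))

def encode(x):
    """Return M codeword indices, one per sub-vector (the compact code)."""
    codes = []
    for m in range(M):
        sub = x[m * d:(m + 1) * d]
        dists = np.linalg.norm(codebooks[m] - sub, axis=1)
        codes.append(int(np.argmin(dists)))
    return codes

def decode(codes):
    """Reconstruct an approximation by concatenating the chosen codewords."""
    return np.concatenate([codebooks[m][c] for m, c in enumerate(codes)])

x = rng.normal(size=D)
codes = encode(x)           # M small integers instead of D floats
x_hat = decode(codes)       # lossy reconstruction
err = np.linalg.norm(x - x_hat)
```

The compression error minimized during training corresponds to `err` averaged over a training set; the paper's contribution is choosing the codebook structure (a tree over dimensions) jointly with the codebooks to reduce it.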
Cite
Text
Babenko and Lempitsky. "Tree Quantization for Large-Scale Similarity Search and Classification." Conference on Computer Vision and Pattern Recognition, 2015. doi:10.1109/CVPR.2015.7299052
Markdown
[Babenko and Lempitsky. "Tree Quantization for Large-Scale Similarity Search and Classification." Conference on Computer Vision and Pattern Recognition, 2015.](https://mlanthology.org/cvpr/2015/babenko2015cvpr-tree/) doi:10.1109/CVPR.2015.7299052
BibTeX
@inproceedings{babenko2015cvpr-tree,
title = {{Tree Quantization for Large-Scale Similarity Search and Classification}},
author = {Babenko, Artem and Lempitsky, Victor},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2015},
doi = {10.1109/CVPR.2015.7299052},
url = {https://mlanthology.org/cvpr/2015/babenko2015cvpr-tree/}
}