Robust Iterative Quantization for Efficient ℓp-Norm Similarity Search
Abstract
Iterative Quantization (ITQ) has been one of the most successful hashing-based nearest-neighbor search methods for large-scale information retrieval in the past few years, owing to its simplicity and superior performance. However, its performance degrades significantly on noisy data. In addition, its applicability is limited because the distortion measure is restricted to the ℓ2 norm. In this paper, we propose an ITQ+ algorithm that enhances both the robustness and the generality of the original ITQ algorithm. Specifically, an ℓp,q-norm loss function is proposed to conduct ℓp-norm similarity search rather than ℓ2-norm search. Although switching to an ℓp,q-norm loss makes the algorithm more robust and more general, it also poses the challenge of minimizing the resulting orthogonality-constrained ℓp,q-norm objective, which is non-smooth and non-convex. To solve this problem, we propose a novel and efficient optimization scheme. Extensive experiments on benchmark datasets demonstrate that ITQ+ substantially outperforms the original ITQ algorithm, especially when searching for similar items in noisy data.
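For context, the baseline the paper builds on is the original ITQ of Gong and Lazebnik: center and PCA-project the data, then alternate between fixing a rotation R and taking binary codes B = sign(VR), and fixing B and solving an orthogonal Procrustes problem for R to minimize the ℓ2 quantization error ‖B − VR‖². A minimal NumPy sketch of that baseline (not the proposed ℓp,q-norm ITQ+, whose optimization scheme is given in the paper) might look like:

```python
import numpy as np

def itq(X, n_bits=8, n_iter=20, seed=0):
    """Sketch of baseline ITQ (Gong & Lazebnik), the l2-norm method
    that ITQ+ generalizes. Learns an orthogonal rotation R minimizing
    the quantization error ||B - V R||_F^2 with B = sign(V R)."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)                 # center the data
    # PCA projection to n_bits dimensions via SVD of the data matrix
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = X @ Vt[:n_bits].T
    # random orthogonal initialization of the rotation
    R, _ = np.linalg.qr(rng.standard_normal((n_bits, n_bits)))
    for _ in range(n_iter):
        B = np.sign(V @ R)                 # fix R, update binary codes
        # fix B, solve the orthogonal Procrustes problem:
        # SVD of B^T V gives R = S_hat S^T
        U, _, Wt = np.linalg.svd(B.T @ V)
        R = (U @ Wt).T
    return np.sign(V @ R), R
```

ITQ+ replaces the Frobenius-norm objective above with an ℓp,q-norm loss, which is what makes the subproblem for R non-smooth and non-convex and motivates the paper's optimization scheme.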
Cite
Text
Guo et al. "Robust Iterative Quantization for Efficient ℓp-Norm Similarity Search." International Joint Conference on Artificial Intelligence, 2016.
Markdown
[Guo et al. "Robust Iterative Quantization for Efficient ℓp-Norm Similarity Search." International Joint Conference on Artificial Intelligence, 2016.](https://mlanthology.org/ijcai/2016/guo2016ijcai-robust/)
BibTeX
@inproceedings{guo2016ijcai-robust,
title = {{Robust Iterative Quantization for Efficient ℓp-Norm Similarity Search}},
author = {Guo, Yuchen and Ding, Guiguang and Han, Jungong and Jin, Xiaoming},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2016},
pages = {3382-3388},
url = {https://mlanthology.org/ijcai/2016/guo2016ijcai-robust/}
}