Query Adaptive Similarity for Large Scale Object Retrieval

Abstract

Many recent object retrieval systems rely on local features to describe an image. The similarity between a pair of images is measured by aggregating the similarities between their corresponding local features. In this paper we present a probabilistic framework for modeling the feature-to-feature similarity measure. We then derive a query-adaptive distance which is appropriate for global similarity evaluation. Furthermore, we propose a function, within the probabilistic framework, for scoring the individual contributions of local feature matches to the image-to-image similarity. Experimental results show that our method improves retrieval accuracy significantly and consistently. Moreover, our results compare favorably to the state of the art.
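The core idea in the abstract, that "close" should be judged relative to each query feature's own neighborhood rather than by a fixed global threshold, can be sketched in code. The following is a minimal illustration, not the paper's exact model: the function name `query_adaptive_similarity`, the use of a background feature set as the per-feature normalizer, and the exponential vote kernel with parameter `alpha` are all assumptions made for this example.

```python
import numpy as np

def query_adaptive_similarity(query_feats, db_images, background_feats, alpha=1.0):
    """Score each database image against the query (illustrative sketch).

    query_feats:      (Nq, D) array of local descriptors from the query image.
    db_images:        list of (Ni, D) arrays, one per database image.
    background_feats: (Nb, D) array of descriptors sampled from unrelated
                      images, used to normalize distances per query feature
                      (a hypothetical stand-in for the paper's probabilistic model).
    alpha:            decay rate of the similarity kernel (assumed parameter).
    """
    # Per-query-feature normalizer: mean distance to the background set.
    # Dividing by this makes distances query adaptive: a feature in a dense
    # region of descriptor space needs a much smaller raw distance to count
    # as a match than a feature in a sparse region.
    bg_dists = np.linalg.norm(
        query_feats[:, None, :] - background_feats[None, :, :], axis=2)
    normalizers = bg_dists.mean(axis=1)  # shape (Nq,)

    scores = []
    for feats in db_images:
        # Raw feature-to-feature distances, shape (Nq, Ni).
        d = np.linalg.norm(
            query_feats[:, None, :] - feats[None, :, :], axis=2)
        d_norm = d / normalizers[:, None]  # query-adaptive distances
        # Turn each query feature's best match into a similarity vote,
        # then aggregate the votes into an image-to-image score.
        votes = np.exp(-alpha * d_norm.min(axis=1))
        scores.append(votes.sum())
    return np.asarray(scores)
```

Under these assumptions, ranking a database amounts to sorting images by the returned scores, e.g. `order = np.argsort(-query_adaptive_similarity(q, db, bg))`. The paper derives the adaptive distance and the per-match scoring function from a probabilistic model of matching versus non-matching feature pairs rather than from a fixed kernel as above.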

Cite

Text

Qin et al. "Query Adaptive Similarity for Large Scale Object Retrieval." Conference on Computer Vision and Pattern Recognition, 2013. doi:10.1109/CVPR.2013.211

Markdown

[Qin et al. "Query Adaptive Similarity for Large Scale Object Retrieval." Conference on Computer Vision and Pattern Recognition, 2013.](https://mlanthology.org/cvpr/2013/qin2013cvpr-query/) doi:10.1109/CVPR.2013.211

BibTeX

@inproceedings{qin2013cvpr-query,
  title     = {{Query Adaptive Similarity for Large Scale Object Retrieval}},
  author    = {Qin, Danfeng and Wengert, Christian and Van Gool, Luc},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2013},
  doi       = {10.1109/CVPR.2013.211},
  url       = {https://mlanthology.org/cvpr/2013/qin2013cvpr-query/}
}