Learning to Hash on Structured Data

Abstract

Hashing techniques have been widely applied to large-scale similarity search problems due to their computational and memory efficiency. However, most existing hashing methods assume data examples are independently and identically distributed, whereas many real-world applications carry additional dependency/structure information between data examples. Ignoring this structure information may limit the performance of existing hashing algorithms. This paper explores the research problem of learning to Hash on Structured Data (HSD) and formulates a novel framework that incorporates the additional structure information. In particular, the hashing function is learned in a unified framework by simultaneously ensuring structural consistency and preserving the similarities between data examples. An iterative gradient descent algorithm is designed as the optimization procedure. Furthermore, we improve the effectiveness of the hashing function through an orthogonal transformation that minimizes the quantization error. Experimental results on two datasets clearly demonstrate the advantages of the proposed method over several state-of-the-art hashing methods.
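
The abstract describes reducing quantization error by applying an orthogonal transformation to the learned real-valued codes. Below is a minimal NumPy sketch of an ITQ-style alternating rotation that illustrates this general idea; the function name, the input `V` (zero-centered real-valued codes), and the iteration count are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def minimize_quantization_error(V, n_iter=50, seed=0):
    """Alternately minimize ||B - V R||_F^2 over binary codes B and an
    orthogonal rotation R (an ITQ-style procedure, shown for illustration).

    V : (n, c) array of zero-centered real-valued codes.
    Returns binary codes B in {-1, +1} and the orthogonal rotation R.
    """
    rng = np.random.default_rng(seed)
    c = V.shape[1]
    # Start from a random orthogonal rotation.
    R, _ = np.linalg.qr(rng.standard_normal((c, c)))
    for _ in range(n_iter):
        # Fix R, update the binary codes by thresholding the rotated data.
        B = np.where(V @ R >= 0, 1.0, -1.0)
        # Fix B, update R via the orthogonal Procrustes solution:
        # maximize tr(R^T V^T B) with an SVD of V^T B.
        U, _, Wt = np.linalg.svd(V.T @ B)
        R = U @ Wt
    return np.where(V @ R >= 0, 1.0, -1.0), R
```

In practice `V` might be, for example, centered data projected onto its top-c directions; the returned `B` then gives the binary codes and `R` the learned rotation. This is a sketch of the quantization-error idea only, not the paper's full structure-aware objective.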

Cite

Text

Wang et al. "Learning to Hash on Structured Data." AAAI Conference on Artificial Intelligence, 2015. doi:10.1609/AAAI.V29I1.9557

Markdown

[Wang et al. "Learning to Hash on Structured Data." AAAI Conference on Artificial Intelligence, 2015.](https://mlanthology.org/aaai/2015/wang2015aaai-learning-b/) doi:10.1609/AAAI.V29I1.9557

BibTeX

@inproceedings{wang2015aaai-learning-b,
  title     = {{Learning to Hash on Structured Data}},
  author    = {Wang, Qifan and Si, Luo and Shen, Bin},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {3066--3072},
  doi       = {10.1609/AAAI.V29I1.9557},
  url       = {https://mlanthology.org/aaai/2015/wang2015aaai-learning-b/}
}