To Project More or to Quantize More: Minimize Reconstruction Bias for Learning Compact Binary Codes

Abstract

We present Minimal Reconstruction Bias Hashing (MRH), a novel approach to learning similarity-preserving binary codes that jointly optimizes the projection and quantization stages. Our work tackles the important problem of how to connect the optimization of projection with that of quantization so as to maximize the complementary effects of the two stages. Distinct from previous work, MRH can adaptively adjust the projection dimensionality to balance the information loss between projection and quantization. It is formulated as the problem of minimizing the reconstruction bias of compressed signals. Extensive experimental results show that the proposed MRH significantly outperforms a variety of state-of-the-art methods on several widely used benchmarks.
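To make the two-stage pipeline concrete, here is a minimal sketch of a generic projection-then-quantization hashing baseline (PCA projection followed by sign quantization). This is an illustration of the pipeline the abstract refers to, not the paper's MRH method; all function names and the reconstruction `scale` parameter are hypothetical.

```python
import numpy as np

def train_hash(X, n_bits):
    """Projection stage (illustrative): learn a PCA projection to n_bits
    dimensions. Unlike MRH, n_bits is fixed here rather than adaptively chosen."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_bits].T                      # (dim, n_bits) projection matrix
    return mean, W

def encode(X, mean, W):
    """Quantization stage: sign of each projected coordinate -> {0, 1} codes."""
    return ((X - mean) @ W > 0).astype(np.uint8)

def reconstruction_error(X, mean, W, scale=1.0):
    """Reconstruct from binary codes (one +/- scale value per bit) and
    measure the mean squared reconstruction error -- the kind of quantity
    a reconstruction-bias objective would minimize."""
    signs = encode(X, mean, W).astype(np.float64) * 2.0 - 1.0   # {-1, +1}
    X_hat = mean + scale * signs @ W.T
    return np.mean(np.sum((X - X_hat) ** 2, axis=1))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
mean, W = train_hash(X, n_bits=8)
codes = encode(X, mean, W)
err = reconstruction_error(X, mean, W)
```

In this fixed-dimensionality baseline, more projection dimensions mean less projection loss but coarser per-dimension quantization at a given code length; MRH's contribution is to adjust that trade-off adaptively rather than fixing `n_bits` up front.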

Cite

Text

Wang et al. "To Project More or to Quantize More: Minimize Reconstruction Bias for Learning Compact Binary Codes." International Joint Conference on Artificial Intelligence, 2016.

Markdown

[Wang et al. "To Project More or to Quantize More: Minimize Reconstruction Bias for Learning Compact Binary Codes." International Joint Conference on Artificial Intelligence, 2016.](https://mlanthology.org/ijcai/2016/wang2016ijcai-project/)

BibTeX

@inproceedings{wang2016ijcai-project,
  title     = {{To Project More or to Quantize More: Minimize Reconstruction Bias for Learning Compact Binary Codes}},
  author    = {Wang, Zhe and Duan, Ling-Yu and Yuan, Junsong and Huang, Tiejun and Gao, Wen},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2016},
  pages     = {2181--2188},
  url       = {https://mlanthology.org/ijcai/2016/wang2016ijcai-project/}
}