Deep Discrete Prototype Multilabel Learning
Abstract
kNN embedding methods, such as the state-of-the-art LM-kNN, have shown impressive results in multi-label learning. Unfortunately, these approaches suffer from expensive computation and memory costs in large-scale settings. To address this issue, this paper proposes a novel deep discrete prototype compression method, DBPC, for fast multi-label prediction. DBPC compresses the database into a small set of short discrete prototypes and uses these prototypes for prediction. The benefit of DBPC comes from two aspects: 1) the number of distance comparisons is greatly reduced, since a query is compared only against the prototypes rather than the whole database; 2) the cost of each distance computation is significantly decreased in the reduced space. We propose to jointly learn the deep latent subspace and the discrete prototypes within one framework. Encoding and decoding neural networks are employed so that the deep discrete prototypes represent the instances and labels well. Extensive experiments on several large-scale datasets demonstrate that DBPC achieves several orders of magnitude lower storage and prediction complexity than state-of-the-art multi-label methods, while attaining competitive accuracy.
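The two speedups described above can be illustrated with a minimal, hypothetical sketch of prototype-based prediction: discrete codes are packed into integers so distances are cheap bitwise operations, and a query is compared only against the small prototype set. All names here (`predict_labels`, `PROTOTYPES`, the label aggregation rule) are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch of prototype-based multi-label prediction.
# Discrete codes are packed into Python ints; labels are sets of tags.

def hamming(a: int, b: int) -> int:
    """Hamming distance between two short codes packed into ints."""
    return bin(a ^ b).count("1")

def predict_labels(query_code, prototypes, k=2):
    """Union of the labels of the k nearest prototypes.

    `prototypes` is a small list of (code, labels) pairs: the whole
    database has already been compressed into it, so prediction costs
    only len(prototypes) cheap Hamming comparisons.
    """
    nearest = sorted(prototypes, key=lambda p: hamming(query_code, p[0]))[:k]
    labels = set()
    for _, proto_labels in nearest:
        labels |= proto_labels
    return labels

# Toy prototype set with 4-bit codes (illustrative data only).
PROTOTYPES = [
    (0b1100, {"sports"}),
    (0b1110, {"sports", "news"}),
    (0b0001, {"music"}),
]
```

For example, a query encoded as `0b1101` is nearest to the first two prototypes, so `predict_labels(0b1101, PROTOTYPES)` returns `{"sports", "news"}`. The actual DBPC objective learns the encoder and the discrete prototypes jointly, which this sketch does not attempt.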
Cite
Text
Shen et al. "Deep Discrete Prototype Multilabel Learning." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/371
Markdown
[Shen et al. "Deep Discrete Prototype Multilabel Learning." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/shen2018ijcai-deep/) doi:10.24963/IJCAI.2018/371
BibTeX
@inproceedings{shen2018ijcai-deep,
title = {{Deep Discrete Prototype Multilabel Learning}},
author = {Shen, Xiaobo and Liu, Weiwei and Luo, Yong and Ong, Yew-Soon and Tsang, Ivor W.},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2018},
pages = {2675--2681},
doi = {10.24963/IJCAI.2018/371},
url = {https://mlanthology.org/ijcai/2018/shen2018ijcai-deep/}
}