Distance Metric Learning with Joint Representation Diversification
Abstract
Distance metric learning (DML) aims to learn a representation space equipped with a metric such that, under the metric, similar examples are closer than dissimilar ones. The recent success of DNNs has motivated many DML losses that encourage intra-class compactness and inter-class separability. The trade-off between intra-class compactness and inter-class separability shapes the DML representation space by determining how much information about the original inputs is retained. In this paper, we propose Distance Metric Learning with Joint Representation Diversification (JRD), which allows a better balancing point between intra-class compactness and inter-class separability. Specifically, we propose a Joint Representation Similarity regularizer that captures invariant features at different levels of abstraction and diversifies the joint distributions of representations across multiple layers. Experiments on three deep DML benchmark datasets demonstrate the effectiveness of the proposed approach.
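To make the idea of a joint similarity over multi-layer representations concrete, the sketch below forms a joint Gram matrix as the Hadamard product of per-layer linear Gram matrices and penalizes joint similarity between examples of different classes. This is a minimal, hypothetical illustration of a diversification-style regularizer; the function names, the linear kernel choice, and the cross-class penalty are assumptions for illustration, not the paper's exact estimator.

```python
import numpy as np

def joint_representation_similarity(layer_feats):
    """Joint similarity across layers: Hadamard product of per-layer
    linear Gram matrices (illustrative choice, assumed here)."""
    n = layer_feats[0].shape[0]
    K = np.ones((n, n))
    for F in layer_feats:          # F: (n_examples, n_features) per layer
        K *= F @ F.T               # per-layer linear Gram matrix
    return K

def jrs_regularizer(layer_feats, labels):
    """Mean joint similarity over cross-class pairs; minimizing it pushes
    representations of different classes apart (hypothetical form)."""
    K = joint_representation_similarity(layer_feats)
    labels = np.asarray(labels)
    cross = labels[:, None] != labels[None, :]   # mask of cross-class pairs
    return K[cross].mean() if cross.any() else 0.0

# Usage: two toy "layers" of features for two examples with different labels.
F = np.eye(2)
print(jrs_regularizer([F, F], [0, 1]))
```

In a training loop this term would be added, with a weight, to a standard DML loss; here the orthogonal toy features yield zero cross-class joint similarity.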
Cite
Text
Chu et al. "Distance Metric Learning with Joint Representation Diversification." International Conference on Machine Learning, 2020.

Markdown
[Chu et al. "Distance Metric Learning with Joint Representation Diversification." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/chu2020icml-distance/)

BibTeX
@inproceedings{chu2020icml-distance,
  title     = {{Distance Metric Learning with Joint Representation Diversification}},
  author    = {Chu, Xu and Lin, Yang and Wang, Yasha and Wang, Xiting and Yu, Hailong and Gao, Xin and Tong, Qi},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {1962--1973},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/chu2020icml-distance/}
}