Unsupervised Metric Learning with Synthetic Examples
Abstract
Distance Metric Learning (DML) involves learning an embedding that brings similar examples closer together while pushing dissimilar ones apart. Existing DML approaches make use of class labels to generate constraints for metric learning. In this paper, we address the less-studied problem of learning a metric in an unsupervised manner: rather than class labels, we use unlabeled data to generate adversarial, synthetic constraints for learning a metric-inducing embedding. We learn the metric by minimizing the entropy of a conditional probability distribution, a measure of uncertainty. Our stochastic formulation scales well to large datasets and performs competitively with existing metric learning methods.
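To make the entropy objective concrete, the following is a minimal illustrative sketch (not the paper's exact method): it builds an NCA-style conditional probability p(j|i) from pairwise distances under a hypothetical linear embedding matrix `A`, and returns the mean entropy of that distribution, the quantity one would minimize.

```python
import numpy as np

def conditional_entropy_loss(X, A):
    """Illustrative sketch: entropy of a conditional probability p(j|i)
    derived from pairwise squared distances in the embedding x -> A @ x.
    X: (n, d) unlabeled data; A: (k, d) hypothetical embedding matrix.
    Minimizing the returned mean entropy sharpens p(j|i)."""
    Z = X @ A.T                                            # embed the points
    sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)    # pairwise squared distances
    np.fill_diagonal(sq, np.inf)                           # exclude self-matches
    logits = -sq
    logits -= logits.max(axis=1, keepdims=True)            # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)                      # conditional probability p(j|i)
    ent = -(p * np.log(p + 1e-12)).sum(axis=1)             # entropy per anchor point
    return ent.mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))       # toy unlabeled data
A = rng.normal(size=(3, 5))       # toy embedding matrix
loss = conditional_entropy_loss(X, A)
```

The entropy of each row of p(j|i) is bounded between 0 and log(n-1), so the loss is always non-negative; gradient descent on `A` (e.g. via an autodiff framework) would decrease it in practice.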
Cite
Dutta et al. "Unsupervised Metric Learning with Synthetic Examples." AAAI Conference on Artificial Intelligence, 2020, pp. 3834-3841. doi:10.1609/AAAI.V34I04.5795. https://mlanthology.org/aaai/2020/dutta2020aaai-unsupervised/
BibTeX
@inproceedings{dutta2020aaai-unsupervised,
title = {{Unsupervised Metric Learning with Synthetic Examples}},
author = {Dutta, Ujjal Kr and Harandi, Mehrtash and Sekhar, C. Chandra},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {3834-3841},
doi = {10.1609/AAAI.V34I04.5795},
url = {https://mlanthology.org/aaai/2020/dutta2020aaai-unsupervised/}
}