Sparse Compositional Metric Learning

Abstract

We propose a new approach to metric learning by framing it as learning a sparse combination of locally discriminative metrics that are inexpensive to generate from the training data. This flexible framework allows us to naturally derive formulations for global, multi-task and local metric learning. The resulting algorithms have several advantages over existing methods in the literature: far fewer parameters to estimate and a principled way to generalize learned metrics to new test points. To analyze the approach theoretically, we derive a generalization bound that justifies the sparse combination. Empirically, we evaluate our algorithms on several datasets against state-of-the-art metric learning methods. The results are consistent with our theoretical findings and demonstrate the superiority of our approach in terms of classification performance and scalability.
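The core idea in the abstract is that a Mahalanobis metric can be composed as a sparse, nonnegative combination of cheap rank-one basis metrics, which keeps the matrix positive semidefinite by construction. The sketch below illustrates only this compositional structure; the basis directions and weights are hypothetical stand-ins (the paper generates the basis from local discriminative information in the training data and learns the weights with sparsity-inducing regularization).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: d-dimensional data, K rank-one basis elements.
d, K = 5, 20

# Stand-in basis directions u_k (the paper derives these cheaply from
# locally discriminative information in the training data).
U = rng.standard_normal((K, d))

# Sparse nonnegative combination weights: most entries are zero,
# mimicking the effect of sparsity-inducing regularization.
w = np.zeros(K)
w[[2, 7, 11]] = [0.5, 1.2, 0.3]

# Composed metric M = sum_k w_k * u_k u_k^T, PSD since all w_k >= 0.
M = sum(wk * np.outer(uk, uk) for wk, uk in zip(w, U))

def mahalanobis_sq(x, y, M):
    """Squared distance (x - y)^T M (x - y) under the composed metric."""
    diff = x - y
    return float(diff @ M @ diff)

x, y = rng.standard_normal(d), rng.standard_normal(d)
print(mahalanobis_sq(x, y, M))  # nonnegative, since M is PSD
```

Because only a few weights are nonzero, the learned metric is parameterized by a short weight vector rather than a full d-by-d matrix, which is the source of the parameter savings the abstract mentions.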

Cite

Text

Shi et al. "Sparse Compositional Metric Learning." AAAI Conference on Artificial Intelligence, 2014. doi:10.1609/AAAI.V28I1.8968

Markdown

[Shi et al. "Sparse Compositional Metric Learning." AAAI Conference on Artificial Intelligence, 2014.](https://mlanthology.org/aaai/2014/shi2014aaai-sparse/) doi:10.1609/AAAI.V28I1.8968

BibTeX

@inproceedings{shi2014aaai-sparse,
  title     = {{Sparse Compositional Metric Learning}},
  author    = {Shi, Yuan and Bellet, Aurélien and Sha, Fei},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2014},
  pages     = {2078--2084},
  doi       = {10.1609/AAAI.V28I1.8968},
  url       = {https://mlanthology.org/aaai/2014/shi2014aaai-sparse/}
}