Fast Low-Rank Metric Learning for Large-Scale and High-Dimensional Data

Abstract

Low-rank metric learning aims to learn better discrimination of data subject to low-rank constraints. It preserves the intrinsic low-rank structure of datasets and reduces the time cost and memory usage of metric learning. However, it remains a challenge for current methods to handle datasets that are both high-dimensional and large-scale. To address this issue, we present a novel fast low-rank metric learning (FLRML) method. FLRML casts the low-rank metric learning problem into an unconstrained optimization on the Stiefel manifold, which can be efficiently solved by searching along the descent curves of the manifold. FLRML significantly reduces the complexity and memory usage of the optimization, making the method scalable to both high dimensions and large numbers of samples. Furthermore, we introduce a mini-batch version of FLRML to scale to even larger datasets that are hard to load and decompose in limited memory. Experimental results show that our method achieves high accuracy and is much faster than state-of-the-art methods on several benchmarks with large numbers of high-dimensional samples. Code has been made available at https://github.com/highan911/FLRML.
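The descent-curve search mentioned in the abstract is, in the general Stiefel-manifold setting, commonly realized with a Cayley-transform retraction (curvilinear search in the style of Wen and Yin). The sketch below is a minimal, generic illustration of that idea in NumPy, not the authors' FLRML implementation: the toy objective `F`, its gradient `grad_F`, and the dense n-by-n Cayley form are illustrative assumptions, whereas FLRML additionally exploits low-rank structure to keep complexity and memory low.

```python
import numpy as np

def cayley_curve(X, G, tau):
    """Point Y(tau) on the descent curve through X on the Stiefel manifold,
    built from the Cayley transform of the skew-symmetric direction A."""
    n = X.shape[0]
    A = G @ X.T - X @ G.T                       # skew-symmetric, so the Cayley map is orthogonal
    I = np.eye(n)
    return np.linalg.solve(I + 0.5 * tau * A, (I - 0.5 * tau * A) @ X)

def stiefel_descent(F, grad_F, X0, tau0=1.0, beta=0.5, tol=1e-6, max_iter=200):
    """Unconstrained-style descent along Cayley curves with simple backtracking."""
    X = X0
    for _ in range(max_iter):
        G = grad_F(X)
        A = G @ X.T - X @ G.T
        if np.linalg.norm(A @ X) < tol:         # A @ X is the (canonical) Riemannian gradient
            break
        tau, f0 = tau0, F(X)
        Y = cayley_curve(X, G, tau)
        while F(Y) > f0 and tau > 1e-12:        # shrink the step until the curve descends
            tau *= beta
            Y = cayley_curve(X, G, tau)
        X = Y
    return X

# Toy usage: minimize trace(X^T M X) over the Stiefel manifold, which recovers
# the subspace spanned by the smallest eigenvectors of the symmetric matrix M.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50)); M = M + M.T
X0, _ = np.linalg.qr(rng.standard_normal((50, 5)))
F = lambda X: np.trace(X.T @ M @ X)
grad_F = lambda X: 2 * M @ X
X_opt = stiefel_descent(F, grad_F, X0)
print(np.allclose(X_opt.T @ X_opt, np.eye(5), atol=1e-6))  # orthonormality is preserved
```

In practice the dense n-by-n solve is usually avoided: writing A as a product of two n-by-2p factors and applying the Sherman-Morrison-Woodbury identity reduces the linear system to size 2p, which is the kind of structure that makes descent-curve search on the Stiefel manifold scale to high dimensions.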

Cite

Text

Liu et al. "Fast Low-Rank Metric Learning for Large-Scale and High-Dimensional Data." Neural Information Processing Systems, 2019.

Markdown

[Liu et al. "Fast Low-Rank Metric Learning for Large-Scale and High-Dimensional Data." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/liu2019neurips-fast/)

BibTeX

@inproceedings{liu2019neurips-fast,
  title     = {{Fast Low-Rank Metric Learning for Large-Scale and High-Dimensional Data}},
  author    = {Liu, Han and Han, Zhizhong and Liu, Yu-Shen and Gu, Ming},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {819--829},
  url       = {https://mlanthology.org/neurips/2019/liu2019neurips-fast/}
}