Efficient Stochastic Optimization for Low-Rank Distance Metric Learning
Abstract
Although distance metric learning has been successfully applied to many real-world applications, learning a distance metric from large-scale and high-dimensional data remains a challenging problem. Due to the PSD constraint, the computational complexity of previous algorithms per iteration is at least O(d^2), where d is the dimensionality of the data. In this paper, we develop an efficient stochastic algorithm for a class of distance metric learning problems with nuclear norm regularization, referred to as low-rank DML. By utilizing the low-rank structure of the intermediate solutions and stochastic gradients, the complexity of our algorithm has a linear dependence on the dimensionality d. The key idea is to maintain all the iterates in factorized representations and construct stochastic gradients that are low-rank. In this way, the projection onto the PSD cone can be implemented efficiently by incremental SVD. Experimental results on several data sets validate the effectiveness and efficiency of our method.
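The abstract's key idea — keeping the metric M in a factorized form M = U diag(s) Uᵀ so that a low-rank stochastic gradient step plus the PSD projection costs O(dk²) rather than the O(d³) of a full eigendecomposition — can be illustrated with a minimal sketch. This is not the paper's implementation; the function name and the simple rank-one eigenupdate form (extend the basis by the residual direction, eigendecompose a small core matrix, drop negative eigenvalues) are assumptions of this sketch.

```python
import numpy as np

def rank1_psd_update(U, s, z, rho, tol=1e-10):
    """Apply the rank-one update rho * z z^T to the factored PSD matrix
    M = U diag(s) U^T, then project back onto the PSD cone by discarding
    negative eigenvalues. Cost is O(d k^2) for d-dimensional data and
    current rank k, so it never touches a full d x d matrix."""
    # Coordinates of z in the current column span of U, plus the residual.
    w = U.T @ z                       # (k,) component inside span(U)
    r = z - U @ w                     # residual orthogonal to span(U)
    r_norm = np.linalg.norm(r)
    if r_norm > tol:
        # z leaves the current span: extend the basis by one direction.
        U_ext = np.column_stack([U, r / r_norm])
        w_ext = np.append(w, r_norm)
    else:
        U_ext, w_ext = U, w
    # Small (k+1) x (k+1) core: old spectrum (zero-padded) + rank-one term.
    k = U_ext.shape[1]
    core = np.zeros((k, k))
    core[:len(s), :len(s)] = np.diag(s)
    core += rho * np.outer(w_ext, w_ext)
    # Eigendecompose the small core; keeping only nonnegative eigenvalues
    # is exactly the Euclidean projection onto the PSD cone.
    vals, vecs = np.linalg.eigh(core)
    keep = vals > tol
    return U_ext @ vecs[:, keep], vals[keep]
```

In stochastic DML, the gradient of a hinge loss on a pair or triplet is a difference of a few outer products, so one iteration reduces to a small number of such rank-one updates applied to the factored iterate.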
Cite
Text
Zhang and Zhang. "Efficient Stochastic Optimization for Low-Rank Distance Metric Learning." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.10649
Markdown
[Zhang and Zhang. "Efficient Stochastic Optimization for Low-Rank Distance Metric Learning." AAAI Conference on Artificial Intelligence, 2017.](https://mlanthology.org/aaai/2017/zhang2017aaai-efficient-a/) doi:10.1609/AAAI.V31I1.10649
BibTeX
@inproceedings{zhang2017aaai-efficient-a,
title = {{Efficient Stochastic Optimization for Low-Rank Distance Metric Learning}},
author = {Zhang, Jie and Zhang, Lijun},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2017},
pages = {933--940},
doi = {10.1609/AAAI.V31I1.10649},
url = {https://mlanthology.org/aaai/2017/zhang2017aaai-efficient-a/}
}