A Fast Augmented Lagrangian Algorithm for Learning Low-Rank Matrices

Abstract

We propose a general and efficient algorithm for learning low-rank matrices. The proposed algorithm converges super-linearly and can keep the matrix to be learned in a compact factorized representation without needing to specify the rank beforehand. Moreover, we show that the framework can be easily generalized to the problem of learning multiple matrices and general spectral regularization. Empirically, we show that we can recover a 10,000×10,000 matrix from 1.2 million observations in about 5 minutes. Furthermore, we show that in a brain-computer interface problem, the proposed method speeds up the optimization by two orders of magnitude compared to the conventional projected gradient method and produces more reliable solutions.
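The key subproblem in trace-norm-based low-rank learning of this kind is the proximal operator of the trace norm, which soft-thresholds the singular values of a matrix. The sketch below illustrates that operation in NumPy; it is a generic illustration of the thresholding step, not the paper's exact augmented Lagrangian (M-DAL) update, and the matrix sizes and threshold value are arbitrary choices for the example.

```python
import numpy as np

def prox_trace_norm(W, lam):
    """Proximal operator of the trace norm: soft-threshold singular values.

    Singular values below lam are set to zero, so the result is low rank
    even though the rank was never specified in advance.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)  # shrink each singular value by lam
    return (U * s_shrunk) @ Vt           # equivalent to U @ diag(s_shrunk) @ Vt

# Example: a rank-3 matrix plus small noise; thresholding recovers a
# low-rank estimate without being told the target rank.
rng = np.random.default_rng(0)
L = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 50))  # rank 3
W = L + 0.01 * rng.standard_normal((50, 50))
W_hat = prox_trace_norm(W, lam=1.0)
print(np.linalg.matrix_rank(W_hat))  # low rank after thresholding
```

Because the noise singular values fall below the threshold while the three signal singular values are far above it, the thresholded estimate has rank 3.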

Cite

Text

Tomioka et al. "A Fast Augmented Lagrangian Algorithm for Learning Low-Rank Matrices." International Conference on Machine Learning, 2010.

Markdown

[Tomioka et al. "A Fast Augmented Lagrangian Algorithm for Learning Low-Rank Matrices." International Conference on Machine Learning, 2010.](https://mlanthology.org/icml/2010/tomioka2010icml-fast/)

BibTeX

@inproceedings{tomioka2010icml-fast,
  title     = {{A Fast Augmented Lagrangian Algorithm for Learning Low-Rank Matrices}},
  author    = {Tomioka, Ryota and Suzuki, Taiji and Sugiyama, Masashi and Kashima, Hisashi},
  booktitle = {International Conference on Machine Learning},
  year      = {2010},
  pages     = {1087--1094},
  url       = {https://mlanthology.org/icml/2010/tomioka2010icml-fast/}
}