Accelerated Inexact Soft-Impute for Fast Large-Scale Matrix Completion

Abstract

Matrix factorization tries to recover a low-rank matrix from limited observations. A state-of-the-art algorithm is Soft-Impute, which exploits a special "sparse plus low-rank" structure of the matrix iterates to allow efficient SVD in each iteration. Though Soft-Impute is also a proximal gradient algorithm, it is generally believed that acceleration techniques are not useful and will destroy the special structure. In this paper, we show that Soft-Impute can indeed be accelerated without compromising the "sparse plus low-rank" structure. To further reduce the per-iteration time complexity, we propose an approximate singular value thresholding scheme based on the power method. Theoretical analysis shows that the proposed algorithm enjoys the fast O(1/T²) convergence rate of accelerated proximal gradient algorithms. Extensive experiments on both synthetic and large recommendation data sets show that the proposed algorithm is much faster than Soft-Impute and other state-of-the-art matrix completion algorithms.
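For intuition, the proximal step that Soft-Impute applies in each iteration is singular value thresholding (the proximal operator of the nuclear norm). A minimal NumPy sketch of the exact version is below; the function name `svt` is ours, and the dense SVD it calls is precisely the cost that the paper's power-method-based approximation is designed to avoid.

```python
import numpy as np

def svt(M, lam):
    """Exact singular value thresholding: shrink every singular
    value of M by lam and truncate at zero. This is the proximal
    operator of lam * (nuclear norm), used in each Soft-Impute step."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)
    return U @ np.diag(s_shrunk) @ Vt
```

Because small singular values are thresholded to zero, the output is typically low-rank; in Soft-Impute the input matrix additionally decomposes as "sparse plus low-rank", which is what makes each SVD affordable at scale.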

Cite

Text

Yao and Kwok. "Accelerated Inexact Soft-Impute for Fast Large-Scale Matrix Completion." International Joint Conference on Artificial Intelligence, 2015.

Markdown

[Yao and Kwok. "Accelerated Inexact Soft-Impute for Fast Large-Scale Matrix Completion." International Joint Conference on Artificial Intelligence, 2015.](https://mlanthology.org/ijcai/2015/yao2015ijcai-accelerated/)

BibTeX

@inproceedings{yao2015ijcai-accelerated,
  title     = {{Accelerated Inexact Soft-Impute for Fast Large-Scale Matrix Completion}},
  author    = {Yao, Quanming and Kwok, James T.},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {4002--4008},
  url       = {https://mlanthology.org/ijcai/2015/yao2015ijcai-accelerated/}
}