Scaled Gradients on Grassmann Manifolds for Matrix Completion
Abstract
This paper describes gradient methods based on a scaled metric on the Grassmann manifold for low-rank matrix completion. The proposed methods significantly improve on canonical gradient methods, especially on ill-conditioned matrices, while maintaining established global convergence and exact recovery guarantees. A connection between a form of subspace iteration for matrix completion and the scaled gradient descent procedure is also established. The proposed conjugate gradient method based on the scaled gradient outperforms several existing algorithms for matrix completion and is competitive with recently proposed methods.
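To illustrate the flavor of the approach, the following is a minimal sketch (not the paper's actual algorithm) of a scaled gradient step for matrix completion: the column subspace is represented by an orthonormal factor `U` (a point on the Grassmann manifold), the other factor `V` is obtained by least squares on the observed entries, and the Riemannian gradient on `U` is scaled by `(V^T V)^{-1}` before a QR-based retraction. The function name, step size, and alternating structure are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def scaled_grassmann_mc(M_obs, mask, rank, iters=100, step=1.0):
    """Sketch of scaled gradient descent on the Grassmann manifold
    for low-rank matrix completion (illustrative, not the paper's code).

    M_obs : observed matrix with unobserved entries set to zero
    mask  : boolean array, True where an entry is observed
    """
    m, n = M_obs.shape
    rng = np.random.default_rng(0)
    # Random orthonormal initial subspace basis (a point on Gr(m, rank)).
    U, _ = np.linalg.qr(rng.standard_normal((m, rank)))
    V = np.zeros((n, rank))
    for _ in range(iters):
        # Given U, fit V column-by-column by least squares on observed rows.
        for j in range(n):
            rows = mask[:, j]
            V[j] = np.linalg.lstsq(U[rows], M_obs[rows, j], rcond=None)[0]
        # Residual on the observed entries only.
        R = mask * (U @ V.T - M_obs)
        # Euclidean gradient w.r.t. U, projected to the Grassmann tangent space.
        G = R @ V
        G = G - U @ (U.T @ G)
        # Scale by (V^T V)^{-1}: the scaled-metric gradient direction.
        S = G @ np.linalg.inv(V.T @ V + 1e-12 * np.eye(rank))
        # Retract back to the manifold via QR.
        U, _ = np.linalg.qr(U - step * S)
    return U, V
```

Note that with a fully observed matrix, the unit-step scaled update reduces to the exact least-squares solution for `U`, which is one intuition for why the scaled metric helps on ill-conditioned problems.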
Cite
Text
Ngo and Saad. "Scaled Gradients on Grassmann Manifolds for Matrix Completion." Neural Information Processing Systems, 2012.
Markdown
[Ngo and Saad. "Scaled Gradients on Grassmann Manifolds for Matrix Completion." Neural Information Processing Systems, 2012.](https://mlanthology.org/neurips/2012/ngo2012neurips-scaled/)
BibTeX
@inproceedings{ngo2012neurips-scaled,
title = {{Scaled Gradients on Grassmann Manifolds for Matrix Completion}},
author = {Ngo, Thanh and Saad, Yousef},
booktitle = {Neural Information Processing Systems},
year = {2012},
pages = {1412-1420},
url = {https://mlanthology.org/neurips/2012/ngo2012neurips-scaled/}
}