Matrix Completion Has No Spurious Local Minimum
Abstract
Matrix completion is a basic machine learning problem that has wide applications, especially in collaborative filtering and recommender systems. Simple non-convex optimization algorithms are popular and effective in practice. Despite recent progress in proving various non-convex algorithms converge from a good initial point, it remains unclear why random or arbitrary initialization suffices in practice. We prove that the commonly used non-convex objective function for matrix completion has no spurious local minima: all local minima must also be global. Therefore, many popular optimization algorithms such as (stochastic) gradient descent can provably solve matrix completion with arbitrary initialization in polynomial time.
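The non-convex objective referred to above is the factorized least-squares loss over the observed entries. The snippet below is a minimal numpy sketch, assuming the symmetric (positive semidefinite) form f(X) = 0.5 * ||P_Omega(X X^T - M)||_F^2 and running plain gradient descent from a random starting point. It is an illustration under these assumptions, not the paper's exact formulation (the paper's analysis also includes a regularizer), and the problem sizes, step size, and function names are chosen only for the example.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's exact setup): gradient descent
# with random initialization on the factorized matrix completion objective
#   f(X) = 0.5 * || P_Omega(X X^T - M) ||_F^2
# for a symmetric low-rank ground truth M. The regularizer used in the paper's
# analysis is omitted here for brevity.

rng = np.random.default_rng(0)

# Ground truth: a rank-r positive semidefinite matrix M = U U^T.
n, r = 100, 3
U = rng.standard_normal((n, r))
M = U @ U.T

# Observation mask Omega: each entry revealed independently with probability p.
p = 0.3
Omega = rng.random((n, n)) < p

def objective(X):
    """0.5 * squared Frobenius error restricted to the observed entries."""
    R = (X @ X.T - M) * Omega
    return 0.5 * np.sum(R ** 2)

def gradient(X):
    """Gradient of the objective with respect to the factor X."""
    R = (X @ X.T - M) * Omega
    return (R + R.T) @ X  # valid even if the mask Omega is not symmetric

# Arbitrary (random) initialization and fixed-step gradient descent.
X = rng.standard_normal((n, r))
eta = 5e-4  # illustrative step size, not taken from the paper
for _ in range(5000):
    X -= eta * gradient(X)

rel_err = np.linalg.norm(X @ X.T - M) / np.linalg.norm(M)
print(f"objective: {objective(X):.3e}   relative recovery error: {rel_err:.3e}")
```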
Cite
Text
Ge et al. "Matrix Completion Has No Spurious Local Minimum." Neural Information Processing Systems, 2016.
Markdown
[Ge et al. "Matrix Completion Has No Spurious Local Minimum." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/ge2016neurips-matrix/)
BibTeX
@inproceedings{ge2016neurips-matrix,
title = {{Matrix Completion Has No Spurious Local Minimum}},
author = {Ge, Rong and Lee, Jason and Ma, Tengyu},
booktitle = {Neural Information Processing Systems},
year = {2016},
pages = {2973--2981},
url = {https://mlanthology.org/neurips/2016/ge2016neurips-matrix/}
}