Global Optimality of Local Search for Low Rank Matrix Recovery
Abstract
We show that there are no spurious local minima in the non-convex factorized parametrization of low-rank matrix recovery from incoherent linear measurements. With noisy measurements we show all local minima are very close to a global optimum. Together with a curvature bound at saddle points, this yields a polynomial time global convergence guarantee for stochastic gradient descent *from random initialization*.
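For concreteness, below is a minimal NumPy sketch of the setting the abstract refers to: the factorized objective f(U) = (1/2m) Σ_k (⟨A_k, U Uᵀ⟩ − b_k)², minimized by plain (full-batch) gradient descent from a random starting point. This is only an illustration of the non-convex parametrization, not the authors' experiments or the SGD variant their guarantee covers; the problem sizes, Gaussian sensing matrices, and step size are all assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not taken from the paper).
n, r, m = 40, 3, 1000            # matrix dimension, rank, number of measurements

# Planted rank-r PSD matrix X* = U* U*^T and symmetric Gaussian sensing matrices.
U_star = rng.standard_normal((n, r)) / np.sqrt(n)
X_star = U_star @ U_star.T
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2                 # symmetrize each A_k
b = np.einsum('kij,ij->k', A, X_star)              # noiseless measurements b_k = <A_k, X*>

def loss_and_grad(U):
    """f(U) = 1/(2m) sum_k (<A_k, U U^T> - b_k)^2 and its gradient with respect to U."""
    resid = np.einsum('kij,ij->k', A, U @ U.T) - b
    loss = 0.5 * np.mean(resid ** 2)
    grad = (2.0 / m) * np.einsum('k,kij->ij', resid, A) @ U   # uses symmetry of each A_k
    return loss, grad

# Plain gradient descent on the factorized objective, started at a random U.
U = rng.standard_normal((n, r)) / np.sqrt(n)
step = 0.05                                        # illustrative step size
for _ in range(400):
    _, grad = loss_and_grad(U)
    U -= step * grad

final_loss, _ = loss_and_grad(U)
rel_err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
print(f"final loss: {final_loss:.3e}   relative recovery error: {rel_err:.3e}")
```

With the number of measurements well above the O(nr) degrees of freedom, the recovery error is typically small despite the random initialization, consistent with the paper's claim that local search does not get trapped in spurious local minima.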
Cite
Text
Bhojanapalli et al. "Global Optimality of Local Search for Low Rank Matrix Recovery." Neural Information Processing Systems, 2016.
Markdown
[Bhojanapalli et al. "Global Optimality of Local Search for Low Rank Matrix Recovery." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/bhojanapalli2016neurips-global/)
BibTeX
@inproceedings{bhojanapalli2016neurips-global,
title = {{Global Optimality of Local Search for Low Rank Matrix Recovery}},
author = {Bhojanapalli, Srinadh and Neyshabur, Behnam and Srebro, Nati},
booktitle = {Neural Information Processing Systems},
year = {2016},
pages = {3873--3881},
url = {https://mlanthology.org/neurips/2016/bhojanapalli2016neurips-global/}
}