How Much Restricted Isometry Is Needed in Nonconvex Matrix Recovery?
Abstract
When the linear measurements of an instance of low-rank matrix recovery satisfy a restricted isometry property (RIP) --- i.e. they are approximately norm-preserving --- the problem is known to contain no spurious local minima, so exact recovery is guaranteed. In this paper, we show that moderate RIP is not enough to eliminate spurious local minima, so existing results can only hold for near-perfect RIP. In fact, counterexamples are ubiquitous: every $x$ is a spurious local minimum of some rank-1 instance of matrix recovery that satisfies RIP. One specific counterexample has RIP constant $\delta=1/2$, but causes randomly initialized stochastic gradient descent (SGD) to fail 12\% of the time. SGD is frequently able to avoid and escape spurious local minima, but this empirical result shows that it can occasionally be defeated by their existence. Hence, while exact recovery guarantees will likely require a proof of no spurious local minima, arguments based solely on norm preservation will only be applicable to a narrow set of nearly-isotropic instances.
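To make the setting concrete, here is a minimal Python sketch of the benign regime the abstract contrasts with: rank-1 matrix sensing with i.i.d. Gaussian measurements, which satisfy RIP with high probability when the number of measurements is large. This is an illustrative toy, not the paper's counterexample construction; the dimensions, step size, and iteration count are arbitrary choices for the demo.

```python
# Toy rank-1 matrix sensing: recover M* = z z^T from linear measurements
# <A_i, M*> using plain gradient descent on the nonconvex factored objective
# f(x) = 0.5 * || (<A_i, x x^T - z z^T>)_i / sqrt(m) ||^2.
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 60  # ambient dimension and number of measurements (m >> n)

# Planted rank-1 ground truth with ||z|| = 1.
z = rng.standard_normal(n)
z /= np.linalg.norm(z)

# Symmetrized Gaussian measurement matrices A_i; with the 1/sqrt(m) scaling,
# the map M -> (<A_i, M>)_i is approximately norm-preserving (RIP) w.h.p.
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2

def residuals(x):
    """Measurement residuals <A_i, x x^T - z z^T> / sqrt(m)."""
    M = np.outer(x, x) - np.outer(z, z)
    return np.einsum('ijk,jk->i', A, M) / np.sqrt(m)

def grad(x):
    """Gradient of f(x); since A_i is symmetric, d<A_i, x x^T>/dx = 2 A_i x."""
    r = residuals(x)
    return 2 * np.einsum('i,ijk,k->j', r, A, x) / np.sqrt(m)

x = 0.1 * rng.standard_normal(n)  # random initialization near the saddle at 0
for _ in range(5000):             # plain gradient descent
    x -= 0.1 * grad(x)

err = np.linalg.norm(np.outer(x, x) - np.outer(z, z))
print(f"recovery error ||xx^T - zz^T||_F = {err:.2e}")
```

With many isotropic Gaussian measurements the RIP constant is small and descent from a random start recovers $zz^T$ (up to the sign of $x$); the paper's point is that once the RIP constant is merely moderate, e.g. $\delta=1/2$, such landscape arguments break down and spurious local minima can appear.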
Cite
Text
Zhang et al. "How Much Restricted Isometry Is Needed in Nonconvex Matrix Recovery?" Neural Information Processing Systems, 2018.
Markdown
[Zhang et al. "How Much Restricted Isometry Is Needed in Nonconvex Matrix Recovery?" Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/zhang2018neurips-much/)
BibTeX
@inproceedings{zhang2018neurips-much,
title = {{How Much Restricted Isometry Is Needed in Nonconvex Matrix Recovery?}},
author = {Zhang, Richard and Josz, Cedric and Sojoudi, Somayeh and Lavaei, Javad},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {5586--5597},
url = {https://mlanthology.org/neurips/2018/zhang2018neurips-much/}
}