General Low-Rank Matrix Optimization: Geometric Analysis and Sharper Bounds

Abstract

This paper considers the global geometry of general low-rank minimization problems via the Burer-Monteiro factorization approach. For the rank-$1$ case, we prove that there is no spurious second-order critical point for both symmetric and asymmetric problems if the rank-$2$ RIP constant $\delta$ is less than $1/2$. Combined with a counterexample at $\delta=1/2$, this shows that the derived bound is the sharpest possible. For the arbitrary rank-$r$ case, the same property is established when the rank-$2r$ RIP constant $\delta$ is at most $1/3$. We design a counterexample to show that the non-existence of spurious second-order critical points may not hold if $\delta$ is at least $1/2$. In addition, for any problem with $\delta$ between $1/3$ and $1/2$, we prove that all second-order critical points have a positive correlation with the ground truth. Finally, the strict saddle property, which can lead to polynomial-time global convergence of various algorithms, is established for both the symmetric and asymmetric problems when the rank-$2r$ RIP constant $\delta$ is less than $1/3$. The results of this paper significantly extend several existing bounds in the literature.
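For context, the following is a minimal sketch of the factorized formulation and one standard form of the rank-$2r$ RIP condition that the abstract refers to; the paper works with a general smooth objective $f$, and the notation below is illustrative rather than the paper's exact statement.

\[
\min_{X \in \mathbb{R}^{n \times r}} f(XX^\top) \quad \text{(symmetric)},
\qquad
\min_{U \in \mathbb{R}^{n \times r},\, V \in \mathbb{R}^{m \times r}} f(UV^\top) \quad \text{(asymmetric)},
\]

where $f$ is said to satisfy the rank-$2r$ RIP with constant $\delta \in [0,1)$ if, for all matrices $M$ and $K$ of rank at most $2r$,

\[
(1-\delta)\,\|K\|_F^2 \;\le\; [\nabla^2 f(M)](K,K) \;\le\; (1+\delta)\,\|K\|_F^2 .
\]

In the special case of linear measurements, $f(M) = \tfrac{1}{2}\|\mathcal{A}(M) - b\|_2^2$, this reduces to the classical RIP bound $(1-\delta)\|K\|_F^2 \le \|\mathcal{A}(K)\|_2^2 \le (1+\delta)\|K\|_F^2$ on the measurement operator $\mathcal{A}$.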

Cite

Text

Zhang et al. "General Low-Rank Matrix Optimization: Geometric Analysis and Sharper Bounds." Neural Information Processing Systems, 2021.

Markdown

[Zhang et al. "General Low-Rank Matrix Optimization: Geometric Analysis and Sharper Bounds." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/zhang2021neurips-general/)

BibTeX

@inproceedings{zhang2021neurips-general,
  title     = {{General Low-Rank Matrix Optimization: Geometric Analysis and Sharper Bounds}},
  author    = {Zhang, Haixiang and Bi, Yingjie and Lavaei, Javad},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/zhang2021neurips-general/}
}