Low-Rank Regularization and Solution Uniqueness in Over-Parameterized Matrix Sensing
Abstract
We consider the question of whether algorithmic choices in over-parameterized linear matrix factorization introduce implicit low-rank regularization. We focus on the noiseless matrix sensing scenario over low-rank positive semi-definite (PSD) matrices over the reals, with a sensing mechanism that satisfies restricted isometry properties. Surprisingly, it was recently argued that, for the recovery of PSD matrices, gradient descent over a squared, *full-rank* factorized space introduces implicit low-rank regularization. Thus, a clever choice of the recovery algorithm avoids the need for explicit low-rank regularization. In this contribution, we prove that, in fact, under certain conditions, the PSD constraint by itself is sufficient to lead to a unique low-rank matrix recovery, without explicit or implicit regularization. Therefore, under these conditions, the set of PSD matrices that are consistent with the observed data is a singleton, regardless of the algorithm used. Our numerical study indicates that this result is general and extends to cases beyond those covered by the proof.
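The setting described in the abstract can be illustrated with a small numerical sketch: gradient descent on a squared, full-rank factorization X = FF^T, fit to noiseless linear measurements of a low-rank PSD ground truth. This is not code from the paper; the dimensions, step size, Gaussian sensing ensemble, and initialization scale are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 8, 1, 80  # matrix size, true rank, number of measurements (toy choices)

# Ground-truth rank-r PSD matrix, normalized to unit Frobenius norm.
U = rng.standard_normal((n, r))
X_star = U @ U.T
X_star /= np.linalg.norm(X_star)

# Symmetric Gaussian sensing matrices A_i and noiseless measurements
# y_i = <A_i, X_star>.
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2
y = np.einsum('kij,ij->k', A, X_star)

# Gradient descent on the full-rank factorized objective
#   f(F) = (1/2m) * sum_i (<A_i, F F^T> - y_i)^2,
# starting from a small random initialization.
F = 1e-3 * rng.standard_normal((n, n))
eta = 0.2
for _ in range(3000):
    resid = np.einsum('kij,ij->k', A, F @ F.T) - y
    grad = (2.0 / m) * np.einsum('k,kij->ij', resid, A) @ F
    F -= eta * grad

X_hat = F @ F.T
rel_err = np.linalg.norm(X_hat - X_star) / np.linalg.norm(X_star)
print(f"relative recovery error: {rel_err:.2e}")
```

In this toy instance the iterate converges to the low-rank ground truth; consistent with the paper's thesis, any PSD matrix fitting these measurements coincides with it, so the recovery here does not hinge on the particular algorithm.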
Cite
Text
Geyer et al. "Low-Rank Regularization and Solution Uniqueness in Over-Parameterized Matrix Sensing." Artificial Intelligence and Statistics, 2020.
Markdown
[Geyer et al. "Low-Rank Regularization and Solution Uniqueness in Over-Parameterized Matrix Sensing." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/geyer2020aistats-lowrank/)
BibTeX
@inproceedings{geyer2020aistats-lowrank,
title = {{Low-Rank Regularization and Solution Uniqueness in Over-Parameterized Matrix Sensing}},
author = {Geyer, Kelly and Kyrillidis, Anastasios and Kalev, Amir},
booktitle = {Artificial Intelligence and Statistics},
year = {2020},
pages = {930-940},
volume = {108},
url = {https://mlanthology.org/aistats/2020/geyer2020aistats-lowrank/}
}