Nonconvex Low-Rank Tensor Completion from Noisy Data

Abstract

We study a completion problem of broad practical interest: the reconstruction of a low-rank symmetric tensor from highly incomplete and randomly corrupted observations of its entries. While a variety of prior work has been dedicated to this problem, prior algorithms either are computationally too expensive for large-scale applications, or come with sub-optimal statistical guarantees. Focusing on "incoherent" and well-conditioned tensors of constant CP rank, we propose a two-stage nonconvex algorithm, (vanilla) gradient descent following a rough initialization, that achieves the best of both worlds. Specifically, the proposed nonconvex algorithm faithfully completes the tensor and retrieves all low-rank tensor factors within nearly linear time, while at the same time enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal $\ell_2$ and $\ell_{\infty}$ statistical accuracy). The insights conveyed through our analysis of nonconvex optimization might have implications for other tensor estimation problems.
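To make the two-stage recipe concrete, here is a minimal NumPy sketch in the same spirit: a rough spectral-style initialization from the mode-1 unfolding of the zero-filled observations, followed by plain gradient descent on the squared loss over observed entries of a symmetric order-3 CP tensor. This is an illustrative stand-in, not the paper's exact procedure; the initialization, step size, and sign-fixing heuristic below are assumptions chosen for simplicity.

```python
import numpy as np

def cp3(U):
    """Symmetric order-3 CP tensor: T[i,j,l] = sum_k U[i,k]*U[j,k]*U[l,k]."""
    return np.einsum('ik,jk,lk->ijl', U, U, U)

def loss(U, T_obs, mask):
    """Squared loss over the observed entries only."""
    R = (cp3(U) - T_obs) * mask
    return 0.5 * np.sum(R ** 2)

def grad(U, T_obs, mask):
    """Exact gradient of `loss` w.r.t. the factor matrix U (d x r)."""
    R = (cp3(U) - T_obs) * mask
    return (np.einsum('ijl,jk,lk->ik', R, U, U)
            + np.einsum('ijl,ik,lk->jk', R, U, U)
            + np.einsum('ijl,ik,jk->lk', R, U, U))

def spectral_init(T_obs, mask, rank, p):
    """Rough init from the mode-1 unfolding (a stand-in for the paper's
    initialization); p is the observation probability."""
    d = T_obs.shape[0]
    M = (T_obs * mask).reshape(d, d * d) / p
    Usvd, s, _ = np.linalg.svd(M, full_matrices=False)
    U0 = Usvd[:, :rank] * np.cbrt(s[:rank])
    # Odd-order tensors are sign-sensitive: flip each column if that
    # lowers the observed-entry loss.
    for k in range(rank):
        V = U0.copy()
        V[:, k] = -V[:, k]
        if loss(V, T_obs, mask) < loss(U0, T_obs, mask):
            U0 = V
    return U0

def complete(T_obs, mask, rank, p, step=0.1, iters=300):
    """Stage 1: rough initialization. Stage 2: vanilla gradient descent."""
    U = spectral_init(T_obs, mask, rank, p)
    for _ in range(iters):
        U = U - step * grad(U, T_obs, mask)
    return U
```

On a small noiseless rank-1 instance (e.g. `d = 12`, sampling rate `p = 0.6`), this sketch typically drives the relative recovery error of the full tensor close to zero, mirroring the exact-recovery regime the guarantees describe; step size and iteration count here are illustrative defaults, not tuned constants from the analysis.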

Cite

Text

Cai et al. "Nonconvex Low-Rank Tensor Completion from Noisy Data." Neural Information Processing Systems, 2019.

Markdown

[Cai et al. "Nonconvex Low-Rank Tensor Completion from Noisy Data." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/cai2019neurips-nonconvex/)

BibTeX

@inproceedings{cai2019neurips-nonconvex,
  title     = {{Nonconvex Low-Rank Tensor Completion from Noisy Data}},
  author    = {Cai, Changxiao and Li, Gen and Poor, H. Vincent and Chen, Yuxin},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {1863--1874},
  url       = {https://mlanthology.org/neurips/2019/cai2019neurips-nonconvex/}
}