Provable Tensor Factorization with Missing Data

Abstract

We study the problem of low-rank tensor factorization in the presence of missing data. We ask the following question: how many sampled entries do we need to efficiently and exactly reconstruct a tensor with a low-rank orthogonal decomposition? We propose a novel alternating minimization-based method which iteratively refines estimates of the singular vectors. We show that under certain standard assumptions, our method can recover a three-mode $n\times n\times n$-dimensional rank-$r$ tensor exactly from $O(n^{3/2} r^5 \log^4 n)$ randomly sampled entries. In the process of proving this result, we solve two challenging sub-problems for tensors with missing data. First, in analyzing the initialization step, we prove a generalization of a celebrated result by Szemer\'edi et al. on the spectrum of random graphs. Next, we prove global convergence of alternating minimization with a good initialization. Simulations suggest that the dependence of the sample size on the dimensionality $n$ is indeed tight.
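The alternating-minimization idea the abstract describes can be illustrated with a short numpy sketch: fix two factor matrices, and update the third by solving a least-squares problem over only the observed entries, cycling over the three modes. This is a simplified illustration under assumptions not stated above (random initialization, plain least squares per row); the paper's actual algorithm additionally uses a careful spectral initialization and works with orthogonal factors.

```python
import numpy as np

def als_completion(T, mask, r, iters=30, seed=0):
    """Illustrative rank-r CP completion of a 3-mode tensor by
    alternating least squares over the observed entries (mask == True).
    A hedged sketch of the alternating-minimization idea, not the
    paper's exact method."""
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T.shape
    U = rng.standard_normal((n1, r))
    V = rng.standard_normal((n2, r))
    W = rng.standard_normal((n3, r))

    def update_mode(F, G, H, axis):
        # For each slice i along `axis`, fit F[i] by least squares
        # using only the entries observed in that slice.
        for i in range(F.shape[0]):
            obs = np.argwhere(mask.take(i, axis=axis))
            if len(obs) == 0:
                continue  # no observations constrain this row
            # Design rows are elementwise products of the other factors.
            X = G[obs[:, 0]] * H[obs[:, 1]]
            y = T.take(i, axis=axis)[obs[:, 0], obs[:, 1]]
            F[i], *_ = np.linalg.lstsq(X, y, rcond=None)

    for _ in range(iters):
        update_mode(U, V, W, 0)  # fix V, W; solve for U
        update_mode(V, U, W, 1)  # fix U, W; solve for V
        update_mode(W, U, V, 2)  # fix U, V; solve for W
    return U, V, W
```

The reconstruction is `np.einsum('ir,jr,kr->ijk', U, V, W)`; the paper's analysis shows that, with the right initialization, such alternating updates converge to the true factors once roughly $O(n^{3/2} r^5 \log^4 n)$ entries are observed.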

Cite

Text

Jain and Oh. "Provable Tensor Factorization with Missing Data." Neural Information Processing Systems, 2014.

Markdown

[Jain and Oh. "Provable Tensor Factorization with Missing Data." Neural Information Processing Systems, 2014.](https://mlanthology.org/neurips/2014/jain2014neurips-provable/)

BibTeX

@inproceedings{jain2014neurips-provable,
  title     = {{Provable Tensor Factorization with Missing Data}},
  author    = {Jain, Prateek and Oh, Sewoong},
  booktitle = {Neural Information Processing Systems},
  year      = {2014},
  pages     = {1431--1439},
  url       = {https://mlanthology.org/neurips/2014/jain2014neurips-provable/}
}