Simple and Nearly-Optimal Sampling for Rank-1 Tensor Completion via Gauss-Jordan
Abstract
We revisit the sample and computational complexity of the rank-1 tensor completion problem in $\otimes_{i=1}^{N} \mathbb{R}^{d}$, given a uniformly sampled subset of entries. We present a characterization of the problem which reduces it to solving a pair of random linear systems. For example, when $N$ is a constant, we prove that no more than $m = O(d^2 \log d)$ samples and $O(md^2)$ runtime are required. Moreover, we show that a broad class of algorithms requires $\Omega(d\log d)$ samples, even in higher-rank scenarios. In contrast, existing upper bounds on the sample complexity are at least as large as $d^{1.5} \mu^{\Omega(1)} \log^{\Omega(1)} d$, where $\mu$ can be $\Theta(d)$ in the worst case. Prior work obtained these looser guarantees for higher-rank versions of our problem, and tends to involve more complicated algorithms.
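To make the "pair of random linear systems" reduction concrete, here is a hedged sketch of one classical instantiation for $N = 3$ (the paper's actual construction may differ): for a rank-1 tensor $T = u \otimes v \otimes w$, each observed entry gives $\log|T_{ijk}| = \log|u_i| + \log|v_j| + \log|w_k|$, a linear equation in the log-magnitudes of the factors; the signs of the entries form a second linear system over $\mathrm{GF}(2)$, which we sidestep below by assuming positive factors. Least squares stands in for Gauss-Jordan elimination.

```python
import numpy as np

# Illustrative assumption, not the paper's algorithm: recover a positive
# rank-1 tensor T = u (x) v (x) w from uniformly sampled entries by solving
# the linear system in log-space.
rng = np.random.default_rng(0)
d, N = 8, 3
factors = [rng.uniform(0.5, 2.0, d) for _ in range(N)]  # positive rank-1 factors

m = 4 * d * N                            # oversample so the system is consistent
idx = rng.integers(0, d, size=(m, N))    # uniformly sampled multi-indices

# Design matrix: one row per sample, a single 1 in each mode's block of d columns.
A = np.zeros((m, d * N))
for r, multi in enumerate(idx):
    for mode, i in enumerate(multi):
        A[r, mode * d + i] = 1.0
# Right-hand side: log of each observed entry.
b = np.array([sum(np.log(factors[mode][i]) for mode, i in enumerate(multi))
              for multi in idx])

# Min-norm least-squares solution (the system is consistent by construction;
# the nullspace only reflects the per-mode scaling ambiguity of rank-1 factors).
x, *_ = np.linalg.lstsq(A, b, rcond=None)
rec = [np.exp(x[mode * d:(mode + 1) * d]) for mode in range(N)]

# Reconstructed entries agree with the truth on every sampled index.
err = max(abs(np.prod([factors[mode][i] for mode, i in enumerate(multi)])
              - np.prod([rec[mode][i] for mode, i in enumerate(multi)]))
          for multi in idx)
print(err)
```

Note that the individual factors are only identifiable up to per-mode rescalings whose product is 1, which is why the check above compares reconstructed entries rather than the factors themselves.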
Cite

Text

Gomez-Leos and Lopez. "Simple and Nearly-Optimal Sampling for Rank-1 Tensor Completion via Gauss-Jordan." Transactions on Machine Learning Research, 2025.

Markdown

[Gomez-Leos and Lopez. "Simple and Nearly-Optimal Sampling for Rank-1 Tensor Completion via Gauss-Jordan." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/gomezleos2025tmlr-simple/)

BibTeX
@article{gomezleos2025tmlr-simple,
title = {{Simple and Nearly-Optimal Sampling for Rank-1 Tensor Completion via Gauss-Jordan}},
author = {Gomez-Leos, Alejandro and Lopez, Oscar},
journal = {Transactions on Machine Learning Research},
year = {2025},
url = {https://mlanthology.org/tmlr/2025/gomezleos2025tmlr-simple/}
}