Fast Tensor Completion via Approximate Richardson Iteration
Abstract
We study tensor completion (TC) through the lens of low-rank tensor decomposition (TD). Many TD algorithms use fast alternating minimization methods to solve highly structured linear regression problems at each step (e.g., for CP, Tucker, and tensor-train decompositions). However, such algebraic structure is often lost in TC regression problems, making direct extensions unclear. This work proposes a novel lifting method for approximately solving TC regression problems using structured TD regression algorithms as blackbox subroutines, enabling sublinear-time methods. We analyze the convergence rate of our approximate Richardson iteration-based algorithm, and our empirical study shows that it can be 100x faster than direct methods for CP completion on real-world tensors.
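The base iteration underlying the method is classical Richardson iteration for least squares. As a minimal illustrative sketch (not the paper's structured TC solver, which replaces the exact update with fast TD regression subroutines), the residual-correction update x_{k+1} = x_k + ω·Aᵀ(b − A·x_k) can be written as:

```python
# Illustrative sketch of classical Richardson iteration on the normal
# equations of min_x ||Ax - b||_2. This only demonstrates the base
# iteration that the paper's lifting method solves approximately with
# structured tensor-decomposition regression as a blackbox subroutine.
import numpy as np

def richardson_lstsq(A, b, num_iters=2000):
    # Step size omega < 2 / lambda_max(A^T A) guarantees convergence;
    # here we use 1 / sigma_max(A)^2.
    omega = 1.0 / np.linalg.norm(A, ord=2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        x = x + omega * (A.T @ (b - A @ x))  # residual-correction step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
x_hat = richardson_lstsq(A, b)
```

Each step costs only a matrix-vector product with A and Aᵀ; the paper's contribution is showing this outer loop still converges when that product is itself computed approximately by a fast structured solver.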
Cite
Text
Ghadiri et al. "Fast Tensor Completion via Approximate Richardson Iteration." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Ghadiri et al. "Fast Tensor Completion via Approximate Richardson Iteration." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/ghadiri2025icml-fast/)
BibTeX
@inproceedings{ghadiri2025icml-fast,
  title = {{Fast Tensor Completion via Approximate Richardson Iteration}},
  author = {Ghadiri, Mehrdad and Fahrbach, Matthew and Kook, Yunbum and Jadbabaie, Ali},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year = {2025},
  pages = {19248--19265},
  volume = {267},
  url = {https://mlanthology.org/icml/2025/ghadiri2025icml-fast/}
}