A Random Matrix Approach to Low-Multilinear-Rank Tensor Approximation
Abstract
This work presents a comprehensive understanding of the estimation of a planted low-rank signal from a general spiked tensor model near the computational threshold. Relying on standard tools from the theory of large random matrices, we characterize the large-dimensional spectral behavior of the unfoldings of the data tensor and exhibit relevant signal-to-noise ratios governing the detectability of the principal directions of the signal. These results make it possible to accurately predict the reconstruction performance of truncated multilinear SVD (MLSVD) in the non-trivial regime. This is particularly important since it serves as an initialization of the higher-order orthogonal iteration (HOOI) scheme, whose convergence to the best low-multilinear-rank approximation depends entirely on its initialization. We give a sufficient condition for the convergence of HOOI and show that the number of iterations before convergence tends to $1$ in the large-dimensional limit.
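To make the MLSVD-initialized HOOI scheme discussed in the abstract concrete, here is a minimal NumPy sketch. It is not the paper's implementation: the function names, the fixed iteration count, and the use of a plain truncated SVD of each unfolding for the initialization are illustrative assumptions.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front, then flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mlsvd_init(T, ranks):
    # Truncated MLSVD: top-r_n left singular vectors of each mode-n unfolding.
    return [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
            for n, r in enumerate(ranks)]

def multilinear_prod(T, mats):
    # Apply mats[m] along mode m of T (mode-m product for every mode).
    for m, M in enumerate(mats):
        T = np.moveaxis(np.tensordot(M, T, axes=(1, m)), 0, m)
    return T

def hooi(T, ranks, n_iter=10):
    # Higher-order orthogonal iteration, initialized by truncated MLSVD.
    U = mlsvd_init(T, ranks)
    for _ in range(n_iter):
        for n in range(T.ndim):
            # Project T on the other modes' current factors, then update
            # U[n] with the top-r_n left singular vectors of the unfolding.
            mats = [U[m].T if m != n else np.eye(T.shape[n])
                    for m in range(T.ndim)]
            W = multilinear_prod(T, mats)
            U[n] = np.linalg.svd(unfold(W, n),
                                 full_matrices=False)[0][:, :ranks[n]]
    core = multilinear_prod(T, [u.T for u in U])
    return core, U
```

On a tensor whose multilinear rank matches `ranks` exactly, `multilinear_prod(core, U)` reconstructs the input up to numerical precision; in the noisy spiked model, the reconstruction quality is what the paper's random matrix analysis predicts.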
Cite
Text
Lebeau et al. "A Random Matrix Approach to Low-Multilinear-Rank Tensor Approximation." Journal of Machine Learning Research, 2025.

Markdown
[Lebeau et al. "A Random Matrix Approach to Low-Multilinear-Rank Tensor Approximation." Journal of Machine Learning Research, 2025.](https://mlanthology.org/jmlr/2025/lebeau2025jmlr-random/)

BibTeX
@article{lebeau2025jmlr-random,
  title   = {{A Random Matrix Approach to Low-Multilinear-Rank Tensor Approximation}},
  author  = {Lebeau, Hugo and Chatelain, Florent and Couillet, Romain},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  pages   = {1--64},
  url     = {https://mlanthology.org/jmlr/2025/lebeau2025jmlr-random/}
}