Efficient Leverage Score Sampling for Tensor Train Decomposition
Abstract
Tensor Train (TT) decomposition is widely used in the machine learning and quantum physics communities to efficiently compress high-dimensional tensor data. In this paper, we propose an efficient algorithm to accelerate the computation of the TT decomposition with the Alternating Least Squares (ALS) algorithm, relying on exact leverage score sampling. To this end, we propose a data structure that allows us to efficiently sample from the tensor with time complexity logarithmic in the product of the tensor dimensions. Our contribution specifically leverages the canonical form of the TT decomposition. By maintaining the canonical form through each iteration of ALS, we can efficiently compute (and sample from) the leverage scores, thus achieving a significant speed-up in solving each sketched least-squares problem. Experiments on synthetic and real dense and sparse tensors demonstrate that our method outperforms SVD-based and ALS-based algorithms.
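To make the core idea concrete, below is a minimal NumPy sketch of exact leverage score sampling applied to a single overdetermined least-squares solve, a stand-in for one sketched ALS subproblem. All sizes, names, and the QR-based score computation here are illustrative assumptions, not the paper's data structure or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall least-squares problem min_x ||A x - b||_2 (stand-in for one ALS subproblem).
m, n = 2000, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Exact leverage scores: squared row norms of an orthonormal basis Q of range(A).
Q, _ = np.linalg.qr(A)
lev = np.sum(Q**2, axis=1)   # nonnegative, sums to rank(A) = n
p = lev / lev.sum()          # sampling distribution over rows

# Sample s rows i.i.d. from p and rescale each by 1/sqrt(s * p_i)
# so the sketched normal equations are unbiased.
s = 200
idx = rng.choice(m, size=s, p=p)
scale = 1.0 / np.sqrt(s * p[idx])
SA = A[idx] * scale[:, None]
Sb = b[idx] * scale

# Solve the full and the sketched problems and compare residuals.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sketch, *_ = np.linalg.lstsq(SA, Sb, rcond=None)
res_full = np.linalg.norm(A @ x_full - b)
res_sketch = np.linalg.norm(A @ x_sketch - b)
print(res_sketch / res_full)  # close to 1 with high probability
```

The sketched problem has only `s` rows instead of `m`, yet leverage score sampling guarantees a near-optimal residual with high probability; the paper's contribution is computing and sampling from these scores efficiently for the TT/ALS design matrices, without forming them explicitly.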
Cite
Text
Bharadwaj et al. "Efficient Leverage Score Sampling for Tensor Train Decomposition." Neural Information Processing Systems, 2024. doi:10.52202/079017-2345

Markdown
[Bharadwaj et al. "Efficient Leverage Score Sampling for Tensor Train Decomposition." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/bharadwaj2024neurips-efficient/) doi:10.52202/079017-2345

BibTeX
@inproceedings{bharadwaj2024neurips-efficient,
title = {{Efficient Leverage Score Sampling for Tensor Train Decomposition}},
author = {Bharadwaj, Vivek and Rakhshan, Beheshteh T. and Malik, Osman Asif and Rabusseau, Guillaume},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-2345},
url = {https://mlanthology.org/neurips/2024/bharadwaj2024neurips-efficient/}
}