Low-Rank Fully-Connected Tensor Network Learning for Tensor-on-Tensor Regression

Abstract

Fully-connected tensor network (FCTN) decomposition generalizes the popular tensor train and tensor ring decompositions. Based on this decomposition, we propose a general multi-linear tensor-on-tensor regression model. The alternating least squares (ALS) method is employed to obtain the ridge estimate, and the explicit structures of the coefficient matrices of the ALS subproblems are also explored. To reveal these structures, we adopt a tensor product called the subnetwork product and adjust the sizes of the FCTN factors appropriately. These structures are then exploited to accelerate the computation of the Gramians and the other terms needed in the ALS method, the latter of which also invokes leverage sampling. Extensive experiments on synthetic and real data demonstrate the superior performance of our model and methods.
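To make the ALS ridge-estimation idea concrete, the sketch below works through the simplest special case: a rank-r matrix coefficient W = U V^T in a linear model Y ≈ X W, fit by alternating ridge least squares. This is only an illustrative analogue of the abstract's approach, not the paper's FCTN algorithm; all variable names, sizes, the rank `r`, and the penalty `lam` are assumptions for the example.

```python
import numpy as np

# Hedged sketch: ALS ridge estimation for a low-rank (matrix) regression
# Y ~ X @ U @ V.T. A simplified matrix analogue of tensor-on-tensor
# regression with a factorized coefficient; all sizes are illustrative.
rng = np.random.default_rng(0)
n, p, q, r, lam = 200, 30, 20, 4, 1e-2
U_true, V_true = rng.normal(size=(p, r)), rng.normal(size=(q, r))
X = rng.normal(size=(n, p))
Y = X @ U_true @ V_true.T + 0.01 * rng.normal(size=(n, q))

U, V = rng.normal(size=(p, r)), rng.normal(size=(q, r))
for _ in range(50):
    # U-step: normal equations (V'V (x) X'X + lam I) vec(U) = vec(X'YV),
    # using the column-major (Fortran) vec convention for the Kronecker identity.
    A = np.kron(V.T @ V, X.T @ X) + lam * np.eye(p * r)
    U = np.linalg.solve(A, (X.T @ Y @ V).ravel(order="F")).reshape(p, r, order="F")
    # V-step: with Z = X U fixed, this is an ordinary ridge regression for V.
    Z = X @ U
    V = np.linalg.solve(Z.T @ Z + lam * np.eye(r), Z.T @ Y).T

rel = np.linalg.norm(X @ U @ V.T - Y) / np.linalg.norm(Y)
print(f"relative residual: {rel:.4f}")
```

Each subproblem is a strictly convex ridge least-squares problem, so the objective decreases monotonically across the alternating sweeps; the structured Gramian computations and leverage sampling described in the abstract are what make the analogous FCTN subproblems tractable at scale.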

Cite

Text

Wang et al. "Low-Rank Fully-Connected Tensor Network Learning for Tensor-on-Tensor Regression." Machine Learning, 2026. doi:10.1007/s10994-025-06942-7

Markdown

[Wang et al. "Low-Rank Fully-Connected Tensor Network Learning for Tensor-on-Tensor Regression." Machine Learning, 2026.](https://mlanthology.org/mlj/2026/wang2026mlj-lowrank/) doi:10.1007/s10994-025-06942-7

BibTeX

@article{wang2026mlj-lowrank,
  title     = {{Low-Rank Fully-Connected Tensor Network Learning for Tensor-on-Tensor Regression}},
  author    = {Wang, Mengyu and Li, Fukang and Li, Hanyu},
  journal   = {Machine Learning},
  year      = {2026},
  pages     = {10},
  doi       = {10.1007/s10994-025-06942-7},
  volume    = {115},
  url       = {https://mlanthology.org/mlj/2026/wang2026mlj-lowrank/}
}