Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements

Abstract

We investigate the sample complexity of recovering tensors with low symmetric rank from symmetric rank-one measurements, a setting particularly motivated by the study of higher-order interactions in statistics and the analysis of two-layer polynomial neural networks. Using a covering number argument, we analyze the performance of the symmetric rank minimization program and establish near-optimal sample complexity bounds when the underlying distribution is log-concave. Our measurement model involves random symmetric rank-one tensors, leading to involved probability calculations. To address these challenges, we employ the Carbery-Wright inequality, a powerful tool for studying anti-concentration properties of random polynomials, and leverage orthogonal polynomial expansions. Additionally, we provide a sample complexity lower bound via Fano's inequality, and discuss broader implications of our results for two-layer polynomial networks.
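For concreteness, a standard formalization of this setting reads as follows; the notation below (order $d$, ambient dimension $n$, true tensor $X^\star$, sensing vectors $a_i$) is illustrative and may differ from the paper's exact conventions:

```latex
% Symmetric rank-one measurements of an unknown order-d tensor
% X* in (R^n)^{\otimes d} with low symmetric rank:
%   y_i = <X*, a_i^{\otimes d}>,   i = 1, ..., m,
% where a_1, ..., a_m are i.i.d. random vectors (here, drawn from
% a log-concave distribution). Recovery is via the symmetric rank
% minimization program
\begin{equation*}
  \widehat{X} \in \arg\min_{X} \ \mathrm{rank}_{\mathrm{sym}}(X)
  \quad \text{subject to} \quad
  \langle X, a_i^{\otimes d} \rangle = y_i, \ \ 1 \le i \le m,
\end{equation*}
% Note that each measurement <X, a^{\otimes d}> is a degree-d
% polynomial in the entries of a, which is why anti-concentration
% tools for random polynomials (Carbery-Wright) enter the analysis.
```

The comment on the last line records the connection the abstract draws: because each measurement is a random polynomial of the sensing vector, bounding the probability that distinct low-rank tensors produce nearly identical measurements requires polynomial anti-concentration estimates rather than standard concentration arguments.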

Cite

Text

Kızıldağ. "Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements." Proceedings of The 36th International Conference on Algorithmic Learning Theory, 2025.

Markdown

[Kızıldağ. "Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements." Proceedings of The 36th International Conference on Algorithmic Learning Theory, 2025.](https://mlanthology.org/alt/2025/kzldag2025alt-informationtheoretic/)

BibTeX

@inproceedings{kzldag2025alt-informationtheoretic,
  title     = {{Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements}},
  author    = {Kızıldağ, Eren C.},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  year      = {2025},
  pages     = {649--652},
  volume    = {272},
  url       = {https://mlanthology.org/alt/2025/kzldag2025alt-informationtheoretic/}
}