The AL$\ell_0$CORE Tensor Decomposition for Sparse Count Data

Abstract

This paper introduces AL$\ell_0$CORE, a new form of probabilistic non-negative tensor decomposition. AL$\ell_0$CORE is a Tucker decomposition that constrains the number of non-zero elements (i.e., the $\ell_0$-norm) of the core tensor to be at most $Q$. While the user dictates the total budget $Q$, the locations and values of the non-zero elements are latent variables allocated across the core tensor during inference. AL$\ell_0$CORE—i.e., allocated $\ell_0$-constrained core—thus enjoys both the computational tractability of canonical polyadic (CP) decomposition and the qualitatively appealing latent structure of Tucker. In a suite of real-data experiments, we demonstrate that AL$\ell_0$CORE typically requires only tiny fractions (e.g., 1%) of the core to achieve the same results as Tucker at a correspondingly small fraction of the cost.
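To make the structure concrete, here is a minimal NumPy sketch (not the paper's inference algorithm) of what a $Q$-sparse Tucker reconstruction looks like: the core is represented by $Q$ (location, value) pairs rather than a dense $R_1 \times R_2 \times R_3$ array, so reconstruction costs scale with $Q$ as in a CP decomposition with $Q$ components. All dimensions, ranks, and variable names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a 3-mode tensor of shape (I, J, K),
# Tucker ranks (R1, R2, R3), and a core sparsity budget Q.
I, J, K = 20, 30, 25
R1, R2, R3 = 5, 5, 5
Q = 4  # at most Q non-zero core entries (vs. R1*R2*R3 = 125 for dense Tucker)

# Non-negative factor matrices, one per mode.
A = rng.gamma(1.0, 1.0, size=(I, R1))
B = rng.gamma(1.0, 1.0, size=(J, R2))
C = rng.gamma(1.0, 1.0, size=(K, R3))

# A Q-sparse core: Q latent (location, value) pairs. In AL0CORE these
# are inferred; here they are drawn at random purely for illustration.
locs = [(rng.integers(R1), rng.integers(R2), rng.integers(R3)) for _ in range(Q)]
vals = rng.gamma(1.0, 1.0, size=Q)


def reconstruct(A, B, C, locs, vals):
    """Sum rank-1 terms over only the Q non-zero core entries."""
    X = np.zeros((A.shape[0], B.shape[0], C.shape[0]))
    for (r1, r2, r3), g in zip(locs, vals):
        X += g * np.einsum('i,j,k->ijk', A[:, r1], B[:, r2], C[:, r3])
    return X


X_hat = reconstruct(A, B, C, locs, vals)
```

Summing $Q$ rank-1 terms gives exactly the same result as contracting the factor matrices against a dense core whose only non-zero entries are at `locs`, which is why the sparse representation loses nothing structurally while avoiding the full Tucker cost.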

Cite

Text

Hood and Schein. "The AL$\ell_0$CORE Tensor Decomposition for Sparse Count Data." Artificial Intelligence and Statistics, 2024.

Markdown

[Hood and Schein. "The AL$\ell_0$CORE Tensor Decomposition for Sparse Count Data." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/hood2024aistats-al/)

BibTeX

@inproceedings{hood2024aistats-al,
  title     = {{The AL$\ell_0$CORE Tensor Decomposition for Sparse Count Data}},
  author    = {Hood, John and Schein, Aaron J.},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2024},
  pages     = {4654--4662},
  volume    = {238},
  url       = {https://mlanthology.org/aistats/2024/hood2024aistats-al/}
}