The Computational Limits of State-Space Models and Mamba via the Lens of Circuit Complexity

Abstract

In this paper, we analyze the computational limitations of Mamba and State-Space Models (SSMs) through the framework of circuit complexity. Despite Mamba's stateful design and its recent attention as a strong candidate to outperform Transformers, we demonstrate that both Mamba and SSMs with $\mathrm{poly}(n)$-precision and constant-depth layers reside within the $\mathsf{DLOGTIME}$-uniform $\mathsf{TC}^0$ complexity class. This result indicates that, in theory, Mamba has the same computational capability as Transformers: unless $\mathsf{TC}^0 = \mathsf{NC}^1$, it cannot solve problems such as arithmetic formula evaluation, the Boolean formula value problem, and permutation composition. It therefore challenges the assumption that Mamba is more computationally expressive than Transformers. Our contributions include rigorous proofs showing that Selective SSM and Mamba architectures can be simulated by $\mathsf{DLOGTIME}$-uniform $\mathsf{TC}^0$ circuits, and hence cannot solve problems outside $\mathsf{TC}^0$.
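The abstract's main containment claim can be written informally as follows. This is a paraphrase of the abstract, not the paper's exact theorem statement:

```latex
% Informal statement (paraphrased from the abstract; see the paper for the
% precise theorem). For every Mamba or Selective SSM model with constant
% depth and poly(n)-bit precision, there is a DLOGTIME-uniform family of
% constant-depth, polynomial-size threshold circuits computing it:
\[
  \mathsf{Mamba}\bigl[\mathrm{poly}(n)\text{-precision},\ O(1)\text{-depth}\bigr]
  \subseteq \mathsf{DLOGTIME}\text{-uniform}\ \mathsf{TC}^0 .
\]
% Consequently, unless TC^0 = NC^1, such models cannot solve NC^1-hard
% problems, e.g. the Boolean formula value problem.
```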

Cite

Text

Chen et al. "The Computational Limits of State-Space Models and Mamba via the Lens of Circuit Complexity." Conference on Parsimony and Learning, 2025.

Markdown

[Chen et al. "The Computational Limits of State-Space Models and Mamba via the Lens of Circuit Complexity." Conference on Parsimony and Learning, 2025.](https://mlanthology.org/cpal/2025/chen2025cpal-computational/)

BibTeX

@inproceedings{chen2025cpal-computational,
  title     = {{The Computational Limits of State-Space Models and Mamba via the Lens of Circuit Complexity}},
  author    = {Chen, Yifang and Li, Xiaoyu and Liang, Yingyu and Shi, Zhenmei and Song, Zhao},
  booktitle = {Conference on Parsimony and Learning},
  year      = {2025},
  pages     = {739--767},
  volume    = {280},
  url       = {https://mlanthology.org/cpal/2025/chen2025cpal-computational/}
}