Understanding Input Selectivity in Mamba: Impact on Approximation Power, Memorization, and Associative Recall Capacity
Abstract
State-Space Models (SSMs), and particularly Mamba, have recently emerged as a promising alternative to Transformers. Mamba introduces input selectivity to its SSM layer (S6) and incorporates convolution and gating into its block definition. While these modifications do improve Mamba’s performance over its SSM predecessors, it remains largely unclear how Mamba leverages the additional functionalities provided by input selectivity, and how these interact with the other operations in the Mamba architecture. In this work, we demystify the role of input selectivity in Mamba, investigating its impact on function approximation power, long-term memorization, and associative recall capabilities. In particular: (i) we prove that the S6 layer of Mamba can represent projections onto Haar wavelets, providing an edge over its Diagonal SSM (S4D) predecessor in approximating discontinuous functions commonly arising in practice; (ii) we show how the S6 layer can dynamically counteract memory decay; (iii) we provide analytical solutions to the MQAR associative recall task using the Mamba architecture with different mixers — Mamba, Mamba-2, and S4D. We demonstrate the tightness of our theoretical constructions with empirical results on concrete tasks. Our findings offer a mechanistic understanding of Mamba and reveal opportunities for improvement.
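To make "input selectivity" concrete, the following is a minimal NumPy sketch contrasting an S4D-style recurrence, whose transition, input, and readout parameters are fixed, with an S6-style recurrence, where the step size and the input/readout matrices are computed from the current token. This is an illustrative sketch only, not the paper's construction: the parameter names (W_B, W_C, W_dt), the scalar-per-channel shapes, and the simple Euler-style discretization are assumptions made for brevity.

import numpy as np

def s4d_scan(x, A, B, C, dt):
    """S4D-style scan: A, B, C and the step size dt are fixed (input-independent)."""
    L, = x.shape
    N = A.shape[0]
    Abar = np.exp(dt * A)            # discretized diagonal transition
    Bbar = dt * B                    # simple Euler-style discretization (illustrative)
    h = np.zeros(N)
    y = np.zeros(L)
    for t in range(L):
        h = Abar * h + Bbar * x[t]   # same decay/update rule at every step
        y[t] = C @ h
    return y

def s6_scan(x, A, W_B, W_C, W_dt):
    """S6-style scan: dt, B, C are functions of the input x_t (input selectivity)."""
    L, = x.shape
    N = A.shape[0]
    h = np.zeros(N)
    y = np.zeros(L)
    for t in range(L):
        dt = np.log1p(np.exp(W_dt * x[t]))   # softplus step size, chosen per token
        Bt = W_B * x[t]                      # input-dependent input matrix
        Ct = W_C * x[t]                      # input-dependent readout
        # A small dt keeps exp(dt * A) near 1 (little decay, memory preserved);
        # a large dt decays the state strongly while writing the new token.
        h = np.exp(dt * A) * h + dt * Bt * x[t]
        y[t] = Ct @ h
    return y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    L, N = 16, 4
    x = rng.standard_normal(L)
    A = -np.exp(rng.standard_normal(N))      # stable diagonal (negative real part)
    B, C = rng.standard_normal(N), rng.standard_normal(N)
    print(s4d_scan(x, A, B, C, dt=0.1)[:4])
    print(s6_scan(x, A, W_B=B, W_C=C, W_dt=0.5)[:4])

Because dt, B_t, and C_t vary with the token, the S6 scan can choose per step how much of the past to retain, which is the mechanism behind the paper's claim that selectivity can dynamically counteract memory decay.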
Cite
Text

Huang et al. "Understanding Input Selectivity in Mamba: Impact on Approximation Power, Memorization, and Associative Recall Capacity." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Huang et al. "Understanding Input Selectivity in Mamba: Impact on Approximation Power, Memorization, and Associative Recall Capacity." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/huang2025icml-understanding/)

BibTeX
@inproceedings{huang2025icml-understanding,
  title = {{Understanding Input Selectivity in Mamba: Impact on Approximation Power, Memorization, and Associative Recall Capacity}},
  author = {Huang, Ningyuan Teresa and Sarabia, Miguel and Moudgil, Abhinav and Rodriguez, Pau and Zappella, Luca and Danieli, Federico},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year = {2025},
  pages = {25693--25727},
  volume = {267},
  url = {https://mlanthology.org/icml/2025/huang2025icml-understanding/}
}