Mamba Modulation: On the Length Generalization of Mamba Models
Abstract
The quadratic complexity of the attention mechanism in Transformer models has motivated the development of alternative architectures with sub-quadratic scaling, such as state-space models. Among these, Mamba has emerged as a leading architecture, achieving state-of-the-art results across a range of language modeling tasks. However, Mamba’s performance deteriorates significantly when it is applied to contexts longer than those seen during pre-training, revealing a sharp sensitivity to context length extension. Through detailed analysis, we attribute this limitation to the out-of-distribution behavior of its state-space dynamics, particularly within the parameterization of the state transition matrix $A$. Unlike recent works that attribute this sensitivity to the vanishing of the accumulated discretization time steps, $\exp(-\sum_{t=1}^{N}\Delta_t)$, we establish a connection between the convergence behavior of the state as the input length approaches infinity and the spectrum of the transition matrix $A$, offering a well-founded explanation of its role in length extension. To overcome this challenge, we propose an approach that enables robust long-context generalization by applying spectrum scaling to pre-trained Mamba models, selectively modulating the spectrum of the $A$ matrix in each layer. We show that this significantly improves performance in settings where simply modulating $\Delta_t$ fails, validating our insights and providing avenues for better length generalization in state-space models with structured transition matrices.
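To make the proposed intervention concrete, the sketch below shows one way such spectrum scaling could be applied to a pre-trained Mamba model. It relies on the standard Mamba parameterization, in which each mixer layer stores a parameter `A_log` and materializes the diagonal transition matrix as $A = -\exp(A_{\text{log}})$, so multiplying every eigenvalue of $A$ by a factor $\alpha$ reduces to adding $\log \alpha$ to `A_log`. The helper name `scale_mamba_spectrum` and the single uniform factor `alpha` are illustrative assumptions, not the paper's API; the paper modulates the spectrum selectively per layer rather than with one global factor.

```python
import math
import torch

@torch.no_grad()
def scale_mamba_spectrum(model: torch.nn.Module, alpha: float) -> None:
    """Hypothetical helper: scale the spectrum of every transition matrix A.

    Assumes the common Mamba parameterization A = -exp(A_log) (diagonal,
    negative real eigenvalues). Scaling each eigenvalue by alpha is then
    equivalent to shifting A_log by log(alpha): alpha < 1 slows state decay
    (longer memory at extended context lengths), alpha > 1 speeds it up.
    """
    assert alpha > 0, "spectral scale factor must be positive"
    for name, param in model.named_parameters():
        # In common implementations each Mamba mixer exposes one A_log,
        # e.g. "backbone.layers.0.mixer.A_log".
        if name.endswith("A_log"):
            param.add_(math.log(alpha))  # A <- alpha * A in eigenvalue space
```

For example, calling `scale_mamba_spectrum(model, alpha=0.5)` before evaluating on contexts longer than the pre-training length would halve the magnitude of every eigenvalue of $A$, slowing state convergence; the paper's selective, per-layer modulation can be read as a refinement of this uniform version.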
Cite

Text:
Lu et al. "Mamba Modulation: On the Length Generalization of Mamba Models." Advances in Neural Information Processing Systems, 2025.

Markdown:
[Lu et al. "Mamba Modulation: On the Length Generalization of Mamba Models." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/lu2025neurips-mamba/)

BibTeX:
@inproceedings{lu2025neurips-mamba,
title = {{Mamba Modulation: On the Length Generalization of Mamba Models}},
author = {Lu, Peng and Huang, Jerry and Zeng, Qiuhao and Wang, Xinyu and Chen, Boxing and Langlais, Philippe and Cui, Yufei},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/lu2025neurips-mamba/}
}