Hankel Singular Value Regularization for Highly Compressible State Space Models

Abstract

Deep neural networks that use state space models as layers are well suited for long-range sequence tasks but can be challenging to compress after training. We exploit the fact that regularizing the sum of the Hankel singular values of a state space model induces a fast decay of these singular values and thus yields compressible models. To make the proposed Hankel singular value regularization scalable, we develop an algorithm that efficiently computes the Hankel singular values during training iterations by exploiting the specific block-diagonal structure of the system matrices in our state space model parametrization. Experiments on Long Range Arena benchmarks demonstrate that the regularized state space layers are up to 10× more compressible than standard state space layers while maintaining high accuracy.
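For context, the Hankel singular values of a stable linear state space model are the square roots of the eigenvalues of the product of its controllability and observability Gramians, which are defined by Lyapunov equations. When the system matrix is diagonal (a special case of the block-diagonal structure mentioned above), those Lyapunov equations have entrywise closed-form solutions, which is what makes per-iteration computation cheap. Below is a minimal NumPy sketch of that computation; the function name and the diagonal continuous-time parametrization are illustrative assumptions, not the paper's exact block-diagonal algorithm.

```python
import numpy as np

def hankel_singular_values_diag(a, B, C):
    """Hankel singular values of a stable continuous-time SSM
    dx/dt = diag(a) x + B u,  y = C x  (sketch; diagonal A assumed).

    For A = diag(a), the Lyapunov equations
        A P + P A* + B B* = 0   and   A* Q + Q A + C* C = 0
    have closed-form (Cauchy-structured) solutions
        P_ij = -(B B*)_ij / (a_i + conj(a_j)),
        Q_ij = -(C* C)_ij / (conj(a_i) + a_j),
    so no general Lyapunov solver is needed. The Hankel singular
    values are the square roots of the eigenvalues of P @ Q.
    """
    denom = a[:, None] + np.conj(a)[None, :]        # entry (i, j): a_i + conj(a_j)
    P = -(B @ B.conj().T) / denom                   # controllability Gramian
    Q = -(C.conj().T @ C) / np.conj(denom)          # observability Gramian
    ev = np.linalg.eigvals(P @ Q)                   # real and nonnegative for stable systems
    return np.sort(np.sqrt(np.abs(ev)))[::-1]       # abs() guards against numerical noise

# Usage on a random stable diagonal SSM (poles with negative real part):
rng = np.random.default_rng(0)
n, m, p = 8, 1, 1
a = -rng.uniform(0.5, 2.0, n) + 1j * rng.uniform(-1.0, 1.0, n)
B = rng.standard_normal((n, m)).astype(complex)
C = rng.standard_normal((p, n)).astype(complex)
print(hankel_singular_values_diag(a, B, C))
```

A regularizer in the spirit of the abstract would then add the sum of these values, `hankel_singular_values_diag(a, B, C).sum()`, to the training loss, encouraging fast singular value decay and hence low-order reduced models.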

Cite

Text

Schwerdtner et al. "Hankel Singular Value Regularization for Highly Compressible State Space Models." Advances in Neural Information Processing Systems, 2025.

Markdown

[Schwerdtner et al. "Hankel Singular Value Regularization for Highly Compressible State Space Models." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/schwerdtner2025neurips-hankel/)

BibTeX

@inproceedings{schwerdtner2025neurips-hankel,
  title     = {{Hankel Singular Value Regularization for Highly Compressible State Space Models}},
  author    = {Schwerdtner, Paul and Berman, Jules and Peherstorfer, Benjamin},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/schwerdtner2025neurips-hankel/}
}