MambaVision: A Hybrid Mamba-Transformer Vision Backbone

Abstract

We propose a novel hybrid Mamba-Transformer backbone, MambaVision, specifically tailored for vision applications. Our core contribution is a redesign of the Mamba formulation that enhances its capability for efficient modeling of visual features. Through a comprehensive ablation study, we demonstrate the feasibility of integrating Vision Transformers (ViT) with Mamba. Our results show that equipping the Mamba architecture with self-attention blocks in the final layers greatly improves its capacity to capture long-range spatial dependencies. Based on these findings, we introduce a family of MambaVision models with a hierarchical architecture to meet various design criteria. For classification on the ImageNet-1K dataset, MambaVision variants achieve state-of-the-art (SOTA) performance in terms of both Top-1 accuracy and throughput. In downstream tasks such as object detection, instance segmentation, and semantic segmentation on the MS COCO and ADE20K datasets, MambaVision outperforms comparably sized backbones while demonstrating more favorable performance. Code: https://github.com/NVlabs/MambaVision
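The abstract's key architectural idea, placing self-attention blocks in the final layers of each Mamba stage, can be sketched schematically. The function below is an illustrative assumption, not the paper's implementation: the name `hybrid_stage_layout` and the default half-and-half split are hypothetical; the paper's exact mixing ratios come from its ablation study.

```python
def hybrid_stage_layout(depth, attn_fraction=0.5):
    """Sketch of a hybrid stage: Mamba-style mixer blocks first,
    self-attention blocks in the final layers (illustrative only).

    depth         -- total number of blocks in the stage
    attn_fraction -- fraction of trailing blocks that use self-attention
                     (0.5 is an assumed default, not a prescribed value)
    """
    n_attn = max(1, round(depth * attn_fraction))
    n_mamba = depth - n_attn
    # Mamba mixer blocks handle early layers; attention closes the stage
    # to capture long-range spatial dependencies, per the paper's finding.
    return ["mamba"] * n_mamba + ["attention"] * n_attn


# Example: an 8-block stage yields 4 Mamba blocks followed by 4 attention blocks
print(hybrid_stage_layout(8))
```

Under this sketch, shallower stages could set `attn_fraction=0` to stay purely Mamba-based, while deeper stages mix in attention at the end, mirroring the hierarchical design the abstract describes.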

Cite

Text

Hatamizadeh and Kautz. "MambaVision: A Hybrid Mamba-Transformer Vision Backbone." Conference on Computer Vision and Pattern Recognition, 2025. doi:10.1109/CVPR52734.2025.02352

BibTeX

@inproceedings{hatamizadeh2025cvpr-mambavision,
  title     = {{MambaVision: A Hybrid Mamba-Transformer Vision Backbone}},
  author    = {Hatamizadeh, Ali and Kautz, Jan},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2025},
  pages     = {25261-25270},
  doi       = {10.1109/CVPR52734.2025.02352},
  url       = {https://mlanthology.org/cvpr/2025/hatamizadeh2025cvpr-mambavision/}
}