PointMamba: A Simple State Space Model for Point Cloud Analysis

Abstract

Transformers have become a foundational architecture for point cloud analysis thanks to their excellent global modeling ability. However, the attention mechanism has quadratic complexity in sequence length, which makes a linear-complexity method with global modeling appealing. In this paper, we propose PointMamba, transferring the success of Mamba, a recent representative state space model (SSM), from NLP to point cloud analysis. Unlike traditional Transformers, PointMamba employs a linear-complexity algorithm, providing global modeling capacity while significantly reducing computational cost. Specifically, our method leverages space-filling curves for effective point tokenization and adopts an extremely simple, non-hierarchical Mamba encoder as its backbone. Comprehensive evaluations demonstrate that PointMamba achieves superior performance across multiple datasets while significantly reducing GPU memory usage and FLOPs. This work underscores the potential of SSMs in 3D vision tasks and presents a simple yet effective Mamba-based baseline for future research. The code is available at https://github.com/LMD0311/PointMamba.
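
The core recipe in the abstract is to serialize an unordered point cloud along a space-filling curve and then run a plain, linear-time SSM encoder over the resulting 1D sequence. The sketch below is not the authors' implementation (see the linked repository for that); it illustrates only the serialization step, using a z-order (Morton) curve as one example of a space-filling curve, and the `morton_order` helper is a hypothetical name introduced here for illustration.

```python
# Minimal sketch (assumption: z-order/Morton curve as the space-filling curve;
# the paper's own serialization may use a different curve).
import numpy as np

def _part1by2(v: np.ndarray) -> np.ndarray:
    """Spread the low 10 bits of each integer so two zero bits separate them."""
    v = v.astype(np.int64) & 0x3FF
    v = (v | (v << 16)) & 0xFF0000FF
    v = (v | (v << 8))  & 0x0300F00F
    v = (v | (v << 4))  & 0x030C30C3
    v = (v | (v << 2))  & 0x09249249
    return v

def morton_order(points: np.ndarray, bits: int = 10) -> np.ndarray:
    """Return indices that sort (N, 3) points along a 3D z-order curve."""
    # Normalize each axis to [0, 1] and quantize to `bits` bits.
    mins, maxs = points.min(0), points.max(0)
    grid = ((points - mins) / np.maximum(maxs - mins, 1e-9)
            * (2**bits - 1)).astype(np.int64)
    # Interleave the x/y/z bits to get the Morton code of each point.
    code = (_part1by2(grid[:, 0])
            | (_part1by2(grid[:, 1]) << 1)
            | (_part1by2(grid[:, 2]) << 2))
    return np.argsort(code)

# Usage: order point-patch tokens so spatially nearby tokens stay adjacent
# in the 1D sequence fed to a Mamba-style encoder.
tokens = np.random.rand(1024, 3)      # e.g. centers of point patches
serialized = tokens[morton_order(tokens)]
```

Because the SSM scans the sequence causally in linear time, the ordering step matters: a space-filling curve keeps spatial neighbors close together in the serialized sequence, which is what lets a simple non-hierarchical encoder still capture local structure.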

Cite

Text

Liang et al. "PointMamba: A Simple State Space Model for Point Cloud Analysis." Neural Information Processing Systems, 2024. doi:10.52202/079017-1026

Markdown

[Liang et al. "PointMamba: A Simple State Space Model for Point Cloud Analysis." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/liang2024neurips-pointmamba/) doi:10.52202/079017-1026

BibTeX

@inproceedings{liang2024neurips-pointmamba,
  title     = {{PointMamba: A Simple State Space Model for Point Cloud Analysis}},
  author    = {Liang, Dingkang and Zhou, Xin and Xu, Wei and Zhu, Xingkui and Zou, Zhikang and Ye, Xiaoqing and Tan, Xiao and Bai, Xiang},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-1026},
  url       = {https://mlanthology.org/neurips/2024/liang2024neurips-pointmamba/}
}