MoST: Efficient Monarch Sparse Tuning for 3D Representation Learning

Abstract

We introduce Monarch Sparse Tuning (MoST), the first reparameterization-based parameter-efficient fine-tuning (PEFT) method tailored for 3D representation learning. Unlike existing adapter-based and prompt-tuning 3D PEFT methods, MoST introduces no additional inference overhead and is compatible with many 3D representation learning backbones. At its core, we present a new family of structured matrices for 3D point clouds, Point Monarch, which can capture local geometric features of irregular points while offering high expressiveness. MoST reparameterizes the dense update weight matrices as our sparse Point Monarch matrices, significantly reducing parameters while retaining strong performance. Experiments on various backbones show that MoST is simple, effective, and highly generalizable. It captures local features in point clouds, achieving state-of-the-art results on multiple benchmarks, e.g., 97.5% accuracy on ScanObjectNN (PB_50_RS) and 96.2% on ModelNet40 classification, and it can also be combined with other matrix decompositions (e.g., low-rank, Kronecker) to further reduce parameters.
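To make the parameter savings concrete, here is a minimal NumPy sketch of a generic Monarch-style structured matrix-vector product. This illustrates only the standard Monarch factorization (two block-diagonal factors interleaved with a fixed permutation), not the paper's Point Monarch variant, whose exact construction for irregular points is not specified in the abstract; all function and variable names are illustrative.

```python
import numpy as np

def monarch_matvec(L_blocks, R_blocks, x):
    """Compute (P @ L @ P @ R) @ x without materializing the dense matrix.

    Generic Monarch sketch (not the paper's Point Monarch): over n = b*b
    dimensions, L and R are block-diagonal with b blocks of size b x b,
    and P is the fixed permutation that transposes the flattened (b, b)
    grid. Storage is 2*b^3 = 2*n^1.5 parameters instead of n^2 dense.
    """
    b = L_blocks.shape[0]
    z = np.einsum('kij,kj->ki', R_blocks, x.reshape(b, b))  # block-diagonal R
    z = z.T                                                 # permutation P
    y = np.einsum('kij,kj->ki', L_blocks, z)                # block-diagonal L
    return y.T.reshape(-1)                                  # permutation P again

# Sanity check against an explicit dense materialization.
rng = np.random.default_rng(0)
b = 4
n = b * b
L_blocks = rng.standard_normal((b, b, b))
R_blocks = rng.standard_normal((b, b, b))
x = rng.standard_normal(n)

def block_diag(blocks):
    out = np.zeros((n, n))
    for i, blk in enumerate(blocks):
        out[i * b:(i + 1) * b, i * b:(i + 1) * b] = blk
    return out

perm = np.arange(n).reshape(b, b).T.ravel()   # flattened-transpose permutation
P = np.eye(n)[perm]
M_dense = P @ block_diag(L_blocks) @ P @ block_diag(R_blocks)
assert np.allclose(monarch_matvec(L_blocks, R_blocks, x), M_dense @ x)
```

In a PEFT setting, a structured factorization like this would replace the dense weight update: only the small block factors are trained, and at inference they can be folded back into the frozen weights, which is why reparameterization methods add no inference overhead.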

Cite

Text

Han et al. "MoST: Efficient Monarch Sparse Tuning for 3D Representation Learning." Conference on Computer Vision and Pattern Recognition, 2025. doi:10.1109/CVPR52734.2025.00617

Markdown

[Han et al. "MoST: Efficient Monarch Sparse Tuning for 3D Representation Learning." Conference on Computer Vision and Pattern Recognition, 2025.](https://mlanthology.org/cvpr/2025/han2025cvpr-most/) doi:10.1109/CVPR52734.2025.00617

BibTeX

@inproceedings{han2025cvpr-most,
  title     = {{MoST: Efficient Monarch Sparse Tuning for 3D Representation Learning}},
  author    = {Han, Xu and Tang, Yuan and Xu, Jinfeng and Li, Xianzhi},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2025},
  pages     = {6584--6594},
  doi       = {10.1109/CVPR52734.2025.00617},
  url       = {https://mlanthology.org/cvpr/2025/han2025cvpr-most/}
}