WildGS-SLAM: Monocular Gaussian Splatting SLAM in Dynamic Environments

Abstract

We present WildGS-SLAM, a robust and efficient monocular RGB SLAM system designed to handle dynamic environments by leveraging uncertainty-aware geometric mapping. Unlike traditional SLAM systems, which assume static scenes, our approach integrates depth and uncertainty information to enhance tracking, mapping, and rendering performance in the presence of moving objects. We introduce an uncertainty map, predicted by a shallow multi-layer perceptron and DINOv2 features, to guide dynamic object removal during both tracking and mapping. This uncertainty map enhances dense bundle adjustment and Gaussian map optimization, improving reconstruction accuracy. Our system is evaluated on multiple datasets and demonstrates artifact-free view synthesis. Results showcase WildGS-SLAM's superior performance in dynamic environments compared to state-of-the-art methods.
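
The sketch below illustrates the core idea described in the abstract: a shallow MLP maps per-pixel DINOv2 features to a scalar uncertainty, which then down-weights residuals during tracking and mapping so dynamic pixels contribute less. It is a minimal illustration only; the feature dimension (384, as for DINOv2 ViT-S/14), layer sizes, and the aleatoric-style weighting are assumptions and not the authors' exact implementation.

# Minimal sketch (assumptions noted above), not the WildGS-SLAM implementation.
import torch
import torch.nn as nn

class UncertaintyMLP(nn.Module):
    """Shallow MLP: per-pixel DINOv2 features -> scalar uncertainty."""
    def __init__(self, feat_dim: int = 384, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, 1),
            nn.Softplus(),  # keep the predicted uncertainty positive
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (N, feat_dim) per-pixel features -> (N,) uncertainties
        return self.net(feats).squeeze(-1)

def uncertainty_weighted_loss(residuals: torch.Tensor,
                              uncertainty: torch.Tensor,
                              eps: float = 1e-6) -> torch.Tensor:
    # Aleatoric-style weighting: high-uncertainty (likely dynamic) pixels are
    # down-weighted; the log term discourages predicting large uncertainty everywhere.
    beta = uncertainty + eps
    return (residuals ** 2 / (2 * beta ** 2) + torch.log(beta)).mean()

if __name__ == "__main__":
    mlp = UncertaintyMLP()
    feats = torch.randn(1024, 384)    # hypothetical per-pixel DINOv2 features
    residuals = torch.randn(1024)     # hypothetical photometric / reprojection residuals
    loss = uncertainty_weighted_loss(residuals, mlp(feats))
    print(loss.item())

In the paper, the same uncertainty weighting is applied to both the dense bundle adjustment residuals during tracking and the rendering loss used to optimize the Gaussian map; the snippet only shows the generic weighting pattern.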

Cite

Text

Zheng et al. "WildGS-SLAM: Monocular Gaussian Splatting SLAM in Dynamic Environments." Conference on Computer Vision and Pattern Recognition, 2025. doi:10.1109/CVPR52734.2025.01070

Markdown

[Zheng et al. "WildGS-SLAM: Monocular Gaussian Splatting SLAM in Dynamic Environments." Conference on Computer Vision and Pattern Recognition, 2025.](https://mlanthology.org/cvpr/2025/zheng2025cvpr-wildgsslam/) doi:10.1109/CVPR52734.2025.01070

BibTeX

@inproceedings{zheng2025cvpr-wildgsslam,
  title     = {{WildGS-SLAM: Monocular Gaussian Splatting SLAM in Dynamic Environments}},
  author    = {Zheng, Jianhao and Zhu, Zihan and Bieri, Valentin and Pollefeys, Marc and Peng, Songyou and Armeni, Iro},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2025},
  pages     = {11461--11471},
  doi       = {10.1109/CVPR52734.2025.01070},
  url       = {https://mlanthology.org/cvpr/2025/zheng2025cvpr-wildgsslam/}
}