Loopy-SLAM: Dense Neural SLAM with Loop Closures

Abstract

Neural RGBD SLAM techniques have shown promise in dense Simultaneous Localization And Mapping (SLAM), yet face challenges such as error accumulation during camera tracking, resulting in distorted maps. In response, we introduce Loopy-SLAM, which globally optimizes poses and the dense 3D model. We use frame-to-model tracking with a data-driven point-based submap generation method and trigger loop closures online by performing global place recognition. Robust pose graph optimization is used to rigidly align the local submaps. As our representation is point based, map corrections can be performed efficiently without the need to store the entire history of input frames used for mapping, as typically required by methods employing a grid-based mapping structure. Evaluation on the synthetic Replica and real-world TUM-RGBD and ScanNet datasets demonstrates competitive or superior performance in tracking, mapping, and rendering accuracy when compared to existing dense neural RGBD SLAM methods. Project page: notchla.github.io/Loopy-SLAM.
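
The abstract rests on two ideas: loop closures are enforced by a robust pose graph optimization over submap poses, and a point-based map lets each submap be corrected by a single rigid transform instead of re-integrating past frames. The following is a minimal sketch of those two steps, not the authors' implementation: it uses 2D (SE(2)) poses, a toy graph with one loop closure edge, SciPy's least_squares with a Huber loss as the robust solver, and illustrative names and values throughout.

# Minimal sketch (assumed setup, 2D for brevity): robust pose graph optimization
# over submap poses, followed by a rigid correction of a point-based submap.
import numpy as np
from scipy.optimize import least_squares

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def relative_pose(pi, pj):
    """Pose of j expressed in the frame of i; each pose is (x, y, theta)."""
    c, s = np.cos(pi[2]), np.sin(pi[2])
    R_i_T = np.array([[c, s], [-s, c]])
    t = R_i_T @ (pj[:2] - pi[:2])
    return np.array([t[0], t[1], wrap(pj[2] - pi[2])])

def residuals(x, edges):
    """Relative-pose errors for all odometry/loop edges, plus a prior on pose 0."""
    poses = x.reshape(-1, 3)
    res = [poses[0]]  # anchor the first submap at the origin (fixes gauge freedom)
    for i, j, meas in edges:
        err = relative_pose(poses[i], poses[j]) - meas
        err[2] = wrap(err[2])
        res.append(err)
    return np.concatenate(res)

# Toy graph: 4 submap poses forming a square, with drifting initial estimates
# and one loop closure edge (3 -> 0) as produced by place recognition.
edges = [
    (0, 1, np.array([1.0, 0.0, np.pi / 2])),
    (1, 2, np.array([1.0, 0.0, np.pi / 2])),
    (2, 3, np.array([1.0, 0.0, np.pi / 2])),
    (3, 0, np.array([1.0, 0.0, np.pi / 2])),  # loop closure constraint
]
x0 = np.array([[0, 0, 0], [1, 0, 1.5], [1.1, 1.2, 3.0], [0.2, 1.1, 4.4]], float).ravel()

sol = least_squares(residuals, x0, args=(edges,), loss="huber")  # robust kernel
poses = sol.x.reshape(-1, 3)

# Because the map is point based, correcting a submap is a single rigid
# transform of its points; no stored input frames need to be re-fused.
submap_points = np.random.rand(100, 2)  # points in the local frame of submap 3
c, s = np.cos(poses[3, 2]), np.sin(poses[3, 2])
R = np.array([[c, -s], [s, c]])
world_points = submap_points @ R.T + poses[3, :2]

In the actual system the poses live in SE(3) and the submaps are neural point clouds, but the structure is the same: solve for the submap poses under odometry and loop closure constraints with a robust loss, then rigidly re-anchor each submap's points with its optimized pose.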

Cite

Text

Liso et al. "Loopy-SLAM: Dense Neural SLAM with Loop Closures." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.01925

Markdown

[Liso et al. "Loopy-SLAM: Dense Neural SLAM with Loop Closures." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/liso2024cvpr-loopyslam/) doi:10.1109/CVPR52733.2024.01925

BibTeX

@inproceedings{liso2024cvpr-loopyslam,
  title     = {{Loopy-SLAM: Dense Neural SLAM with Loop Closures}},
  author    = {Liso, Lorenzo and Sandström, Erik and Yugay, Vladimir and Van Gool, Luc and Oswald, Martin R.},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2024},
  pages     = {20363-20373},
  doi       = {10.1109/CVPR52733.2024.01925},
  url       = {https://mlanthology.org/cvpr/2024/liso2024cvpr-loopyslam/}
}