Real-Time Visual SLAM with Resilience to Erratic Motion

Abstract

Simultaneous localisation and mapping using a single camera becomes difficult when erratic motions violate predictive motion models. This problem needs to be addressed when visual SLAM algorithms are transferred from robots or mobile vehicles onto hand-held or wearable devices. In this paper we describe a novel SLAM extension to a camera localisation algorithm based on particle filtering which provides resilience to erratic motion. The mapping component is based on auxiliary unscented Kalman filters coupled to the main particle filter via measurement covariances. This coupling allows the system to survive unpredictable motions such as camera shake, and enables a return to full SLAM operation once normal motion resumes. We present results demonstrating the effectiveness of the approach when operating within a desktop environment.
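The abstract describes a main particle filter for camera pose whose motion model makes no strong predictions, so sudden, erratic motion does not break the tracker. As a rough illustration of that idea, here is a minimal, hypothetical sketch of a diffusion-only particle filter in one dimension; the real system tracks full 6-DoF camera pose and couples in auxiliary unscented Kalman filters for mapping, none of which is reproduced here, and all names and parameters below are illustrative rather than from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_particle_filter(measurements, n_particles=500, noise_std=0.5, meas_std=1.0):
    """Toy 1-D analogue of a pose particle filter (illustrative only)."""
    particles = rng.normal(0.0, 1.0, n_particles)  # initial pose hypotheses
    estimates = []
    for z in measurements:
        # Diffusion-only motion model: pure random walk, no velocity
        # prediction, so abrupt 'erratic' motion does not violate the model.
        particles = particles + rng.normal(0.0, noise_std, n_particles)
        # Weight each hypothesis by a Gaussian measurement likelihood.
        w = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
        w /= w.sum()
        estimates.append(float(np.dot(w, particles)))  # weighted-mean estimate
        # Resample to concentrate particles on likely poses.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return estimates
```

Because the motion model is purely diffusive, the filter relies on the measurement weighting and resampling to pull the particle cloud back onto the true pose, which is the property the abstract credits with surviving camera shake.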

Cite

Text

Pupilli and Calway. "Real-Time Visual SLAM with Resilience to Erratic Motion." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2006. doi:10.1109/CVPR.2006.240

Markdown

[Pupilli and Calway. "Real-Time Visual SLAM with Resilience to Erratic Motion." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2006.](https://mlanthology.org/cvpr/2006/pupilli2006cvpr-real/) doi:10.1109/CVPR.2006.240

BibTeX

@inproceedings{pupilli2006cvpr-real,
  title     = {{Real-Time Visual SLAM with Resilience to Erratic Motion}},
  author    = {Pupilli, Mark and Calway, Andrew},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2006},
  pages     = {1244--1249},
  doi       = {10.1109/CVPR.2006.240},
  url       = {https://mlanthology.org/cvpr/2006/pupilli2006cvpr-real/}
}