Body-Relative Navigation Guidance Using Uncalibrated Cameras

Abstract

We present a vision-based method that assists human navigation within unfamiliar environments. Our main contribution is a novel algorithm that learns the correlation between user egomotion and feature matches on a wearable set of uncalibrated cameras. The primary advantage of this method is that it provides robust guidance cues in the user's body frame, and is tolerant to small changes in the camera configuration. We couple this method with a topological mapping algorithm that provides global localization within the traversed environment. We validate our approach with ground-truth experiments and demonstrate the method on several real-world datasets spanning two kilometers of indoor and outdoor walking excursions.

Cite

Text

Koch and Teller. "Body-Relative Navigation Guidance Using Uncalibrated Cameras." IEEE/CVF International Conference on Computer Vision, 2009. doi:10.1109/ICCV.2009.5459327

Markdown

[Koch and Teller. "Body-Relative Navigation Guidance Using Uncalibrated Cameras." IEEE/CVF International Conference on Computer Vision, 2009.](https://mlanthology.org/iccv/2009/koch2009iccv-body/) doi:10.1109/ICCV.2009.5459327

BibTeX

@inproceedings{koch2009iccv-body,
  title     = {{Body-Relative Navigation Guidance Using Uncalibrated Cameras}},
  author    = {Koch, Olivier and Teller, Seth J.},
  booktitle = {IEEE/CVF International Conference on Computer Vision},
  year      = {2009},
  pages     = {1242--1249},
  doi       = {10.1109/ICCV.2009.5459327},
  url       = {https://mlanthology.org/iccv/2009/koch2009iccv-body/}
}