Visual Map Matching and Localization Using a Global Feature Map
Abstract
This paper presents a novel method to support the environmental perception of mobile robots through the use of a global feature map. While typical approaches to simultaneous localization and mapping (SLAM) rely mainly on an on-board camera for mapping, our approach uses geographically referenced aerial or satellite images to build a map in advance. The current position on the map is determined by matching features from the on-board camera to the global feature map. The problem of feature matching is posed as a standard point pattern matching problem, and a solution using the iterative closest point (ICP) method is given.
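As a rough illustration of the matching step the abstract describes — posing feature matching as a point pattern matching problem and solving it with ICP — the sketch below aligns observed 2D feature points to a global map using nearest-neighbor correspondences and a least-squares rigid-transform update (Kabsch/SVD). This is a generic point-to-point ICP, not the paper's implementation; all function names and parameters are illustrative.

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Align 2D source points (camera features) to destination points
    (global feature map) with point-to-point ICP.

    Returns the accumulated rotation R (2x2), translation t (2,), and
    the aligned source points. Illustrative sketch, not the paper's code.
    """
    R, t = np.eye(2), np.zeros(2)
    cur = src.copy()
    for _ in range(iters):
        # Nearest-neighbor correspondences (brute-force distance matrix).
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        # Best-fit rigid transform for the current matches via SVD (Kabsch).
        mu_s, mu_d = cur.mean(0), matched.mean(0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        # Sign correction keeps the result a proper rotation (det = +1).
        S = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ S @ U.T
        t_step = mu_d - R_step @ mu_s
        # Apply the incremental transform and accumulate it.
        cur = cur @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t, cur
```

In practice the nearest-neighbor search would use a spatial index (e.g. a k-d tree) rather than a full distance matrix, and outlier rejection would be needed for features with no map counterpart; both are omitted here for brevity.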
Cite
Text
Pink. "Visual Map Matching and Localization Using a Global Feature Map." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2008. doi:10.1109/CVPRW.2008.4563135
Markdown
[Pink. "Visual Map Matching and Localization Using a Global Feature Map." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2008.](https://mlanthology.org/cvprw/2008/pink2008cvprw-visual/) doi:10.1109/CVPRW.2008.4563135
BibTeX
@inproceedings{pink2008cvprw-visual,
  title = {{Visual Map Matching and Localization Using a Global Feature Map}},
author = {Pink, Oliver},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2008},
pages = {1-7},
doi = {10.1109/CVPRW.2008.4563135},
url = {https://mlanthology.org/cvprw/2008/pink2008cvprw-visual/}
}