'Featuring' Optical Rails: View-Based Robot Guidance Using Orientation Features on the Sphere
Abstract
In this paper, we propose the extension of a view-based method for autonomous track following, formerly introduced as Optical Rails, to vector-valued feature images. Instead of gray or color values, the whole analysis works on local orientations, extracted from omnidirectional input images and represented as low-frequency vector-valued view descriptors using spherical harmonics. New signal-processing schemes on the sphere are presented for such view descriptors, allowing for efficient approximation, comparison, and differential motion estimation even for incomplete spherical signals. Track following is performed using visual information only; no auxiliary guidance systems or odometry are used. We present first results of track following with a mobile robot in an indoor environment, which demonstrate the feasibility of this novel approach.
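To make the descriptor idea concrete, the following is a minimal sketch (not the authors' implementation) of projecting a signal sampled on the sphere onto low-order spherical harmonics and comparing two resulting view descriptors. The grid resolution, the band limit `L_MAX`, and the toy signals are assumptions chosen purely for illustration; the paper works with vector-valued orientation features rather than the scalar signal used here.

```python
# Illustrative sketch: low-frequency spherical-harmonic view descriptors.
# Assumptions (not from the paper): scalar signal, L_MAX = 4, 64x32 grid.
import numpy as np
from scipy.special import sph_harm

L_MAX = 4  # band limit: keep only low-frequency coefficients

# Sample grid on the sphere (scipy convention: theta = azimuth, phi = polar)
n_theta, n_phi = 64, 32
theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
phi = (np.arange(n_phi) + 0.5) * np.pi / n_phi
T, P = np.meshgrid(theta, phi, indexing="ij")

def sh_descriptor(signal):
    """Project a sampled spherical signal onto Y_l^m for l <= L_MAX."""
    # Quadrature weights for the integral over the sphere: sin(phi) dtheta dphi
    w = np.sin(P) * (2 * np.pi / n_theta) * (np.pi / n_phi)
    coeffs = []
    for l in range(L_MAX + 1):
        for m in range(-l, l + 1):
            Y = sph_harm(m, l, T, P)
            coeffs.append(np.sum(signal * np.conj(Y) * w))
    return np.asarray(coeffs)

def descriptor_distance(c1, c2):
    """Euclidean distance between coefficient vectors as a view dissimilarity."""
    return np.linalg.norm(c1 - c2)

# Toy example: two low-frequency signals differing by a small rotation
f1 = np.cos(P) + 0.3 * np.sin(P) * np.cos(T)
f2 = np.cos(P) + 0.3 * np.sin(P) * np.cos(T + 0.1)
d1, d2 = sh_descriptor(f1), sh_descriptor(f2)
print("descriptor distance:", descriptor_distance(d1, d2))
```

Truncating the expansion at a low band limit is what makes the descriptors compact and robust to high-frequency image detail, so nearby views yield nearby coefficient vectors.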
Cite
Text
Dederscheck et al. "'Featuring' Optical Rails: View-Based Robot Guidance Using Orientation Features on the Sphere." IEEE/CVF International Conference on Computer Vision Workshops, 2009. doi:10.1109/ICCVW.2009.5457547
Markdown
[Dederscheck et al. "'Featuring' Optical Rails: View-Based Robot Guidance Using Orientation Features on the Sphere." IEEE/CVF International Conference on Computer Vision Workshops, 2009.](https://mlanthology.org/iccvw/2009/dederscheck2009iccvw-featuring/) doi:10.1109/ICCVW.2009.5457547
BibTeX
@inproceedings{dederscheck2009iccvw-featuring,
title = {{'Featuring' Optical Rails: View-Based Robot Guidance Using Orientation Features on the Sphere}},
author = {Dederscheck, David and Friedrich, Holger and Lenhart, Christine and Zahn, Martin and Mester, Rudolf},
booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
year = {2009},
pages = {2156--2163},
doi = {10.1109/ICCVW.2009.5457547},
url = {https://mlanthology.org/iccvw/2009/dederscheck2009iccvw-featuring/}
}