An Embedded Solution to Visual Mapping for Consumer Drones

Abstract

In this paper, we propose a real-time visual mapping scheme which can be implemented on a low-cost embedded system for consumer-level radio control (RC) drones. In our work, a 3-dimensional occupancy grid map is obtained based on an estimated trajectory from data fusion of multiple on-board sensors, composed of two downward-facing cameras, two forward-facing cameras, a GPS receiver, a magnetic compass and an inertial measurement unit (IMU) with 3-axis accelerometers and gyroscopes. Taking advantage of a low-cost FPGA and ARM NEON intrinsics, we run our visual odometry and mapping algorithms at 10 Hz on board. Meanwhile, we also present a hierarchical multi-sensor fusion algorithm to provide a robust trajectory for mapping usage. Finally, we verify the feasibility of our approaches and several potential applications with experimental results in complex indoor/outdoor environments.
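The core data structure described above, a 3-D occupancy grid updated along an estimated trajectory, can be sketched as follows. This is a minimal log-odds formulation under assumed parameters (grid size, resolution, and update increments are illustrative; the paper's actual values and update rule are not given in the abstract, and free-space ray tracing is omitted):

```python
import numpy as np

class OccupancyGrid3D:
    """Minimal 3-D occupancy grid with log-odds updates (illustrative sketch)."""

    def __init__(self, size=(64, 64, 32), resolution=0.2):
        self.resolution = resolution          # metres per voxel (assumed value)
        self.log_odds = np.zeros(size)        # 0.0 log-odds == p(occupied) = 0.5
        self.l_occ = 0.85                     # increment for an observed hit (assumed)

    def world_to_voxel(self, p):
        # Map a world-frame point (metres) to integer voxel indices.
        return tuple((np.asarray(p) / self.resolution).astype(int))

    def update(self, hit_point):
        # Mark the measured endpoint as more likely occupied; a full version
        # would also decrement voxels along the ray from the sensor origin.
        self.log_odds[self.world_to_voxel(hit_point)] += self.l_occ

    def occupancy(self, p):
        # Convert log-odds back to a probability in [0, 1].
        l = self.log_odds[self.world_to_voxel(p)]
        return 1.0 - 1.0 / (1.0 + np.exp(l))

grid = OccupancyGrid3D()
grid.update(hit_point=(1.0, 2.0, 1.0))        # one depth measurement in world frame
print(grid.occupancy((1.0, 2.0, 1.0)))        # above 0.5 after the hit
print(grid.occupancy((3.0, 3.0, 3.0)))        # untouched voxel stays at 0.5
```

Each pose from the fused trajectory would transform per-frame depth measurements into the world frame before calling `update`, which is why trajectory robustness directly affects map quality.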

Cite

Text

Zhou et al. "An Embedded Solution to Visual Mapping for Consumer Drones." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2014. doi:10.1109/CVPRW.2014.102

Markdown

[Zhou et al. "An Embedded Solution to Visual Mapping for Consumer Drones." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2014.](https://mlanthology.org/cvprw/2014/zhou2014cvprw-embedded/) doi:10.1109/CVPRW.2014.102

BibTeX

@inproceedings{zhou2014cvprw-embedded,
  title     = {{An Embedded Solution to Visual Mapping for Consumer Drones}},
  author    = {Zhou, Guyue and Liu, Ang and Yang, Kang and Wang, Tao and Li, Zexiang},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2014},
  pages     = {670--675},
  doi       = {10.1109/CVPRW.2014.102},
  url       = {https://mlanthology.org/cvprw/2014/zhou2014cvprw-embedded/}
}