Fine Grained Pointing Recognition for Natural Drone Guidance

Abstract

Human action recognition systems are typically focused on identifying different actions rather than fine-grained variations of the same action. This work explores strategies to identify different pointing directions in order to build a natural interaction system for guiding autonomous systems such as drones. Commanding a drone with hand-held panels or tablets is common practice, but intuitive user-drone interfaces could bring significant benefits. The system proposed in this work requires the user only to provide occasional high-level navigation commands by pointing the drone towards the desired motion direction. Due to the lack of data for these settings, we present a new benchmarking video dataset to validate our framework and facilitate future research in the area. Our results show good accuracy for pointing direction recognition, while running at interactive rates and exhibiting robustness to variability in user appearance, viewpoint, camera distance, and scenery.

Cite

Text

Barbed et al. "Fine Grained Pointing Recognition for Natural Drone Guidance." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020. doi:10.1109/CVPRW50498.2020.00528

Markdown

[Barbed et al. "Fine Grained Pointing Recognition for Natural Drone Guidance." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020.](https://mlanthology.org/cvprw/2020/barbed2020cvprw-fine/) doi:10.1109/CVPRW50498.2020.00528

BibTeX

@inproceedings{barbed2020cvprw-fine,
  title     = {{Fine Grained Pointing Recognition for Natural Drone Guidance}},
  author    = {Barbed, O. L. and Azagra, Pablo and Teixeira, Lucas and Chli, Margarita and Civera, Javier and Murillo, Ana Cristina},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2020},
  pages     = {4480--4488},
  doi       = {10.1109/CVPRW50498.2020.00528},
  url       = {https://mlanthology.org/cvprw/2020/barbed2020cvprw-fine/}
}