Gesture Recognition in Ego-Centric Videos Using Dense Trajectories and Hand Segmentation
Abstract
We present a novel method for monocular hand gesture recognition in ego-vision scenarios that handles both static and dynamic gestures and achieves high accuracy using only a few positive samples. Specifically, we use and extend the dense trajectories approach that has been successfully introduced for action recognition. Dense features are extracted around regions selected by a new hand segmentation technique that integrates superpixel classification with temporal and spatial coherence. We extensively test our gesture recognition and segmentation algorithms on public datasets and propose a new dataset shot with a wearable camera. In addition, we demonstrate that our solution can work in near real-time on a wearable device.
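The core idea of restricting dense feature extraction to hand regions can be sketched in a few lines. Below is a minimal, hypothetical illustration (not the authors' implementation): points are sampled on a regular dense grid, and only those falling inside a binary hand-segmentation mask are kept as trajectory seeds. The function name, grid spacing, and toy mask are all assumptions for illustration.

```python
import numpy as np

def dense_sample_in_mask(mask, step=8):
    """Sample points on a regular grid, keeping only those inside the mask.

    mask: 2D boolean array (a hypothetical hand-segmentation output).
    step: grid spacing in pixels (dense-trajectory-style sampling).
    Returns an (N, 2) array of (row, col) coordinates to track.
    """
    rows = np.arange(step // 2, mask.shape[0], step)
    cols = np.arange(step // 2, mask.shape[1], step)
    rr, cc = np.meshgrid(rows, cols, indexing="ij")
    pts = np.stack([rr.ravel(), cc.ravel()], axis=1)
    keep = mask[pts[:, 0], pts[:, 1]]  # discard points outside the hand region
    return pts[keep]

# Toy example: a 64x64 frame whose "hand" is a 32x32 square region.
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True
seeds = dense_sample_in_mask(mask, step=8)
```

In the full pipeline described by the paper, such seed points would then be tracked across frames to build trajectories, with descriptors computed along each one; this sketch only shows the masking step.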
Cite
Text
Baraldi et al. "Gesture Recognition in Ego-Centric Videos Using Dense Trajectories and Hand Segmentation." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2014. doi:10.1109/CVPRW.2014.107
Markdown
[Baraldi et al. "Gesture Recognition in Ego-Centric Videos Using Dense Trajectories and Hand Segmentation." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2014.](https://mlanthology.org/cvprw/2014/baraldi2014cvprw-gesture/) doi:10.1109/CVPRW.2014.107
BibTeX
@inproceedings{baraldi2014cvprw-gesture,
title = {{Gesture Recognition in Ego-Centric Videos Using Dense Trajectories and Hand Segmentation}},
author = {Baraldi, Lorenzo and Paci, Francesco and Serra, Giuseppe and Benini, Luca and Cucchiara, Rita},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2014},
pages = {702-707},
doi = {10.1109/CVPRW.2014.107},
url = {https://mlanthology.org/cvprw/2014/baraldi2014cvprw-gesture/}
}