Interactive Markerless Articulated Hand Motion Tracking Using RGB and Depth Data

Abstract

Tracking the articulated 3D motion of the hand has important applications, for example, in human-computer interaction and teleoperation. We present a novel method that can capture a broad range of articulated hand motions at interactive rates. Our hybrid approach combines, in a voting scheme, a discriminative, part-based pose retrieval method with a generative pose estimation method based on local optimization. Color information from a multiview RGB camera setup, along with a person-specific hand model, is used by the generative method to find the pose that best explains the observed images. In parallel, our discriminative pose estimation method uses fingertips detected in depth data to estimate a complete or partial pose of the hand by adopting a part-based pose retrieval strategy. This part-based strategy drastically reduces the search space in comparison to a global pose retrieval strategy. Quantitative results show that our method achieves state-of-the-art accuracy on challenging sequences and near-real-time performance of 10 frames per second on a desktop computer.
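To make the hybrid idea concrete, here is a minimal, hypothetical sketch of voting between the two pose hypotheses by a fitness score. The paper's actual voting scheme and image-based objective are more involved; all names and the simple fingertip-distance error below are illustrative assumptions, not the authors' implementation.

```python
import math

def pose_error(candidate_tips, observed_tips):
    """Sum of Euclidean distances between the fingertip positions
    predicted by a pose candidate and the fingertips observed in the
    depth data (a stand-in for the paper's image-based objective)."""
    return sum(math.dist(c, o) for c, o in zip(candidate_tips, observed_tips))

def vote(generative_tips, discriminative_tips, observed_tips):
    """Pick whichever hypothesis (generative local optimization vs.
    discriminative part-based retrieval) better explains the observation."""
    candidates = {
        "generative": generative_tips,
        "discriminative": discriminative_tips,
    }
    return min(candidates, key=lambda k: pose_error(candidates[k], observed_tips))

# Illustrative 3D fingertip positions (two fingertips shown for brevity).
observed = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
gen_hyp  = [(0.1, 0.0, 0.0), (1.1, 0.0, 0.0)]   # close to observation
disc_hyp = [(0.5, 0.0, 0.0), (1.5, 0.0, 0.0)]   # farther away

winner = vote(gen_hyp, disc_hyp, observed)       # "generative" wins here
```

In the full system, such a per-frame selection would run at each time step, letting the discriminative branch re-initialize the tracker when local optimization drifts.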

Cite

Text

Sridhar et al. "Interactive Markerless Articulated Hand Motion Tracking Using RGB and Depth Data." International Conference on Computer Vision, 2013. doi:10.1109/ICCV.2013.305

Markdown

[Sridhar et al. "Interactive Markerless Articulated Hand Motion Tracking Using RGB and Depth Data." International Conference on Computer Vision, 2013.](https://mlanthology.org/iccv/2013/sridhar2013iccv-interactive/) doi:10.1109/ICCV.2013.305

BibTeX

@inproceedings{sridhar2013iccv-interactive,
  title     = {{Interactive Markerless Articulated Hand Motion Tracking Using RGB and Depth Data}},
  author    = {Sridhar, Srinath and Oulasvirta, Antti and Theobalt, Christian},
  booktitle = {International Conference on Computer Vision},
  year      = {2013},
  doi       = {10.1109/ICCV.2013.305},
  url       = {https://mlanthology.org/iccv/2013/sridhar2013iccv-interactive/}
}