Realtime and Robust Hand Tracking from Depth
Abstract
We present a realtime hand tracking system using a depth sensor. It tracks a fully articulated hand under large viewpoint variations in realtime (25 FPS on a desktop without using a GPU) and with high accuracy (error below 10 mm). To our knowledge, it is the first system that achieves such robustness, accuracy, and speed simultaneously, as verified on challenging real data. Our system comprises several novel techniques. We model the hand simply as a set of spheres and define a fast cost function; both are critical for realtime performance. We propose a hybrid method that combines gradient-based and stochastic optimization to achieve fast convergence and good accuracy. We also present new finger detection and hand initialization methods that greatly enhance tracking robustness.
Cite
Text

Qian et al. "Realtime and Robust Hand Tracking from Depth." Conference on Computer Vision and Pattern Recognition, 2014. doi:10.1109/CVPR.2014.145

Markdown

[Qian et al. "Realtime and Robust Hand Tracking from Depth." Conference on Computer Vision and Pattern Recognition, 2014.](https://mlanthology.org/cvpr/2014/qian2014cvpr-realtime/) doi:10.1109/CVPR.2014.145

BibTeX
@inproceedings{qian2014cvpr-realtime,
title = {{Realtime and Robust Hand Tracking from Depth}},
author = {Qian, Chen and Sun, Xiao and Wei, Yichen and Tang, Xiaoou and Sun, Jian},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2014},
doi = {10.1109/CVPR.2014.145},
url = {https://mlanthology.org/cvpr/2014/qian2014cvpr-realtime/}
}