Real-Time Hand Tracking with Variable-Length Markov Models of Behaviour

Abstract

We present a novel approach for visual tracking of structured behaviour as observed in human-computer interaction. An automatically acquired variable-length Markov model is used to represent the high-level structure and temporal ordering of gestures. Continuous estimation of hand posture is handled by combining the model with annealed particle filtering. The stochastic simulation updates, and automatically switches between, different model representations of hand posture that correspond to distinct gestures. The implementation executes in real time and demonstrates significant improvement in robustness over comparable methods. We provide a measurement of user performance when our method is applied to a Fitts' law drag-and-drop task, and an analysis of the effects of the latency that it introduces.
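The core idea of a variable-length Markov model is to predict the next gesture from the longest context observed in training, backing off to shorter contexts when needed. A minimal sketch of that prediction scheme (a hypothetical illustration, not the authors' implementation; class and method names are invented):

```python
from collections import defaultdict

class VLMM:
    """Toy variable-length Markov model: counts next-symbol
    frequencies for every context up to max_depth, then predicts
    using the longest context seen during training."""

    def __init__(self, max_depth=3):
        self.max_depth = max_depth
        # context tuple -> {next symbol -> count}
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, sequence):
        for i in range(len(sequence)):
            for d in range(self.max_depth + 1):
                if i - d < 0:
                    break
                ctx = tuple(sequence[i - d:i])
                self.counts[ctx][sequence[i]] += 1

    def predict(self, history):
        # Back off from the longest matching context to shorter ones.
        for d in range(min(self.max_depth, len(history)), -1, -1):
            ctx = tuple(history[len(history) - d:])
            if ctx in self.counts:
                successors = self.counts[ctx]
                return max(successors, key=successors.get)
        return None  # untrained model

# Example: a repeated gesture cycle learned from labelled data.
model = VLMM(max_depth=3)
model.train(["point", "grab", "drag", "drop"] * 2)
print(model.predict(["grab", "drag"]))  # → drop
```

In the paper this high-level prediction is what lets the tracker switch between posture models for distinct gestures; the continuous posture estimation itself is done by annealed particle filtering, which is not sketched here.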

Cite

Text

Stefanov et al. "Real-Time Hand Tracking with Variable-Length Markov Models of Behaviour." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2005. doi:10.1109/CVPR.2005.518

Markdown

[Stefanov et al. "Real-Time Hand Tracking with Variable-Length Markov Models of Behaviour." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2005.](https://mlanthology.org/cvpr/2005/stefanov2005cvpr-real/) doi:10.1109/CVPR.2005.518

BibTeX

@inproceedings{stefanov2005cvpr-real,
  title     = {{Real-Time Hand Tracking with Variable-Length Markov Models of Behaviour}},
  author    = {Stefanov, Nikolay and Galata, Aphrodite and Hubbold, Roger J.},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2005},
  pages     = {73},
  doi       = {10.1109/CVPR.2005.518},
  url       = {https://mlanthology.org/cvpr/2005/stefanov2005cvpr-real/}
}