Classifying Hand Gestures with a View-Based Distributed Representation
Abstract
We present a method for learning, tracking, and recognizing human hand gestures recorded by a conventional CCD camera without any special gloves or other sensors. A view-based representation is used to model aspects of the hand relevant to the trained gestures, and is found using an unsupervised clustering technique. We use normalized correlation networks, with dynamic time warping in the temporal domain, as a distance function for unsupervised clustering. Views are computed separably for space and time dimensions; the distributed response of the combination of these units characterizes the input data with a low dimensional representation. A supervised classification stage uses labeled outputs of the spatio-temporal units as training data. Our system can correctly classify gestures in real time with a low-cost image processing accelerator.
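The abstract combines two standard building blocks: normalized correlation as a spatial similarity measure, and dynamic time warping (DTW) to compare gesture sequences of different lengths. The sketch below is not the authors' code; it is a minimal illustration of how a DTW distance can use (1 − normalized correlation) as its per-frame cost, with all function names and array shapes chosen for illustration.

```python
import numpy as np

def normalized_correlation(a, b):
    """Normalized cross-correlation between two equal-size, mean-removed patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def dtw_distance(seq_a, seq_b):
    """DTW alignment cost between two sequences of frames,
    using (1 - normalized correlation) as the local frame distance."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 1.0 - normalized_correlation(seq_a[i - 1], seq_b[j - 1])
            # Standard DTW recurrence: extend the cheapest of the
            # three admissible predecessor paths.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])
```

An identical pair of sequences yields a distance of zero, since normalized correlation of a patch with itself is 1 and the diagonal warping path accumulates no cost.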
Cite

Text:
Darrell and Pentland. "Classifying Hand Gestures with a View-Based Distributed Representation." Neural Information Processing Systems, 1993.

Markdown:
[Darrell and Pentland. "Classifying Hand Gestures with a View-Based Distributed Representation." Neural Information Processing Systems, 1993.](https://mlanthology.org/neurips/1993/darrell1993neurips-classifying/)

BibTeX:
@inproceedings{darrell1993neurips-classifying,
  title = {{Classifying Hand Gestures with a View-Based Distributed Representation}},
  author = {Darrell, Trevor J. and Pentland, Alex P.},
  booktitle = {Neural Information Processing Systems},
  year = {1993},
  pages = {945--952},
  url = {https://mlanthology.org/neurips/1993/darrell1993neurips-classifying/}
}