Interacting with a Pet Robot Using Hand Gestures

Abstract

…t number of displaced pixels. The motion of the hand's centroid is tracked in real time as new image frames are processed. Our system assumes that once the hand's velocity exceeds a certain threshold, the user has started a gesture. As the hand moves, the horizontal and vertical displacements (dx, dy) of the hand's centroid are stored in a feature vector until the hand pauses for 2-3 seconds. To recognize a gesture, we analyze the feature vector. For linear gestures, the (dx, dy) displacements cluster around fixed axes in the dx-dy plane: vertical gestures around the dy axis, horizontal gestures around the dx axis, and diagonal gestures around the two bisecting axes (45° with respect to the dx-dy axes). The direction of motion is determined by the side of the axis (positive/negative) on which clustering occurs. For circular gestures, the centroid of these displacements coi…
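The linear-gesture classification described above can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation: the function name, the tolerance parameter, and the assumption that positive dy means upward motion (image coordinates often invert this) are all choices made here for the example. It checks which of the eight axis directions the accumulated (dx, dy) displacements cluster around, using the mean displacement angle.

```python
import math

def classify_linear_gesture(displacements, tol_deg=15.0):
    """Classify a linear gesture from per-frame (dx, dy) centroid displacements.

    displacements: list of (dx, dy) tuples collected between the moment the
    hand's velocity exceeds the start threshold and the 2-3 s pause that ends
    the gesture (segmentation itself is not shown here).

    Returns one of: "right", "left", "up", "down", "diag-ur", "diag-ul",
    "diag-dr", "diag-dl", or None if no axis cluster is found.
    """
    # Clustering around a fixed axis in the dx-dy plane shows up as a
    # consistent mean direction; the side of the axis (sign of the sums)
    # is captured by atan2, which distinguishes e.g. "up" from "down".
    sx = sum(d[0] for d in displacements)
    sy = sum(d[1] for d in displacements)
    angle = math.degrees(math.atan2(sy, sx)) % 360.0

    # Canonical directions: 0/90/180/270 along the dx and dy axes,
    # 45/135/225/315 along the two bisecting (diagonal) axes.
    labels = {0: "right", 45: "diag-ur", 90: "up", 135: "diag-ul",
              180: "left", 225: "diag-dl", 270: "down", 315: "diag-dr"}
    for axis, label in labels.items():
        diff = min(abs(angle - axis), 360.0 - abs(angle - axis))
        if diff <= tol_deg:
            return label
    return None
```

A circular gesture would not match any single axis: its displacements sweep through all directions, so the sums sx and sy stay near zero, which is consistent with the abstract's (truncated) remark about the centroid of the displacements.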

Cite

Text

Moy. "Interacting with a Pet Robot Using Hand Gestures." AAAI Conference on Artificial Intelligence, 1999.

Markdown

[Moy. "Interacting with a Pet Robot Using Hand Gestures." AAAI Conference on Artificial Intelligence, 1999.](https://mlanthology.org/aaai/1999/moy1999aaai-interacting/)

BibTeX

@inproceedings{moy1999aaai-interacting,
  title     = {{Interacting with a Pet Robot Using Hand Gestures}},
  author    = {Moy, Milyn C.},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {1999},
  pages     = {974},
  url       = {https://mlanthology.org/aaai/1999/moy1999aaai-interacting/}
}