Nonlinear PHMMs for the Interpretation of Parameterized Gesture

Abstract

Recently we modified the hidden Markov model (HMM) framework to incorporate a global parametric variation in the output probabilities of the states of the HMM. Development of the parametric hidden Markov model (PHMM) was motivated by the task of simultaneously recognizing and interpreting gestures that exhibit meaningful variation. With standard HMMs, such global variation confounds the recognition process. The original PHMM approach assumes a linear dependence of output density means on the global parameter. In this paper we extend the PHMM to handle arbitrary smooth (nonlinear) dependencies. We show a generalized expectation-maximization (GEM) algorithm for training the PHMM and a GEM algorithm to simultaneously recognize the gesture and estimate the value of the parameter. We present results on a pointing gesture, where the nonlinear approach permits the natural azimuth/elevation parameterization of pointing direction.
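The core idea in the abstract can be sketched as follows: in a PHMM, each state's output-density mean is a function of a global gesture parameter θ (for pointing, azimuth/elevation). This is a minimal illustrative sketch, not the paper's exact formulation; the names `W`, `mu_bar`, and the tanh hidden layer standing in for the "arbitrary smooth dependence" are assumptions.

```python
import numpy as np

def linear_mean(theta, W, mu_bar):
    """Original linear PHMM: a state's output mean depends
    linearly on the global parameter theta."""
    return W @ theta + mu_bar

def nonlinear_mean(theta, W1, b1, W2, b2):
    """Nonlinear extension: the mean is a smooth nonlinear
    function of theta, sketched here as a one-hidden-layer
    tanh network (an illustrative choice, not the paper's)."""
    return W2 @ np.tanh(W1 @ theta + b1) + b2

# Example: theta = (azimuth, elevation) of a pointing gesture,
# producing a 3-D output mean for one state.
theta = np.array([0.3, -0.1])
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
mu_bar = np.zeros(3)
print(linear_mean(theta, W, mu_bar))  # -> [ 0.3 -0.1  0.1]
```

Training fits these state-wise maps alongside the usual HMM parameters (via GEM in the nonlinear case), and recognition runs a second GEM loop to estimate θ for an observed gesture.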

Cite

Text

Wilson and Bobick. "Nonlinear PHMMs for the Interpretation of Parameterized Gesture." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 1998. doi:10.1109/CVPR.1998.698708

Markdown

[Wilson and Bobick. "Nonlinear PHMMs for the Interpretation of Parameterized Gesture." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 1998.](https://mlanthology.org/cvpr/1998/wilson1998cvpr-nonlinear/) doi:10.1109/CVPR.1998.698708

BibTeX

@inproceedings{wilson1998cvpr-nonlinear,
  title     = {{Nonlinear PHMMs for the Interpretation of Parameterized Gesture}},
  author    = {Wilson, Andrew D. and Bobick, Aaron F.},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {1998},
  pages     = {879--884},
  doi       = {10.1109/CVPR.1998.698708},
  url       = {https://mlanthology.org/cvpr/1998/wilson1998cvpr-nonlinear/}
}