Real-Time Body Tracking Using a Gaussian Process Latent Variable Model
Abstract
In this paper, we present a tracking framework for capturing articulated human motion in real time, without the need for attaching markers to the subject's body. This is achieved by first obtaining a low-dimensional representation of the training motion data, using a nonlinear dimensionality reduction technique called back-constrained GPLVM. A prior dynamics model is then learnt from this low-dimensional representation by partitioning the motion sequences into elementary movements using an unsupervised EM clustering algorithm. The temporal dependencies between these elementary movements are efficiently captured by a Variable Length Markov Model. The learnt dynamics model is used to bias the propagation of candidate pose feature vectors in the low-dimensional space. By combining this with an efficient volumetric reconstruction algorithm, our framework can quickly evaluate each candidate pose against image evidence captured from multiple views. We present results showing that our system can accurately track complex structured activities, such as ballet dancing, in real time. ©2007 IEEE.
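The core tracking loop described in the abstract — propagating candidate poses in a learnt latent space, biased by a dynamics model over elementary movements, then weighting them against image evidence — can be sketched as a simple particle-filter step. The sketch below is illustrative only: the cluster means, the first-order transition matrix (a simplification of the paper's Variable Length Markov Model), and the toy Gaussian likelihood (standing in for the paper's volumetric multi-view evaluation) are all hypothetical placeholders, not the authors' actual learnt quantities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for quantities learnt from training data:
# means of elementary-movement clusters in a 2-D latent space, and a
# first-order transition matrix (simplifying the paper's VLMM).
cluster_means = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 2.0]])
transitions = np.array([[0.7, 0.2, 0.1],
                        [0.1, 0.7, 0.2],
                        [0.2, 0.1, 0.7]])

def propagate(particles, states, noise=0.15):
    """Bias each candidate latent point towards the mean of the next
    elementary movement sampled from the transition model."""
    n = len(particles)
    next_states = np.array([rng.choice(3, p=transitions[s]) for s in states])
    targets = cluster_means[next_states]
    # Move part-way towards the predicted cluster, plus diffusion noise.
    return (particles + 0.5 * (targets - particles)
            + rng.normal(scale=noise, size=(n, 2))), next_states

def likelihood(particles, observation):
    """Toy image-evidence score: a Gaussian around a reference latent
    point. The paper instead scores each candidate pose against a
    volumetric reconstruction built from multiple camera views."""
    d2 = np.sum((particles - observation) ** 2, axis=1)
    return np.exp(-d2 / 0.5)

# One filtering step over 100 candidate poses.
particles = rng.normal(scale=0.3, size=(100, 2))
states = np.zeros(100, dtype=int)
particles, states = propagate(particles, states)
w = likelihood(particles, np.array([1.0, 0.5]))
w /= w.sum()
estimate = (w[:, None] * particles).sum(axis=0)  # weighted-mean latent pose
```

Because the heavy lifting (pose evaluation) happens in the low-dimensional latent space with a dynamics prior pruning implausible candidates, far fewer hypotheses need to be scored per frame than in full joint-angle space, which is what makes real-time operation feasible.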
Cite
Text
Hou et al. "Real-Time Body Tracking Using a Gaussian Process Latent Variable Model." IEEE/CVF International Conference on Computer Vision, 2007. doi:10.1109/ICCV.2007.4408946

Markdown
[Hou et al. "Real-Time Body Tracking Using a Gaussian Process Latent Variable Model." IEEE/CVF International Conference on Computer Vision, 2007.](https://mlanthology.org/iccv/2007/hou2007iccv-real/) doi:10.1109/ICCV.2007.4408946

BibTeX
@inproceedings{hou2007iccv-real,
title = {{Real-Time Body Tracking Using a Gaussian Process Latent Variable Model}},
author = {Hou, Shaobo and Galata, Aphrodite and Caillette, Fabrice and Thacker, Neil A. and Bromiley, Paul A.},
booktitle = {IEEE/CVF International Conference on Computer Vision},
year = {2007},
pages = {1--8},
doi = {10.1109/ICCV.2007.4408946},
url = {https://mlanthology.org/iccv/2007/hou2007iccv-real/}
}