Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures
Abstract
This paper presents a system for inferring complex mental states in real time from video of facial expressions and head gestures. The system is based on a multi-level dynamic Bayesian network classifier that models complex mental states as a number of interacting facial and head displays, identified from component-based facial features. Experimental results are reported for six mental state groups: agreement, concentrating, disagreement, interested, thinking, and unsure. Real-time performance, unobtrusiveness, and the lack of preprocessing make the system particularly suitable for user-independent human-computer interaction.
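The multi-level idea can be illustrated with a minimal sketch: per-frame recognitions of head and facial displays feed a recursive Bayesian update over mental states. The display vocabulary, state set, and likelihood values below are hypothetical placeholders, not the paper's actual parameters, and the single naive-Bayes update stands in for the full dynamic Bayesian network.

```python
import numpy as np

# Hypothetical display and mental-state vocabularies; the paper's
# actual sets of displays and states are larger.
DISPLAYS = ["head_nod", "head_shake", "smile", "brow_raise"]
STATES = ["agreement", "disagreement", "thinking"]

# Assumed P(display present | mental state); illustrative values only.
LIKELIHOOD = np.array([
    # nod   shake  smile  brow
    [0.80, 0.05, 0.60, 0.30],   # agreement
    [0.10, 0.80, 0.10, 0.40],   # disagreement
    [0.20, 0.10, 0.10, 0.60],   # thinking
])

def update(prior, display_obs, alpha=0.9):
    """One recursive Bayesian update over mental states.

    display_obs: binary vector, 1 if the display was recognised this frame.
    alpha: temporal smoothing weight on the previous posterior.
    """
    # Likelihood of the observed display pattern under each state,
    # treating displays as conditionally independent given the state.
    like = np.prod(np.where(display_obs, LIKELIHOOD, 1.0 - LIKELIHOOD), axis=1)
    post = like * (alpha * prior + (1.0 - alpha) / len(prior))
    return post / post.sum()

posterior = np.ones(len(STATES)) / len(STATES)  # uniform prior
for obs in ([1, 0, 1, 0], [1, 0, 1, 0]):        # two frames with nod + smile
    posterior = update(posterior, np.array(obs))
print(STATES[int(np.argmax(posterior))])        # agreement dominates
```

Chaining the posterior through successive frames is what gives the inference its temporal character; the paper's classifier additionally models the dynamics of each display itself.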
Cite
Text
El Kaliouby and Robinson. "Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2004. doi:10.1109/CVPR.2004.427
Markdown
[El Kaliouby and Robinson. "Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2004.](https://mlanthology.org/cvpr/2004/kaliouby2004cvpr-real/) doi:10.1109/CVPR.2004.427
BibTeX
@inproceedings{kaliouby2004cvpr-real,
title = {{Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures}},
author = {El Kaliouby, Rana and Robinson, Peter},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year = {2004},
pages = {154},
doi = {10.1109/CVPR.2004.427},
url = {https://mlanthology.org/cvpr/2004/kaliouby2004cvpr-real/}
}