Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures

Abstract

This paper presents a system for inferring complex mental states in real time from video of facial expressions and head gestures. The system is based on a multi-level dynamic Bayesian network classifier that models complex mental states as a number of interacting facial and head displays, identified from component-based facial features. Experimental results are reported for six mental-state groups: agreement, concentrating, disagreement, interested, thinking, and unsure. Real-time performance, unobtrusiveness, and lack of preprocessing make the system particularly suitable for user-independent human-computer interaction.
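The multi-level idea described in the abstract can be illustrated with a minimal sketch: a lower level produces a likelihood for the observed facial/head displays under each mental-state class, and an upper level runs a standard dynamic Bayesian network forward-filtering step over those classes. The state names follow the six groups listed above; the transition matrix and likelihood values are purely illustrative, not the paper's learned parameters.

```python
import numpy as np

# The six mental-state groups reported in the paper.
STATES = ["agreement", "concentrating", "disagreement",
          "interested", "thinking", "unsure"]

# Illustrative transition model P(state_t | state_{t-1}):
# mostly self-persistent, with a small chance of switching.
T = np.full((6, 6), 0.04)
np.fill_diagonal(T, 0.80)

def forward_step(belief, likelihood, transition=T):
    """One DBN filtering step: predict with the transition model,
    then reweight by the likelihood of the observed displays."""
    predicted = transition.T @ belief
    posterior = predicted * likelihood
    return posterior / posterior.sum()

# Toy frame: start from a uniform belief and observe display
# evidence (e.g. a head nod) that strongly favours "agreement".
belief = np.full(6, 1 / 6)
likelihood = np.array([0.60, 0.10, 0.02, 0.10, 0.08, 0.10])
belief = forward_step(belief, likelihood)
top_state = STATES[int(np.argmax(belief))]
```

Repeating `forward_step` once per video frame yields the kind of real-time, frame-by-frame inference the abstract describes, with the belief vector carrying temporal context between frames.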

Cite

Text

El Kaliouby and Robinson. "Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2004. doi:10.1109/CVPR.2004.427

Markdown

[El Kaliouby and Robinson. "Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2004.](https://mlanthology.org/cvprw/2004/kaliouby2004cvprw-realtime/) doi:10.1109/CVPR.2004.427

BibTeX

@inproceedings{kaliouby2004cvprw-realtime,
  title     = {{Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures}},
  author    = {El Kaliouby, Rana and Robinson, Peter},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2004},
  pages     = {154},
  doi       = {10.1109/CVPR.2004.427},
  url       = {https://mlanthology.org/cvprw/2004/kaliouby2004cvprw-realtime/}
}