Classifying Facial Action
Abstract
The Facial Action Coding System (FACS), devised by Ekman and Friesen (1978), provides an objective means for measuring the facial muscle contractions involved in a facial expression. In this paper, we approach automated facial expression analysis by detecting and classifying facial actions. We generated a database of over 1100 image sequences of 24 subjects performing over 150 distinct facial actions or action combinations. We compare three different approaches to classifying the facial actions in these images: holistic spatial analysis based on principal components of graylevel images; explicit measurement of local image features such as wrinkles; and template matching with motion flow fields. On a dataset containing six individual actions and 20 subjects, these methods achieved 89%, 57%, and 85% performance, respectively, for generalization to novel subjects. When combined, performance improved to 92%.
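The first of the three approaches, holistic spatial analysis, projects graylevel images onto their principal components and classifies in the reduced space. A minimal sketch of that idea, using synthetic data and a nearest-neighbor classifier (the paper's actual preprocessing and classifier details are not reproduced here):

```python
# Sketch of holistic spatial analysis: compute principal components of
# graylevel images, project onto them, and classify by nearest neighbor.
# Data here is synthetic; this is an illustration, not the authors' pipeline.
import numpy as np

rng = np.random.default_rng(0)

def pca_basis(X, n_components):
    """Return the top principal components of row-vector images X."""
    X_centered = X - X.mean(axis=0)
    # Rows of Vt are the principal axes, ordered by singular value.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return Vt[:n_components]

def nearest_neighbor_classify(train_proj, train_labels, test_proj):
    """Label each test projection with its nearest training example's label."""
    dists = np.linalg.norm(test_proj[:, None, :] - train_proj[None, :, :], axis=2)
    return train_labels[dists.argmin(axis=1)]

# Two synthetic "facial actions": distinct prototype patterns plus noise
# (each image flattened to a 256-dimensional graylevel vector).
n_per_class, dim = 30, 256
proto = rng.normal(size=(2, dim))
X = np.vstack([proto[c] + 0.3 * rng.normal(size=(n_per_class, dim))
               for c in (0, 1)])
y = np.repeat([0, 1], n_per_class)

components = pca_basis(X, n_components=10)
proj = (X - X.mean(axis=0)) @ components.T

# Hold out the last 5 examples of each class for testing.
test_idx = np.r_[25:30, 55:60]
train_idx = np.setdiff1d(np.arange(2 * n_per_class), test_idx)
preds = nearest_neighbor_classify(proj[train_idx], y[train_idx], proj[test_idx])
accuracy = (preds == y[test_idx]).mean()
```

With well-separated prototypes, the projected representation preserves the class structure and the nearest-neighbor rule recovers the labels; the paper's harder setting is generalization to novel subjects rather than held-out trials.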
Cite

Text:
Bartlett et al. "Classifying Facial Action." Neural Information Processing Systems, 1995.

Markdown:
[Bartlett et al. "Classifying Facial Action." Neural Information Processing Systems, 1995.](https://mlanthology.org/neurips/1995/bartlett1995neurips-classifying/)

BibTeX:
@inproceedings{bartlett1995neurips-classifying,
title = {{Classifying Facial Action}},
author = {Bartlett, Marian Stewart and Viola, Paul A. and Sejnowski, Terrence J. and Golomb, Beatrice A. and Larsen, Jan and Hager, Joseph C. and Ekman, Paul},
booktitle = {Neural Information Processing Systems},
year = {1995},
pages = {823-829},
url = {https://mlanthology.org/neurips/1995/bartlett1995neurips-classifying/}
}