An Attention-Based Method for Multi-Label Facial Action Unit Detection

Abstract

The Facial Action Coding System (FACS) is an approach for modeling the complexity of human emotional expression, and automatic action unit (AU) detection is a crucial research area in human-computer interaction. This paper describes our submission to the third Affective Behavior Analysis in-the-wild (ABAW) competition, 2022. We propose a method for detecting facial action units in video. In the first stage, a lightweight CNN-based feature extractor extracts a feature map from each video frame. An attention module is then applied to refine the attention map, and the attention-encoded vector is derived as a weighted sum of the feature map and the attention scores. Finally, a sigmoid function is used at the output layer to make the prediction suitable for multi-label AU detection. We achieved a macro F1 score of 0.48 on the validation set and 0.4206 on the test set, compared to 0.39 and 0.3650 for the ABAW challenge baseline model.
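As a rough illustration of the pipeline described in the abstract, the PyTorch sketch below shows a CNN feature map pooled by spatial attention scores and passed to a sigmoid multi-label head. The MobileNetV3-Small backbone, the number of AUs (12), and the 1x1-convolution attention module are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
from torchvision.models import mobilenet_v3_small

class AttentionAUDetector(nn.Module):
    # Per-frame AU detector: CNN feature map -> spatial attention ->
    # attention-weighted pooling -> sigmoid multi-label output.
    def __init__(self, num_aus: int = 12, feat_dim: int = 576):
        super().__init__()
        # Lightweight CNN feature extractor (hypothetical backbone choice).
        self.backbone = mobilenet_v3_small(weights=None).features
        # Attention module: one score per spatial location of the feature map.
        self.attention = nn.Conv2d(feat_dim, 1, kernel_size=1)
        # Multi-label head; sigmoid keeps each AU prediction independent.
        self.classifier = nn.Linear(feat_dim, num_aus)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (B, 3, H, W) cropped face images from video frames.
        fmap = self.backbone(frames)                         # (B, C, h, w)
        scores = self.attention(fmap)                        # (B, 1, h, w)
        weights = torch.softmax(scores.flatten(2), dim=-1)   # normalize over h*w
        # Attention-encoded vector: weighted sum over feature-map locations.
        encoded = (fmap.flatten(2) * weights).sum(dim=-1)    # (B, C)
        return torch.sigmoid(self.classifier(encoded))       # (B, num_aus) in [0, 1]

model = AttentionAUDetector()
probs = model(torch.randn(2, 3, 224, 224))  # per-AU probabilities for two frames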

Cite

Text

Le Hoai et al. "An Attention-Based Method for Multi-Label Facial Action Unit Detection." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2022. doi:10.1109/CVPRW56347.2022.00274

Markdown

[Le Hoai et al. "An Attention-Based Method for Multi-Label Facial Action Unit Detection." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2022.](https://mlanthology.org/cvprw/2022/hoai2022cvprw-attentionbased/) doi:10.1109/CVPRW56347.2022.00274

BibTeX

@inproceedings{hoai2022cvprw-attentionbased,
  title     = {{An Attention-Based Method for Multi-Label Facial Action Unit Detection}},
  author    = {Le Hoai, Duy and Lim, Eunchae and Choi, Eunbin and Kim, Sieun and Pant, Sudarshan and Lee, Guee-Sang and Kim, Soo-Hyung and Yang, Hyung-Jeong},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2022},
  pages     = {2453--2458},
  doi       = {10.1109/CVPRW56347.2022.00274},
  url       = {https://mlanthology.org/cvprw/2022/hoai2022cvprw-attentionbased/}
}