Affective Behavior Analysis Using Task-Adaptive and AU-Assisted Graph
Abstract
In this work, we present our solution and experimental results for the Multi-Task Learning (MTL) Challenge of the 7th Affective Behavior Analysis in-the-wild (ABAW7) Competition. This challenge consists of three tasks: Action Unit (AU) detection, Facial Expression (EXPR) recognition, and Valence-Arousal (VA) estimation. We address the above tasks from three aspects: 1) To learn robust facial feature representations, we first exploit the pre-trained large model DINOv2 to encode rich facial expressions; 2) We design a task-adaptive module (TAM) to learn discriminative feature representations for each task in a self-adaptive manner. More specifically, we construct a set of learnable query vectors to capture task-specific representations via cross-attention learning; 3) We propose the AU-assisted Graph (AUG) module to capture the inherent correlations among AUs and then apply it to assist in solving the EXPR and VA tasks. As a result, our method achieves a score of 1.2542 on the validation set and 1.1640 on the test set, ranking 4th in the MTL Challenge.
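The core of the TAM described above is a set of learnable query vectors that pool task-specific information from shared backbone features via cross-attention. The sketch below is a simplified, pure-Python illustration of that idea (single head, no learned key/value projections, toy dimensions); the function and variable names are hypothetical and do not come from the authors' code, which operates on DINOv2 feature tokens inside a trained network.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross_attention(queries, features):
    """Each task query attends over the shared feature tokens and
    returns one attention-pooled, task-specific vector per query."""
    d = len(features[0])
    outputs = []
    for q in queries:
        # Scaled dot-product attention scores against every token.
        scores = [dot(q, f) / math.sqrt(d) for f in features]
        weights = softmax(scores)
        # Convex combination of the feature tokens.
        pooled = [sum(w * f[i] for w, f in zip(weights, features))
                  for i in range(d)]
        outputs.append(pooled)
    return outputs

# Toy usage: 3 task queries (e.g. AU / EXPR / VA) over 4 feature tokens.
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]
task_queries = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
task_repr = cross_attention(task_queries, feats)
```

Because the attention weights sum to one, each task representation is a convex combination of the shared tokens, so each task can emphasize different regions of the same feature set without separate backbones.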
Cite

Text

Li et al. "Affective Behavior Analysis Using Task-Adaptive and AU-Assisted Graph." European Conference on Computer Vision Workshops, 2024. doi:10.1007/978-3-031-91581-9_28

Markdown

[Li et al. "Affective Behavior Analysis Using Task-Adaptive and AU-Assisted Graph." European Conference on Computer Vision Workshops, 2024.](https://mlanthology.org/eccvw/2024/li2024eccvw-affective/) doi:10.1007/978-3-031-91581-9_28

BibTeX
@inproceedings{li2024eccvw-affective,
  title = {{Affective Behavior Analysis Using Task-Adaptive and AU-Assisted Graph}},
  author = {Li, Xiaodong and Du, Wenchao and Yang, Hongyu},
  booktitle = {European Conference on Computer Vision Workshops},
  year = {2024},
  pages = {393-403},
  doi = {10.1007/978-3-031-91581-9_28},
  url = {https://mlanthology.org/eccvw/2024/li2024eccvw-affective/}
}