Deep Multimodal Emotion Recognition Using Modality Aware Attention Network for Unifying Representations in Neural Models
Abstract
This paper introduces a multi-modal emotion recognition system aimed at enhancing emotion recognition by integrating representations from physiological signals. To accomplish this goal, we introduce a modality-aware attention network that extracts emotion-specific features by aligning the representation spaces of the various modalities into a unified space. Through a series of experiments and visualizations conducted on the AMIGOS dataset, we demonstrate the efficacy of the proposed methodology for emotion classification, highlighting its capability to provide comprehensive representations of physiological signals.
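The abstract does not specify the architecture, but the core idea of attention-based fusion across physiological modalities can be sketched as follows. This is a minimal illustration, not the paper's method: the modality names (EEG, ECG, GSR), the embedding dimension, and the random scoring vector are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality embeddings (e.g., EEG, ECG, GSR), each already
# projected into a shared 8-dimensional space. The actual modalities and
# dimensions used by the paper are not given in the abstract.
modalities = {name: rng.standard_normal(8) for name in ("eeg", "ecg", "gsr")}


def modality_attention(features, w):
    """Fuse per-modality features into one vector via attention weights.

    `w` stands in for a learnable scoring vector; here it is random,
    purely for illustration.
    """
    names = list(features)
    x = np.stack([features[n] for n in names])  # shape (num_modalities, dim)
    scores = x @ w                              # one scalar score per modality
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                    # softmax over modalities
    fused = weights @ x                         # weighted sum: unified representation
    return fused, dict(zip(names, weights))


w = rng.standard_normal(8)
fused, attn = modality_attention(modalities, w)
```

The attention weights sum to one, so the fused vector is a convex combination of the modality embeddings; a downstream classifier would then operate on this unified representation.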
Cite
Text
Woo et al. "Deep Multimodal Emotion Recognition Using Modality Aware Attention Network for Unifying Representations in Neural Models." NeurIPS 2023 Workshops: UniReps, 2023.
Markdown
[Woo et al. "Deep Multimodal Emotion Recognition Using Modality Aware Attention Network for Unifying Representations in Neural Models." NeurIPS 2023 Workshops: UniReps, 2023.](https://mlanthology.org/neuripsw/2023/woo2023neuripsw-deep/)
BibTeX
@inproceedings{woo2023neuripsw-deep,
title = {{Deep Multimodal Emotion Recognition Using Modality Aware Attention Network for Unifying Representations in Neural Models}},
author = {Woo, Sungpil and Zubair, Muhammad and Lim, Sunhwan and Kim, Daeyoung},
booktitle = {NeurIPS 2023 Workshops: UniReps},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/woo2023neuripsw-deep/}
}