Predicting Perceived Music Emotions with Respect to Instrument Combinations
Abstract
Music Emotion Recognition has attracted considerable academic research in recent years because of its wide range of applications, including song recommendation and music visualization. As music is a way for humans to express emotion, there is a need for machines to automatically infer the perceived emotion of a piece of music. In this paper, we compare the accuracy difference between music emotion recognition models given music pieces as a whole versus music pieces separated by instrument. To compare the models' emotion predictions, which are distributions over valence and arousal values, we provide a metric that compares two distribution curves. Using this metric, we provide empirical evidence that training a Random Forest and a Convolutional Recurrent Neural Network with mixed instrumental music data conveys a better understanding of emotion than training the same models with music that is separated into individual instrumental sources.
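The abstract does not spell out the paper's metric, but comparing two distribution curves over the same value range is commonly done with a measure such as the 1D earth mover's (Wasserstein) distance. The sketch below is a generic illustration of that idea, not the paper's actual metric; the bin values and distributions are hypothetical.

```python
def emd_1d(p, q, bin_width=1.0):
    """Earth mover's distance between two discrete distributions
    defined over the same evenly spaced bins.

    For 1D histograms this reduces to summing the absolute
    differences of the cumulative distributions, scaled by bin width.
    """
    assert len(p) == len(q), "distributions must share the same bins"
    total, cum = 0.0, 0.0
    for pi, qi in zip(p, q):
        cum += pi - qi          # running difference of the CDFs
        total += abs(cum) * bin_width
    return total

# Two toy distributions over five valence bins (illustrative only)
p = [0.1, 0.2, 0.4, 0.2, 0.1]
q = [0.0, 0.1, 0.3, 0.4, 0.2]
print(emd_1d(p, q))  # → 0.7
```

A smaller distance means the predicted distribution more closely matches the reference; the distance is zero only when the two curves coincide.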
Cite
Text
Nguyen et al. "Predicting Perceived Music Emotions with Respect to Instrument Combinations." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I13.26910
Markdown
[Nguyen et al. "Predicting Perceived Music Emotions with Respect to Instrument Combinations." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/nguyen2023aaai-predicting/) doi:10.1609/AAAI.V37I13.26910
BibTeX
@inproceedings{nguyen2023aaai-predicting,
title = {{Predicting Perceived Music Emotions with Respect to Instrument Combinations}},
author = {Nguyen, Viet Dung and Nguyen, Quan H. and Freedman, Richard G.},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
pages = {16078-16086},
doi = {10.1609/AAAI.V37I13.26910},
url = {https://mlanthology.org/aaai/2023/nguyen2023aaai-predicting/}
}