ComMU: Dataset for Combinatorial Music Generation
Abstract
Commercial adoption of automatic music composition requires the capability of generating diverse, high-quality music suitable for the desired context (e.g., music for romantic movies, action games, restaurants, etc.). In this paper, we introduce combinatorial music generation, a new task to create varying background music based on given conditions. Combinatorial music generation creates short samples of music with rich musical metadata and combines them to produce complete music. In addition, we introduce ComMU, the first symbolic music dataset consisting of short music samples and their corresponding 12 types of musical metadata for combinatorial music generation. Notable properties of ComMU are that (1) the dataset is manually constructed by professional composers following an objective guideline that induces regularity, and (2) it has 12 musical metadata attributes that embrace composers' intentions. Our results show that we can generate diverse, high-quality music with metadata alone, and that our unique metadata, such as track-role and extended chord quality, improve the capacity of automatic composition. We highly recommend watching our video before reading the paper (https://pozalabs.github.io/ComMU/).
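As a rough illustration of the idea described above, the sketch below shows how short, metadata-tagged samples might be overlaid into one multi-track piece. All names and fields here are hypothetical simplifications for illustration, not the actual ComMU schema or pipeline.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Sample:
    """A short symbolic-music sample with a few illustrative metadata fields."""
    track_role: str                     # e.g., "main_melody", "accompaniment", "bass"
    bpm: int                            # tempo in beats per minute
    audio_key: str                      # e.g., "cmajor"
    notes: List[Tuple[int, int, int]]   # (onset_tick, pitch, duration_tick)

def combine(samples: List[Sample]) -> List[Tuple[int, int, int, str]]:
    """Overlay samples that share tempo and key into one multi-track note list.

    Returns a flat, time-sorted list of notes, each tagged with its track role.
    """
    # Samples must agree on tempo and key to be musically combinable.
    assert len({(s.bpm, s.audio_key) for s in samples}) == 1, \
        "samples must share bpm and key"
    merged = [
        (onset, pitch, dur, s.track_role)
        for s in samples
        for (onset, pitch, dur) in s.notes
    ]
    return sorted(merged)

# Example: a melody sample and a bass sample combined into one piece.
melody = Sample("main_melody", 120, "cmajor", [(0, 72, 240), (240, 74, 240)])
bass = Sample("bass", 120, "cmajor", [(0, 48, 480)])
piece = combine([melody, bass])
```

This is only a minimal sketch; the real task additionally conditions generation on the full set of 12 metadata attributes rather than merging pre-existing notes.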
Cite
Text
Lee et al. "ComMU: Dataset for Combinatorial Music Generation." Neural Information Processing Systems, 2022.
Markdown
[Lee et al. "ComMU: Dataset for Combinatorial Music Generation." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/lee2022neurips-commu/)
BibTeX
@inproceedings{lee2022neurips-commu,
  title     = {{ComMU: Dataset for Combinatorial Music Generation}},
  author    = {Lee, Hyun and Kim, Taehyun and Kang, Hyolim and Ki, Minjoo and Hwang, Hyeonchan and Park, Kwanho and Han, Sharang and Kim, Seon Joo},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/lee2022neurips-commu/}
}