MoMusic: A Motion-Driven Human-AI Collaborative Music Composition and Performing System

Abstract

The significant development of artificial neural network architectures has facilitated the increasing adoption of automated music composition models over the past few years. However, most existing systems rely on algorithmic generative structures built from hard-coded, predefined rules, generally excluding interactive or improvised behaviors. We propose MoMusic, a motion-based, AI-driven real-time music generation system. MoMusic features a partially randomized harmonic sequencing model based on a probabilistic analysis of tonal chord progressions, mathematically abstracted through musical set theory. This model is mapped onto a two-dimensional grid that produces sound through a posture recognition mechanism. A camera captures the user's finger movements and trajectories, creating coherent, partially improvised harmonic progressions. MoMusic integrates several timbrical registers, from traditional classical instruments such as the piano to a new "human voice instrument" created using a voice conversion technique. Our research demonstrates MoMusic's interactiveness, its ability to inspire musicians, and its capacity to generate coherent musical material across various timbrical registers. MoMusic's capabilities could easily be expanded to incorporate different forms of posture-controlled timbrical transformation, rhythmic transformation, dynamic transformation, or even digital sound processing techniques.
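
The abstract does not spell out the harmonic model itself. Purely as an illustration, the minimal Python sketch below shows one way a partially randomized, probability-driven chord sequencer biased by a tracked grid position could be organized; the chord labels, transition weights, and grid-to-tension mapping are hypothetical assumptions for this sketch, not the authors' model.

import random

# Hypothetical illustration only: a partially randomized tonal chord
# sequencer in C major, loosely in the spirit of a probabilistic
# progression model driven by a 2D grid position. All weights are
# assumed values, not taken from the paper.
TRANSITIONS = {
    "I":    {"IV": 0.3, "V": 0.3, "vi": 0.2, "ii": 0.2},
    "ii":   {"V": 0.6, "vii0": 0.2, "IV": 0.2},
    "IV":   {"V": 0.4, "I": 0.3, "ii": 0.3},
    "V":    {"I": 0.6, "vi": 0.3, "IV": 0.1},
    "vi":   {"ii": 0.4, "IV": 0.4, "V": 0.2},
    "vii0": {"I": 0.7, "vi": 0.3},
}

def next_chord(current, grid_x, grid_width=8):
    """Sample the next chord, biasing toward tenser chords (V, vii0)
    as a tracked fingertip moves right on a hypothetical 2D grid."""
    weights = dict(TRANSITIONS[current])
    tension_bias = grid_x / max(grid_width - 1, 1)
    for chord in weights:
        if chord in ("V", "vii0"):
            weights[chord] *= 1.0 + tension_bias
    chords, probs = zip(*weights.items())
    total = sum(probs)
    return random.choices(chords, [p / total for p in probs])[0]

if __name__ == "__main__":
    chord, x = "I", 0
    for step in range(8):
        chord = next_chord(chord, grid_x=x)
        print(step, chord)
        x = (x + 1) % 8  # stand-in for a camera-tracked finger position

In such a sketch, the camera-based posture recognition would simply replace the stand-in grid coordinate with the detected fingertip cell, leaving the probabilistic sequencing untouched.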

Cite

Text

Bian et al. "MoMusic: A Motion-Driven Human-AI Collaborative Music Composition and Performing System." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I13.26907

Markdown

[Bian et al. "MoMusic: A Motion-Driven Human-AI Collaborative Music Composition and Performing System." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/bian2023aaai-momusic/) doi:10.1609/AAAI.V37I13.26907

BibTeX

@inproceedings{bian2023aaai-momusic,
  title     = {{MoMusic: A Motion-Driven Human-AI Collaborative Music Composition and Performing System}},
  author    = {Bian, Weizhen and Song, Yijin and Gu, Nianzhen and Chan, Tin Yan and Lo, Tsz To and Li, Tsun Sun and Wong, King Chak and Xue, Wei and Trillo, Roberto Alonso},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {16057--16062},
  doi       = {10.1609/AAAI.V37I13.26907},
  url       = {https://mlanthology.org/aaai/2023/bian2023aaai-momusic/}
}