DURF Project: The Sonification of Movements

Preview

The main idea of this project is to extract movement features from a video of the user's reactions while listening to a piece of music, and to adjust the music's chords and rhythm according to those features so that the user feels in control of both the body and the music. Technically, the project has two main parts: movement analysis and music generation. After extensive research, reading, and experimentation, and with the help of our professors, we achieved real-time analysis of movement emotion, as well as non-real-time analysis of movement rhythm and strength. To deliver an integrated project, we ultimately adopted the non-real-time results as our phased outcome. On the music side, we use the EC2VAE model to adjust the chords, rhythm, and intensity of the original song: the detected emotion drives the chord input, and the output of the visbeat3 model drives the rhythm input. Given these two parameters, the model outputs the new music as a series of lists of numbers, which we play back with Pygame. So far, by inputting a movement video together with the original music, we can generate a new piece of music whose chords, rhythm, and volume are adjusted in accordance with the user's movement.

[Figure: flow_chart.png – project flow chart]
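As a rough illustration of the playback stage described above, the following is a minimal sketch that assumes the music-generation step (EC2VAE in our project) has already produced a list of MIDI pitch numbers and per-note durations; it synthesizes simple sine tones and plays them with Pygame. The note values and helper names here are illustrative placeholders, not the actual project code.

```python
import numpy as np
import pygame
import pygame.sndarray

SAMPLE_RATE = 44100

def midi_to_freq(pitch):
    """Convert a MIDI pitch number to a frequency in Hz."""
    return 440.0 * 2 ** ((pitch - 69) / 12)

def render_notes(pitches, durations, volume=0.5):
    """Synthesize a sine tone for each (pitch, duration) pair and return a stereo int16 array."""
    chunks = []
    for pitch, dur in zip(pitches, durations):
        t = np.linspace(0, dur, int(SAMPLE_RATE * dur), endpoint=False)
        chunks.append(volume * np.sin(2 * np.pi * midi_to_freq(pitch) * t))
    audio = np.concatenate(chunks)
    samples = (audio * 32767).astype(np.int16)
    return np.column_stack([samples, samples])  # duplicate to two channels

if __name__ == "__main__":
    pygame.mixer.init(frequency=SAMPLE_RATE, size=-16, channels=2)
    # Placeholder output: in the project these values come from the generation
    # model, conditioned on the emotion (chord) and visbeat3 (rhythm) inputs.
    pitches = [60, 62, 64, 65, 67]
    durations = [0.5, 0.5, 0.5, 0.5, 1.0]
    sound = pygame.sndarray.make_sound(render_notes(pitches, durations))
    sound.play()
    pygame.time.wait(int(sum(durations) * 1000))
```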

The project currently consists of three parts: emotion from movement, rhythm from movement, and music generation from emotion and rhythm. (Each part has a corresponding annotation in the GitHub repository.)
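To make the hand-off between these parts concrete, here is a small sketch of how the two analysis outputs could be turned into conditioning inputs: an emotion label is mapped to a chord progression, and the visual-beat timestamps (as produced by visbeat3) are quantized onto a rhythm grid. The mapping table, function names, and grid resolution are assumptions for illustration, not the exact encoding used in our repository.

```python
from typing import List

# Hypothetical emotion-to-chord lookup; the real project chooses progressions differently.
EMOTION_TO_CHORDS = {
    "happy":   ["C", "G", "Am", "F"],
    "sad":     ["Am", "F", "C", "G"],
    "intense": ["Em", "C", "D", "Bm"],
}

def beats_to_rhythm_grid(beat_times: List[float], bar_length: float = 2.0,
                         steps_per_bar: int = 16) -> List[int]:
    """Quantize detected beat timestamps (in seconds) into a 0/1 onset grid."""
    total_bars = int(max(beat_times) // bar_length) + 1
    grid = [0] * (total_bars * steps_per_bar)
    for t in beat_times:
        bar = int(t // bar_length)
        step = int(round((t % bar_length) / bar_length * steps_per_bar))
        grid[bar * steps_per_bar + min(step, steps_per_bar - 1)] = 1
    return grid

# Example: an "intense" movement with beats roughly every half second.
chords = EMOTION_TO_CHORDS["intense"]
rhythm = beats_to_rhythm_grid([0.0, 0.5, 1.0, 1.5, 2.1, 2.6])
```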

Proposal

https://docs.google.com/document/d/1pZypFvfsMeO_A6QTaqJi5GEhRIU64SdpMPT4bYHvbB8/edit?usp=sharing

Github

https://github.com/Vivian-Xie/sonfying-movement

Report

https://docs.google.com/document/d/1UWMJvzkXlvoePs5MK1flbo-Kc2tsxMxp0oN3JaR5IA8/edit?usp=sharing
