Midterm Concept

Title: Music Accompaniment

Inspiration: My inspiration comes from projects like BachBot and AI Duet, which use machine learning to generate music. Instead of generating a continuation of a melody, I want to try generating the accompaniment for a given lead melody. This has big potential: in the future, people could use it for real-time accompaniment, with the machine generating chords based on what a person sings.

Work to do: My midterm project will realize the function of generating chords from a lead melody at a strict tempo. I will use a seq2seq model to process MIDI data. The MIDI files will be converted into text representations with the pretty_midi library.
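As a minimal sketch of the MIDI-to-text step, the code below quantizes a monophonic melody onto a fixed sixteenth-note grid and emits one token per step, so a seq2seq model can treat the music as a plain token sequence. The note data here is hand-written tuples of (pitch, start, end) in beats; in the actual project these fields would come from pretty_midi's note objects, and the exact token scheme (ON/HOLD/REST) is just one illustrative choice, not the project's final format.

```python
def melody_to_tokens(notes, steps_per_beat=4):
    """Quantize a monophonic melody into text tokens.

    notes: list of (midi_pitch, start_beat, end_beat) tuples.
    Returns one token per grid step: "ON_<pitch>" at a note attack,
    "HOLD_<pitch>" while it sustains, "REST" otherwise.
    """
    if not notes:
        return []
    total_steps = int(max(end for _, _, end in notes) * steps_per_beat)
    tokens = ["REST"] * total_steps
    for pitch, start, end in notes:
        s = int(start * steps_per_beat)
        e = int(end * steps_per_beat)
        tokens[s] = f"ON_{pitch}"        # note attack
        for i in range(s + 1, e):
            tokens[i] = f"HOLD_{pitch}"  # sustained note
    return tokens

# Example: C4 held for one beat, then E4 for half a beat.
print(melody_to_tokens([(60, 0.0, 1.0), (64, 1.0, 1.5)]))
# → ['ON_60', 'HOLD_60', 'HOLD_60', 'HOLD_60', 'ON_64', 'HOLD_64']
```

A chord accompaniment could be tokenized the same way (e.g. one chord symbol per step), giving the seq2seq model aligned source and target sequences.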

HW2: Try with Feature Extractor Image Classification Model

For the assignment, I intended to make a physically controlled Flappy Bird. Instead of using a keyboard to control the bird's movement, I want the users to define the commands themselves, whether facial expressions or different objects. So I chose the Feature Extractor Image Classification Model. The ml5js reference page says this model "will allow you to train a neural network to distinguish between two different set of custom images". This is exactly the capability needed to control the game with different expressions or objects.

Week 2: Case Study Research – The BachBot

BachBot

“Can you tell the difference between Bach and a computer?” BachBot is an AI created by Feynman Liang that composes music in the style of Bach. Its goal is to generate and harmonize chorales in a way that is indistinguishable from Bach’s own work, a task that requires both music theory and creativity. Here is the link to BachBot; you can take the BachBot challenge to see if you can tell Bach’s excerpts apart from computer-generated melodies.
