Code of Music A14: Melody Project – Alex Wang

Task:

Melody Project: Design and implement an interactive exploration of melody. Record a 20-second composition using it.

Final Result:

Inspiration and concept:

I built on top of the melody sequencer I made previously. My initial concept was a melody sequencer with the option of generating a random melody to inspire new ideas. However, without Markov chains or neural nets, the generated melodies did not sound very good. I spent a lot of time encoding common music theory knowledge into the generation algorithm to make it less random, such as weighting the next note toward pitches close to the previous note and toward comfortable intervals of the scale. The end result was okay, but I decided to shift the focus of this project from randomly generating melodies to recognizing them, using an ml5 model called CREPE.

Description of your main music and design decisions:

Most of the design decisions in this project carry over from the melody sequencer sketch from the previous assignment: a clean, simple look with a dark background and light blue blocks. Improvements over the melody sequencer include additional buttons and features, as well as minor design touches such as showing the half-time BPM in parentheses when the current BPM is under one hundred. This is purely a personal preference: I like to program in half time, and I think of a track as 140 BPM rather than 70 BPM, for example.

Overview of how it works:

The melody sequencer itself works like a regular melody sequencer you would find in a DAW. I created a two-dimensional array with one dimension representing time and the other note pitch: 12 × 32, or 12 available notes across a 32-step sequence.
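That grid can be sketched as follows (the helper names are my own, not the sketch's actual code):

```javascript
// 12 pitches across a 32-step sequence; every cell starts off.
const NUM_PITCHES = 12;
const NUM_STEPS = 32;
const grid = Array.from({ length: NUM_PITCHES }, () => new Array(NUM_STEPS).fill(false));

// Toggle a cell when the user clicks it.
function toggleCell(pitchIndex, stepIndex) {
  grid[pitchIndex][stepIndex] = !grid[pitchIndex][stepIndex];
}

// On each step of the transport, collect the pitches whose cell is on.
function activePitches(stepIndex) {
  const active = [];
  for (let p = 0; p < NUM_PITCHES; p++) {
    if (grid[p][stepIndex]) active.push(p);
  }
  return active;
}
```

On each beat, the sketch would look up `activePitches(currentStep)` and trigger the corresponding notes.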

For pitch recognition I used an ml5 model called CREPE, which returns a frequency from the mic input. I originally planned to use this model to draw directly onto the melody sequencer, like an audio-to-MIDI conversion program, but cleaning up the raw data into a working interface required much more work than I expected. I ended up applying several "filters" to keep the program working stably.
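CREPE reports a raw frequency in Hz, so mapping a reading onto one of the sequencer's 12 rows needs a frequency-to-note conversion somewhere. A minimal sketch, assuming equal temperament anchored at A4 = 440 Hz (the function name is illustrative, not from the actual sketch):

```javascript
// Convert a detected frequency (Hz) to the nearest semitone offset from A4.
// 0 = A4, 12 = A5, -9 = C4, etc.
function freqToSemitone(freq) {
  return Math.round(12 * Math.log2(freq / 440));
}
```

The resulting semitone offset can then be clamped or folded into the sequencer's 12 available rows.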

Avoiding jumps in frequency

The frequency values picked up by the mic can be unstable even when one sound dominates the input: a single stray tone can insert a different frequency into an otherwise steady sequence.

For example, the pitch recognition values when an A is played on the piano might look like:

440, 440, 439, 440, 700, 440, 440

A very different number can appear in the sequence, either because another sound was picked up or because a harmonic of the played note was detected.

I used a counter variable that checks whether the current frequency matches the previous one and increments the counter when they match. A note is only registered onto the sequencer once the counter reaches a certain value.
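That counter filter can be sketched as a small closure (the names and threshold value are illustrative):

```javascript
// Only register a note after it has been detected `threshold` frames in a row,
// filtering out one-off readings like the stray 700 Hz value above.
function makeNoteFilter(threshold) {
  let lastNote = null;
  let count = 0;
  return function (note) {
    if (note === lastNote) {
      count++;
    } else {
      lastNote = note; // new candidate note: restart the count
      count = 1;
    }
    // Register exactly once, the moment the counter hits the threshold.
    return count === threshold ? note : null;
  };
}
```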

Discarding frequency values outside the 12 notes available in my interface also improves performance.

Avoiding fluctuations of pitch

I also made sure the pitch matching has some flexibility, so an instrument can be recorded accurately even if it is slightly out of tune.

I achieved this by taking the remainder of the detected frequency divided by the expected frequency:

(detected frequency 443) % (expected frequency 440) = 3

As long as the deviation is within 10 Hz, I accept the input as the same note.
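A sketch of that remainder check (the function name and tolerance parameter are my own). One property of the modulus approach worth noting: exact multiples of the expected frequency leave a remainder of 0, so octave harmonics pass the check as well.

```javascript
// Accept a detected frequency as a given note when it lies within `tolerance`
// Hz of the expected frequency, using the remainder of detected % expected.
// A sharp note leaves a small remainder; a flat note leaves a remainder just
// under the expected frequency, so both sides are checked.
function matchesNote(detected, expected, tolerance = 10) {
  const r = detected % expected;
  return r <= tolerance || expected - r <= tolerance;
}
```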

Setting input level:

Another improvement was setting a sound amplitude threshold: frequencies are only considered when the input is louder than the threshold. Recording starts only when an instrument is actually being played, so softer sounds reaching the mic do not affect the result. I also created a visual for this, showing the threshold and the current loudness with a fill color that goes from blue to green to red.
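The gating and the meter color can be sketched like this (the cutoff values and helper names are illustrative, not the sketch's actual numbers):

```javascript
// Map the current input level (0..1) to a meter color: below the threshold
// stays blue, above it turns green, and near-clipping levels turn red.
function meterColor(level, threshold) {
  if (level <= threshold) return 'blue';
  if (level < 0.8) return 'green';
  return 'red';
}

// Only pass a detected frequency through when the input is loud enough.
function gatedFrequency(freq, level, threshold) {
  return level > threshold ? freq : null;
}
```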

Challenges + future work:

This is probably one of my most unpolished projects so far, only because of how hard it is to accurately track and record sounds and interpret them as MIDI. More enhancements are needed before the program is functional enough for actual use rather than a tech demo. After that, I would also like to explore combining this with more machine learning interaction. I envision a program similar to the duet shown in class, but taking humming via pitch detection instead of keyboard input.

Code of Music A13: Melody Sequencer – Alex Wang

Task:

Create a Melody Sequencer. You can start from our drum machine, replace its players with a Sampler, and set each track to play a different pitch.

Final Result:

https://editor.p5js.org/alexwang/present/MrhvnpUOj

Process:

I started with my drum sequencer code, which was unfortunately written with a separate loop for each instrument: kick, snare, and hats. I rewrote the code so that it now supports any number of instruments, and replaced the drum kit with a sampler.
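The rewrite can be sketched as one loop over an array of tracks, in place of the separate per-instrument loops (the track names and trigger callback are illustrative):

```javascript
// Fire every track whose pattern has a hit at the given step.
// Patterns wrap around, so tracks of different lengths can coexist.
function playStep(tracks, step, trigger) {
  const fired = [];
  for (const track of tracks) {
    if (track.pattern[step % track.pattern.length]) {
      trigger(track.name); // e.g. play the track's sample or pitch
      fired.push(track.name);
    }
  }
  return fired;
}
```

Adding an instrument is now just adding another entry to the `tracks` array, rather than writing a new loop.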

Design:

I used a simple but effective color theme of light blue and black. Other features include a tempo slider and a number grid for subdivision. I originally chose 12 available notes on the canvas because there are 12 notes in an octave, but I later replaced the chromatic scale with a major scale, since it is easier to make pleasing melodies on a major scale. I did keep the 12 × 16 grid size, though, simply because it fits the screen well.

Code of Music A12: Melodic Interfaces – Alex Wang

Task:

Sketch three different interfaces to interact with melody. You may draw inspiration from your interaction with existing interfaces, from the melody elements we covered in class, and from your experience drawing and listening to the piece you found engaging melodically.

1. Piano Roll/Sheet Music

My first thought on creating a music interface for melody was to go with a piano roll or sheet music style of interface, simply because this is the most common and effective format for melody creation and is widely used by instrumental musicians and producers.

2. Instrument Based

Another possible design would be an instrument-based interface, giving users with instrument experience an easier time, while also letting inexperienced users transfer what they learn to the actual instrument afterward.

3. Abstract

The third design is less effective than the first two: a more abstract way of representing melody, like my rhythm project, generating simulated objects to represent sounds. Even though this might not be a professional way of creating melodies, it could be interesting if the goal of the interface is not serious music making.

Code of Music A11: Melodic Song – Alex Wang

Task:

    1. Listen: Find a piece of music you find melodically engaging.
    2. Design: Draw the melody in your piece, using your own visual language

Song:

Winter Wind by Axitee is a remake of Chopin's Étude Op. 25, No. 11. Axitee made an innovative twist, turning the piece into a track with an oriental feel while keeping Chopin's original melody. I find this song suitable for the topic of melody because it uses an old melody in a new way, changing the feeling of the melody completely.

Visual Representation:

I decided to draw out the melody using the piano roll of a DAW. This visual representation shows the pitch and length relationships between the notes in the melody of Winter Wind, while the two highlighted lines show the intervals that are avoided in a Japanese pentatonic scale. Even though Chopin was probably not intentionally writing a Japanese-style melody, the melody works when matched with the instrumentation of Japanese music. Axitee completely transformed the track by introducing Japanese sounds and 808 drums.

Code of Music A10: Rhythm Project – Alex Wang

Task:

Design and implement an interactive exploration of rhythm. 

Final Result:

Web Editor Link: https://editor.p5js.org/alexwang/present/pV11W3WM

Video:

Inspiration and concept:

When I first brainstormed for the project, I wanted to create some variation on a regular drum sequencer, but I could not think of a design I had not seen before. I am also in a game development class working with object-oriented programming, so I took the drum sequencer and applied it to floating objects on the canvas instead of static blocks. Just like in a drum sequencer, objects can be triggered to play a note at a given time, but now the notes are scrambled and travel across the screen, adding more appealing visuals and randomness to the pattern.

Description of your main music and design decisions:

I produce electronic music as a hobby, so I used one of my own songs for this project. This gave me access to all the independent stem files, letting me isolate sounds and play them in a different order than the original mix. It also enabled user interaction: the song only progresses if the user interacts with it. If the player does not interact, the song loops the current patterns without building intensity. Other design decisions include the use of color and size in the objects. I made the graphics more appealing by stacking layers of translucent ellipses and by adjusting the visibility of objects when they are close to triggered objects or the user's mouse. Each percussion instrument gets its own color, and notes get different sizes depending on their step (quarter and eighth notes are bigger). This guides the user to click less on sixteenth notes, which tend to produce rhythms that sound "off" without accompanying notes on the downbeat.

Overview of how it works:

The project starts by displaying an intro animation of titles and quotes. After the user presses the space bar, the music grows in intensity and drum beats are introduced. The user clicks objects on the screen to create a unique drum beat that fits the music; after a few cycles, the song ends by stopping the drum beat and slowly fading both visuals and music away.

Challenges:

I think the greatest challenge in this project was adding interaction to flowing music, so the music progresses only when the user interacts. I got around this by using a four-on-the-floor dance track that builds by adding more looping elements. With a song from another genre, the structure would not allow me to simply add or remove loops; most of the elements in the music would have to change.

Future improvements:

I would like to improve the visuals with 3D graphics instead of emulating glow effects with translucent ellipses. Another possible improvement: instead of having triggered objects change the visibility of other objects, the generated ripples could light up the objects they pass through.