Code of Music Final Project – Thomas Tai

Demo: http://thomastai.com/space/

Introduction

For our final project, we were asked to create a generative or interactive musical piece that is web based, physical, or exists in space. Since we do not have access to any physical equipment, I decided to make a generative music piece that anyone can access on the internet. The inspiration for this interface came from Brian Eno's Music for Airports, a generative piece released in 1978.

What I found amazing was that this piece predates digital audio workstations: it was made by layering tape loops of varying lengths to produce an infinitely generating composition. I took some of the notes from his piece that I found online and used loops to create my own interface. Using the NASA API, I was able to pull data about nearby asteroids and use it to generate a unique loop for any given date.

Process

I wanted to work with the concept of space for this project, so I found some neat CSS transformations that let me make the stars twinkle and simulate the Earth's rotation without a 3D graphics library. The rotation is done by scrolling a flat map of the Earth behind a circular div. The code for that can be found here. I may try to explore a different route in the future to create more realistic simulations. Moving a background filled with circles produces the twinkling effect.
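Here is a minimal sketch of the rotation idea. My version is pure CSS (linked above); this JavaScript-driven equivalent scrolls the map texture, and the element id, size, and speed are made up for illustration:

```javascript
// Assumes <div id="earth"></div> with a horizontally repeating
// equirectangular map of the Earth as its background image.
const earth = document.getElementById("earth");
earth.style.borderRadius = "50%";          // clip the flat map to a circle
earth.style.backgroundRepeat = "repeat-x"; // let the map wrap around

let offset = 0;
function rotate() {
  offset = (offset + 0.5) % earth.clientWidth; // arbitrary scroll speed
  earth.style.backgroundPositionX = `-${offset}px`;
  requestAnimationFrame(rotate);
}
requestAnimationFrame(rotate);
```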

Next, I implemented the REST API calls using a promise, which is essentially a callback that resolves with data once it is available. The server returns a JSON document listing the near-earth objects for the requested date range.
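Here is a hedged sketch of that request, assuming NASA's NeoWs feed endpoint and its public DEMO_KEY; treat the exact fields read from the response as illustrative:

```javascript
// Fetch the near-earth objects for a single date from NASA's NeoWs API.
function fetchAsteroids(date) {
  const url = "https://api.nasa.gov/neo/rest/v1/feed" +
              `?start_date=${date}&end_date=${date}&api_key=DEMO_KEY`;
  return fetch(url)
    .then((response) => response.json())
    .then((data) => data.near_earth_objects[date]); // array of asteroids
}

fetchAsteroids("2020-05-01").then((asteroids) => {
  asteroids.forEach((a) => console.log(a.name));
});
```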

I then iterate through the asteroids and create a new Tone.js loop for each one, which fires a callback every few measures depending on the current tempo. The user can change this tempo using the forward and backward buttons in the browser. A Tone.js player plays a sound file each time the loop comes around. In addition, a random note is played using a PolySynth that I created. Another global loop plays sound files from NASA consisting of radio signals caused by plasma waves in the solar system; this one fires at random intervals, with a 20% probability every 10 measures.
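Here is a minimal sketch of those loops, assuming Tone.js v14; the sample URLs, note pool, and intervals are placeholders:

```javascript
// One loop per asteroid: trigger a sample plus a random synth note.
const player = new Tone.Player("sounds/asteroid.mp3").toDestination();
const synth = new Tone.PolySynth(Tone.Synth).toDestination();
const notes = ["A4", "C5", "D5", "E5", "G5"];

new Tone.Loop((time) => {
  player.start(time); // assumes the buffer has finished loading
  const note = notes[Math.floor(Math.random() * notes.length)];
  synth.triggerAttackRelease(note, "8n", time);
}, "2m").start(0); // fires every two measures

// Global loop: a plasma-wave recording, 20% chance every 10 measures.
const plasma = new Tone.Player("sounds/plasma.mp3").toDestination();
const plasmaLoop = new Tone.Loop((time) => plasma.start(time), "10m");
plasmaLoop.probability = 0.2;
plasmaLoop.start(0);

Tone.Transport.bpm.value = 90; // the forward/backward buttons change this
Tone.Transport.start();        // must follow a user gesture (Tone.start())
```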

Future Work

If I had more time, I would love to create a more interactive interface, as the only thing you can do right now is watch asteroids come by and change the BPM. There are several issues I would like to figure out in the future, including some graphical glitches. Overall, I had a great time designing this interface and learning more about simple generative music techniques. Thanks for a great semester!

Code of Music Week 13 Harmony Project – Thomas Tai

Project Link: http://thomastai.com/harmony/

Introduction: 

For this week's assignment, we were asked to design and implement an exploration of harmony using code, either as an interactive piece or as a fixed composition. I decided to continue developing and improving the melody project that I created a couple of weeks ago. Based on user feedback from the class, I noted some issues people had: one person said the interface was confusing because the chords would change even when the same button was pressed, and another pointed out that the notes shown on the screen did not reflect the actual notes being played.

The overall interface is the same, with some improvements for clarity and usability. Instead of using a machine learning library to produce chords, the user can now choose minor, major, diminished, and augmented chords from the menu. In addition, the user can choose from an octave of notes and multiple stringed instruments. The notes on screen now reflect the actual pitches being played, and the color of each note matches the color of the chord buttons below. I believe this new interface is more intuitive and more interactive.

Process:

I started with the interface I already had from the previous week and made some changes to the code. Most notably, I removed the networking and machine learning capability, which may be added back if there is a good reason to. From my previous presentation, I found that people playing notes in a group setting just created dissonance; while that is a fun effect, it does not suit the interface. I also made responsive design changes so that the width of the interface scales with the window size. Last time, I refreshed the page whenever the browser was resized, which was definitely not the right way to build a responsive website. Resizing the height is more complicated and needs an overhaul that I was not able to complete in time.
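Here is a minimal sketch of the resize handling, assuming the interface is drawn on a p5.js canvas; the width-driven aspect ratio is illustrative:

```javascript
// p5.js calls windowResized() automatically when the browser resizes,
// so the canvas can rescale in place instead of reloading the page.
function setup() {
  createCanvas(windowWidth, windowWidth * 0.5);
}

function windowResized() {
  resizeCanvas(windowWidth, windowWidth * 0.5); // no page refresh needed
}
```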

I added buttons for each of the 12 chords in the scale, with text next to each button. I would consider changing the style of the buttons in the future with more testing. The animation of the notes was changed in the CSS to be more visually appealing and smooth. Rather than selecting a random height, a calculated height from the bottom of the container is used. The chord qualities are stored as an array of semitone intervals that the user can select from: [[12,7,4,0],[12,7,3,0],[12,8,4,0],[12,6,3,0]]. In addition, I replaced all the note files with better sounding ones and added vibrato as a digital effect in Tone.js. There is a noticeable difference in quality.
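Here is a sketch of how those interval arrays can drive playback, assuming Tone.js v14; the root note, chord names, and vibrato settings are illustrative:

```javascript
// Chord qualities as semitone offsets above the root, octave on top:
const CHORDS = {
  major:      [12, 7, 4, 0],
  minor:      [12, 7, 3, 0],
  augmented:  [12, 8, 4, 0],
  diminished: [12, 6, 3, 0],
};

const vibrato = new Tone.Vibrato(5, 0.2).toDestination();
const synth = new Tone.PolySynth(Tone.Synth).connect(vibrato);

function playChord(root, quality) {
  const notes = CHORDS[quality].map((semitones) =>
    Tone.Frequency(root).transpose(semitones).toNote()
  );
  synth.triggerAttackRelease(notes, "2n");
}

playChord("C4", "minor"); // plays C4, D#4, G4, C5
```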

Future Development:

I would love to continue working on this project in the future to make it easier to use and to add more features to the interface. There are still some bugs that need to be fixed, and responsive design problems that cause issues on mobile devices and other operating systems. After presenting my project to the class, I hope to gain more feedback on what needs to change. I think the interface is aesthetically pleasing and intuitive for new users, but the project needs more testing and validation to improve.

Code of Music Week 12: Timbre Project – Thomas Tai

For this week’s assignment, we were asked to design two different timbres, applying synthesis and sampling techniques of our choice. 

Violin Synth: https://editor.p5js.org/thomastai/sketches/lLCzd4Xog

Poly Synth: https://editor.p5js.org/thomastai/sketches/eZJjwTD4g

I took samples from the London Philharmonic Orchestra and used them to create a sampled instrument with a vibrato effect applied. Using the code from class, I used a waveform analyzer to view the sound wave in the editor. It was interesting to see how the effects changed the timbre of the note being played.
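Here is a minimal sketch of that setup, assuming Tone.js v14; the sample filename and vibrato settings are placeholders:

```javascript
// Map one orchestral sample to its pitch; Tone.Sampler repitches it
// for other notes. The vibrato modulates pitch on the way out.
const vibrato = new Tone.Vibrato(6, 0.3).toDestination();
const violin = new Tone.Sampler({
  urls: { A4: "violin_a4.mp3" },
  baseUrl: "samples/",
  onload: () => violin.triggerAttackRelease("A4", "1n"),
}).connect(vibrato);

// Tap the signal with an analyzer so the wave can be drawn each frame.
const waveform = new Tone.Waveform(1024);
violin.connect(waveform);
// waveform.getValue() returns a Float32Array of the current window.
```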

For my second synthesizer, I used the PolySynth from Tone.js, which uses a specified synthesizer to create a variety of voices. I then applied a bit crusher effect, which produces distortion and gives it an 8-bit feel. I also applied a chorus effect that modulates between a left and right delay. Next, I applied an autowah effect and connected all the effects to the master output. Each note is accompanied by a circle that fades away over time. If I had more time, I would refine the effects and create a better visual representation of the synthesizer.
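Here is a sketch of that effect chain, assuming Tone.js v14; the effect parameters are illustrative rather than the ones in my sketch:

```javascript
// PolySynth routed through bit crusher -> chorus -> autowah,
// then into the master output.
const synth = new Tone.PolySynth(Tone.Synth);
const crusher = new Tone.BitCrusher(4);              // 4 bits: 8-bit feel
const chorus = new Tone.Chorus(4, 2.5, 0.5).start(); // L/R modulated delay
const wah = new Tone.AutoWah(50, 6, -30);            // envelope-driven filter

synth.chain(crusher, chorus, wah, Tone.Destination);
synth.triggerAttackRelease(["C4", "E4", "G4"], "4n");
```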

Code of Music Week 11 – Thomas Tai

Designing Sounds 

Beyond differences in intensity and pitch, the instruments differ in timbre. The flute had more harmonic components, which appeared on the spectrogram as multiple bumps above the fundamental frequency. The whistle, however, was concentrated at a single frequency, which I found interesting. The trombone had energy at frequencies spread out across the entire spectrogram.

The flute, in contrast, had its harmonics centered on the fundamental frequency. The attack and sustain of the notes also differed because of the instruments' designs. The drums had a quick attack and release, with few harmonics attached. The spectrogram showed their energy centered around lower frequencies, but more spread out than the flute or trombone.

Humming was similar to the flute, whereas clapping produced waves similar to the drum machine. As I sang higher notes, I could see the spectrogram show higher frequencies. It was interesting to see that our voices are not perfect sine waves but have harmonic components like any other instrument.

Piece of music that is interesting in terms of timbre

I found this piece interesting because it used cups of water to create notes. The original piece uses an instrument called a celesta, whose hammers strike metal plates to produce a soft, bell-like sound. If you didn't see the video, you probably wouldn't be able to guess what produced the sound. We can create almost identical sounds with materials around us, or generate them with our computers.

Code of Music Week 10: Melody Project – Thomas Tai

Project Link (Temporary): http://thomastai.com:3001/
Project Code: https://github.com/thomastai1666/Open-Orchestra

Video Demo

Introduction

For this week's project, we were asked to design and implement an interactive exploration of melody. I was inspired by the projects we saw in class and others I have seen on the internet, most notably the Strings experiment from Chrome Music Lab and the web performance of Terry Riley's In C. Since we are all separated right now, I thought it would be a great time to build a networked application for making music together.

Chrome Music Lab: Strings

Design

When I think of music, it is often represented on a score like the one shown below: the higher a note sits on the staff, the higher its frequency, and vice versa. This was my first choice for the interface.

Source: https://blog.flat.io/music-theory-ryhthm-measure/

I made mockups of my interface in Sketch, which later evolved as I adopted predefined Bootstrap styles and added buttons and features while building the application. I drew inspiration from stringed instruments and decided on an orchestra (cello, bass, violin, viola) as my instruments.

Code

Due to time constraints, many of the components I used came from other projects; most of my work was bringing those components together into the interface I wanted. I defined an Instrument object that ties together the Magenta player and the Tone.js player. For networking, I used the Socket.IO library, which builds on the WebSocket API introduced a few years ago. I send the note value and instrument name of each note being played to all the other users online. Like some of the code, it is somewhat hacked together, but it works just fine. When a player plucks a string, the data is sent to the Piano Genie model, which uses an LSTM (long short-term memory) network to predict a suitable note to play.
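Here is a minimal sketch of the note broadcasting, assuming a Node server with Socket.IO; the event name, payload shape, and playNote() helper are made up for illustration:

```javascript
// --- server.js: relay each note to every other connected user ---
const io = require("socket.io")(3001);
io.on("connection", (socket) => {
  socket.on("note", (data) => {
    socket.broadcast.emit("note", data); // everyone except the sender
  });
});

// --- client: send local plucks, play remote ones ---
const socket = io("http://localhost:3001");

function onPluck(note, instrument) {
  playNote(note, instrument);                // hypothetical local playback
  socket.emit("note", { note, instrument }); // share with everyone else
}

socket.on("note", ({ note, instrument }) => {
  playNote(note, instrument); // a remote player's note
});
```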

Future Work

There are some minor changes I would like to make if given the time. When too many notes are played at once, the chords tend toward dissonance. This is a limitation of the Piano Genie model used in the project: it wasn't intended for string instruments, so it certainly isn't optimized for this purpose. In fact, there are probably better ways to choose notes that don't use a machine learning model; I found that a simple scale made more pleasing sounds.

Sources and Attribution

Chrome Experiments Strings: https://musiclab.chromeexperiments.com/Strings/

String Code: https://experiments.withgoogle.com/jam-with-chrome

Clef Images: https://en.wikipedia.org/wiki/Clef

Magenta Library and Demo Code: https://magenta.tensorflow.org/pianogenie

Socket.IO: https://socket.io/