Musical Gear Table [PCOMP ICM Final]

Musical Gear Table is my PCOMP / ICM final for the fall 2017 semester at ITP, Tisch School of the Arts. It is a musical / visual experience which draws inspiration from physical, mechanical music boxes and gives them a digital interpretation.

The user assigns different musical patterns to six different instruments through digital inputs and rotates the physical gear to produce sound. The digital layer is projected onto the physical gears and gives the musical patterns a unique visualization.

 

Creative Process:

I have always had an attraction to music and its structure. Drums always come first – the hi-hat that counts the tempo, the kick drum that lays the beat down and the snare that snaps and brings the sharp staccato to the beat. Of course, after that come the bassline, the melody and the harmony – synths, keys, and everything else.

There are a lot of ways to bring music to life; the ingredients need balance, but the first thing they need is the tempo.

The initial plan was to build a table filled with gears of different sizes. The user would be prompted to build his own composition with the gears, similar to a puzzle. When done, he could turn the main gear – thus moving all the gears in relation to that main gear. Each gear holds a certain element of the composition – a drum, cymbal, bassline, keys, synths and more – and the user can choose whether he wants to add it to the recipe or not (editing a certain gear's notes is an option too).

Each gear's rotation speed will be measured by a rotary sensor that will play the gear's part accordingly – all parts will be synced to the same master gear.

Another element of this project is the visual projection. A visualization of its instrument's pattern will be projected onto each gear. The user could change samples and patterns through an interface incorporated into the project.

Along the way, the thought process led to incorporating gears of different sizes that would produce polyrhythms in the track. This idea was eventually left behind.

The initial technical specs included a Kinect controller (which was later dropped from the plan), Max/MSP, an Arduino and a limited set of buttons that were to control the array of gears; that plan changed later in the process.

 

Technical / Production Process:

Starting with a physical exploration of laser-cut acrylic sheets, I built a small mockup of the mechanism. This small prototype had a rotary encoder in it that picked up the current position of the gears and transmitted it via the Arduino's serial connection to Processing.

First sketch of mechanical gear, cut on laser cutter

Processing was the main programming language and the place where all the visual and musical instrumentation data was stored. For producing and sequencing the sound I used Max/MSP. In Max, I built a main sequencer patch that received the rotary encoder data from the Arduino via OSC signals sent from Processing. It translated that data into a 32-step sequencer that looked like this:

The instruments used were different drum samples and two synthesizers built in Max/MSP.
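The sequencer patch itself isn't reproduced here, but the core mapping described above – wrapping the encoder position into one revolution and quantizing it to 32 steps – is simple. The following is only an illustrative JavaScript sketch of that idea (the original logic lived in Processing and Max/MSP, and the encoder resolution is an assumed value):

```javascript
// Illustrative only: maps a raw rotary encoder count to a 0-31 step index.
// The original project did this in Processing and forwarded the result to
// Max/MSP over OSC; the encoder resolution below is an assumed value.

const TICKS_PER_REVOLUTION = 96; // assumption: encoder counts per full gear turn
const STEPS = 32;                // the sequencer runs 32 steps per revolution

// Wrap a raw count (which may be negative or exceed one turn) into one
// revolution, then quantize it to a step index between 0 and 31.
function encoderToStep(tickCount) {
  const wrapped = ((tickCount % TICKS_PER_REVOLUTION) + TICKS_PER_REVOLUTION)
                  % TICKS_PER_REVOLUTION;
  return Math.floor((wrapped / TICKS_PER_REVOLUTION) * STEPS);
}

// As the gear turns, each new step would trigger the notes stored for that
// column of the 32-step sequencer.
let lastStep = -1;
function onEncoderUpdate(tickCount) {
  const step = encoderToStep(tickCount);
  if (step !== lastStep) {
    lastStep = step;
    console.log(`advance sequencer to step ${step}`); // stand-in for the OSC send
  }
}
```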

After getting the audio part to work, I started to build a working demo that incorporated a projection of the instrument's pattern on the gear and the ability to edit an instrument's pattern. I cut a thick piece of white acrylic sheet and mounted two gears on a plank of wood.

The next step was to scale up and produce a large prototype of all the gears. In this process I designed the final version of the table in Illustrator and laser cut it from birch wood. I also added an Adafruit Trellis controller mounted on the bottom part of the table; this controller is programmed to scroll between gears, add/remove notes and assign patterns from 12 banked presets. In addition, I incorporated a soft potentiometer to handle pitch shifting of the notes.
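The Trellis and Processing code isn't shown in the post, but the pattern state the controller edits can be pictured as a small data structure. The sketch below is purely hypothetical – the six instruments, 32 steps and 12 preset banks come from the write-up, while every name and the semitone unit for the pitch offset are assumptions:

```javascript
// Hypothetical data model for the pattern state the controller edits.
// The real project kept this logic in Processing; everything here except the
// 6 instruments, 32 steps and 12 preset banks mentioned above is assumed.

const NUM_INSTRUMENTS = 6;
const NUM_STEPS = 32;
const NUM_PRESETS = 12;

// One pattern is a 6 x 32 grid of on/off notes.
function emptyPattern() {
  return Array.from({ length: NUM_INSTRUMENTS }, () =>
    new Array(NUM_STEPS).fill(false)
  );
}

const state = {
  pattern: emptyPattern(),          // the pattern currently being played
  selectedGear: 0,                  // Trellis buttons scroll between gears
  pitchOffset: 0,                   // soft-pot value (semitones, assumed)
  presets: Array.from({ length: NUM_PRESETS }, emptyPattern), // banked presets
};

// Toggle a note for the currently selected instrument.
function toggleNote(step) {
  state.pattern[state.selectedGear][step] =
    !state.pattern[state.selectedGear][step];
}

// Load one of the banked presets into the live pattern.
function loadPreset(bank) {
  state.pattern = state.presets[bank].map(row => row.slice());
}
```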

 

Adding the final gears to the table:

The design of the visual representation of each instrument was made in consideration of the instrument's tone and the shape of its sound; for instance, the bass drum would be round, the snare would be sharp, etc. The synthesizers would have the ability to change their tone and pitch, so layers of shapes were stacked and patterned on the gear to display the selected note.

————————————————————-

Musical Gear Table was presented at the 2017 ITP Winter Show. Below you can find a couple of demo videos from users at the show.

Gear table – final ICM / PCOMP assignment

I have always had an attraction to music and its structure. 

Drums always come first – the hi-hat that counts the tempo, the kick drum that lays the beat down and the snare that snaps and brings the sharp staccato to the beat. Of course, after that come the bassline, the guitar and then the harmony – synths, keys, and anything else.

There are a lot of ways to bring music to life; the ingredients need balance, but the first thing they need is the tempo.

I have decided to make this the concept for my final project: an interactive audio/visual music box that is generated through physical tempo.

I plan to build a table filled with gears of different sizes. The user will be prompted to build his own composition with the gears, similar to a puzzle. When done, he can turn the main gear – thus moving all the gears in relation to that main gear. Each gear holds a certain element of the composition – a drum, cymbal, bassline, keys, synths and more – and the user can choose whether he wants to add it to the recipe or not (editing a certain gear's notes is an option too).

Each gear's rotation speed will be measured by a rotary sensor that will play the gear's part accordingly – all parts will be synced to the same master gear.

Another element of this project is the visual projection. A visualization for each cog that the user adds will be projected from above using projection mapping. The user could change samples and patterns through an interface incorporated into the project. This will be coded in Processing.

The sound will be synthesized in Max/MSP and controlled from Processing through OSC signals.

 

Clear table before inserting gears.

Gears and patterns projected after inserting gears.

System diagram.

Components include:

  • Computer running Processing + Max/MSP
  • Arduino
  • Projector
  • Kinect camera
  • Pegboard, 35″ x 20″
  • 8 x 7″ acrylic gears with IR stickers
  • Rotary encoder
  • 2 x big push buttons
  • 1 x rotary slider
  • Long HDMI cable
  • 2 x long USB cables

Variations on circular 10 print

For this week's assignment for Intro to Computational Media, we were to create an algorithmic design with simple parameters, using a rollover, button, or slider from scratch.

I took inspiration from mandalas and the 10 PRINT CHR$(205.5+RND(1)) book by Nick Montfort, Patsy Baudoin, John Bell, Ian Bogost, Jeremy Douglass, Mark C. Marino, Michael Mateas, Casey Reas, Mark Sample, and Noah Vawter.

Interact here.
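For reference, a minimal p5.js sketch in the same spirit – a circular take on 10 PRINT where a slider controls the ring spacing – could look like the following. This is an illustration of the idea, not the posted sketch:

```javascript
// A circular take on 10 PRINT: short diagonal segments are placed on
// concentric rings, each randomly flipped, and a slider controls the ring
// spacing. This is an illustration of the idea, not the posted sketch.

let slider;
let seed = 0;

function setup() {
  createCanvas(400, 400);
  slider = createSlider(10, 40, 20, 1); // ring spacing in pixels
  strokeWeight(2);
}

function draw() {
  randomSeed(seed); // keep the pattern stable while the slider moves
  background(255);
  translate(width / 2, height / 2);
  const step = slider.value();
  for (let r = step; r < width / 2; r += step) {
    const count = floor((TWO_PI * r) / step); // segments that fit on this ring
    for (let i = 0; i < count; i++) {
      push();
      rotate((TWO_PI / count) * i);
      translate(r, 0);
      // the classic 10 PRINT coin flip: one of two diagonals per cell
      if (random(1) < 0.5) {
        line(-step / 2, -step / 2, step / 2, step / 2);
      } else {
        line(-step / 2, step / 2, step / 2, -step / 2);
      }
      pop();
    }
  }
}

function mousePressed() {
  seed++; // click to re-roll the pattern
}
```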

 

p5.js Versions on halftone camera

In this sketch, I continued using the ‘halftone effect’ and loaded a live video feed into the sketch using p5.js's video capture. The video's pixels are processed and converted into different shapes according to their luminance channel. Another feature of the sketch is an echo effect that increases over time – that effect can be reset by clicking the screen.

In this sketch, I used the random() function to color certain elements of the ‘halftone effect’, and the video's pixels are colored by mouse movement on the screen. Another feature of the sketch is an echo effect that increases over time – that effect can be reset by clicking the screen.

In this sketch, I used the random() function to color certain elements of the ‘halftone effect’, and the video's pixels are colored by a counter cycle that determines each shape's color. Another feature of the sketch is an echo effect that increases over time – that effect can be reset by clicking the screen.
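As a rough approximation of the first variation described above – live capture, luminance mapped to dot size, and a fading echo that a click clears – a minimal p5.js version could look like this (the grid size and colors are assumptions, not values from the actual sketches):

```javascript
// Rough approximation of the halftone-video idea: the webcam feed is sampled
// on a grid, each sample's luminance sets the size of a dot, and a translucent
// background leaves an "echo" trail that a mouse click clears.

let capture;
const GRID = 10; // sample every 10th pixel (assumed value)

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(640, 480);
  capture.hide();
  noStroke();
}

function draw() {
  background(0, 10); // low alpha, so previous frames fade slowly: the "echo"
  capture.loadPixels();
  if (capture.pixels.length === 0) return; // camera not ready yet
  for (let y = 0; y < capture.height; y += GRID) {
    for (let x = 0; x < capture.width; x += GRID) {
      const i = 4 * (y * capture.width + x);
      const r = capture.pixels[i];
      const g = capture.pixels[i + 1];
      const b = capture.pixels[i + 2];
      const lum = (r + g + b) / 3;          // simple luminance estimate
      const d = map(lum, 0, 255, 1, GRID);  // brighter sample -> bigger dot
      fill(255);
      ellipse(x, y, d, d);
    }
  }
}

function mousePressed() {
  background(0); // clicking clears the accumulated echo
}
```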

P5.JS first sketch – self portrait

For the first assignment I decided to create a self portrait. 

I thought a lot about what would be interesting for me to work on, and after a few ideas were tossed around in my notebook I decided I wanted to abstract the self portrait: to try to break down a high-resolution image and work with its 2D matrix.

A 2D image is based on pixels distributed in a matrix of X and Y. Every pixel has an X value and a Y value for its position, and also holds its color attribute; for example, if we look at the smiley face below, the pixel at X = 0, Y = 0 has the color (255, 255, 255, 1).

Initial mock

Every pixel has its own luminance attribute that ranges between 0 (absolute black) and 255 (absolute white).

Pixel luminance – graphic representation

In the above drawing there are 6 squares that illustrate 6 different representations of luminance (the brightness channel of each pixel).

I wanted to create an abstraction of those 6 squares to simply illustrate a graphic representation of them with p5.js 2D primitive shapes. If we constrain each shape inside a square (a representation of a pixel), we can see that each shape fills up the square and creates an array of different brightness values.

Pixel luminance – graphic representation

Part 1 – Image processing:

I started by picking a picture I liked – me destroying a donut. This picture would be used as a map for the code to determine which primitive shape (rect, ellipse, line or point) it needs to put at each location of the 2D matrix, based on its luminance value. For optimization purposes I did a little Photoshop processing on the picture (scaling it down to 50 by 50 pixels and converting it to greyscale values) in order for the code to run faster.

Pre p5.js Image processing

Part 2 – writing the code:

The first thing I did was start with example code that loads the image file, which I had uploaded to the internet.
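That first step, in a minimal p5.js form, amounts to loading the image in preload() so it is ready before setup() runs. The URL below is only a placeholder for the uploaded file:

```javascript
// Minimal p5.js image loading, roughly what the first step looked like.
// The URL is a placeholder, not the actual hosted file.
let img;

function preload() {
  img = loadImage('https://example.com/portrait-50x50.png'); // placeholder URL
}

function setup() {
  createCanvas(500, 500);
  noLoop();
  image(img, 0, 0, width, height); // confirm the image loaded
}
```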

 

Second, I needed to scan and load all of the original picture's data into an array to get the luminance attribute, which was the tricky part. I did that by forming a ‘nested loop’ – one loop over all the pixels across the width and another loop over all the pixels down the height of the image. After that I stored that number in a variable that I named pixel (with the kind help of Barak Chamo).

Third, I wrote a series of if statements that draw a different shape for each brightness range. For instance, all of the brightest pixels become big white squares and the darkest pixels become just a dot.
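The full source is linked at the end of the post; as a compact sketch of these two steps – the nested loop reading each pixel's brightness and the if chain choosing a shape per brightness range – something along these lines would work. The thresholds and the exact shapes here are assumptions, not the original values:

```javascript
// Sketch of steps two and three: a nested loop reads each pixel's brightness
// from the 50 x 50 greyscale image, then an if chain picks a shape for that
// cell. Thresholds and shapes are assumptions made to illustrate the approach.

let img;
const CELL = 10; // each source pixel becomes a 10 x 10 cell on the canvas

function preload() {
  img = loadImage('https://example.com/portrait-50x50.png'); // placeholder URL
}

function setup() {
  createCanvas(img.width * CELL, img.height * CELL);
  background(0);
  noLoop();
}

function draw() {
  img.loadPixels();
  for (let y = 0; y < img.height; y++) {
    for (let x = 0; x < img.width; x++) {
      const i = 4 * (y * img.width + x);
      const lum = img.pixels[i]; // greyscale image, so R equals the luminance
      const cx = x * CELL + CELL / 2;
      const cy = y * CELL + CELL / 2;
      stroke(255);
      fill(255);
      if (lum > 200) {
        rectMode(CENTER);
        rect(cx, cy, CELL, CELL);                   // brightest: big white square
      } else if (lum > 140) {
        ellipse(cx, cy, CELL * 0.8);                // bright: large ellipse
      } else if (lum > 80) {
        line(cx - CELL / 2, cy, cx + CELL / 2, cy); // mid tones: a line
      } else {
        point(cx, cy);                              // darkest: just a dot
      }
    }
  }
}
```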

 

And the final result:

Source code