Our minds cannot grasp anything that's not derived from the way we process reality through our senses. As humans, we see, touch, smell, taste and hear our reality – it is everything we've got. If we wish to tell meaningful stories as artists, we need to find our most expressive tools and techniques to create stories that touch the core of what it's like to be human, across all our senses – not only to see, but to hear and touch, and to communicate a deeper meaning through the experience.
Our mediums are broken and can contain only a few elements of our sensory reality. To work in moving image – one of the most expressive art forms, combining the movement of images and sound through time – means being deprived of smell, touch, and taste. To record music, most of the time, means holding back on visualizing sounds for one's listeners. It is only in the last century, out of hundreds of years of expressing ourselves through art, that we have begun to mix the senses through technology and to express ideas and narratives with the goal of bridging these gaps in our perception.
In my work I seek to incorporate and mix more senses, bringing aspects of the physical into the digital and vice versa – a synesthesia of the senses, in order to form a connection to the experience in ways that are unique and expressive to the subject. I want to find new ways in which different methodologies complement and enrich each other when mixed together. Our senses are not yet accustomed to technology, and the new experiences we build now will define our behavior with it.
I always liked those kids' toys that have a lot of color and are made out of plastic. The bucket is one of my favorites, and I think it represents having more than one function or feature while still being kind of anal about it – everything in its right place.
As I picked up my phone to start recording videos for the second week's class, I wanted to make use of the phone's slow-motion feature. I like this effect, and I find it interesting how it transforms scenes that might come across as banal, showing them in a different light – much like blowing up a picture and magnifying it to a large scale. To me, this action brings out a different context in the image.
Another thing I wanted to do was to displace, distort, or transform the image in an analog way. I held a magnifying glass over the phone's lens and moved it as I shot some videos. This distorted the digital image in a way that has an analog look and feel.
Most of the frames were shot in dark places or at night, focusing on the high contrast between deep darkness and bright lights.
My dream is to create a space where people can create audio/visual objects that communicate and talk to each other, and where they can rapidly change perspectives in the midst of creating and viewing. I hope to create 360° scenes, or fragments of scenes, that would live in varied places and could contribute to, change, or disrupt the order of those specific places.
My vision is a table, a chair, a sofa, a bench – any space, for that matter – that an imaginary ecosystem lives atop. This ecosystem would be composed of lifelike organisms that have lives of their own. They move, make sound, and interact with each other in symbiosis. Viewers of this ecosystem would be able to contribute their own instances to it through hand gestures. Viewers could also change their point of view on the ecosystem, changing its visuals and sounds accordingly.
My goal is to create an augmented reality multiplayer app for the creation and viewing of coral-reef-like ecosystems that are both animated and generate sound. Each ecosystem will have its own distinct features, created by viewers' hand gestures along with random features dictated by the platform. These ecosystems would be like audio/visual compositions in space that could be viewed and heard from numerous points of view.
Musical Gear Table is my PCOMP / ICM final for the 2017 fall semester at ITP, Tisch School of the Arts. It is a musical/visual experience that draws inspiration from physical, mechanical music boxes and gives them a digital interpretation.
The user assigns different musical patterns to six different instruments through digital inputs and rotates the physical gear to produce sound. The digital layer is projected onto the physical gears, giving each musical pattern a unique visualization.
I have always been drawn to music and its structure. Drums always come first – the hi-hat that counts the tempo, the kick drum that lays the beat down, and the snare that snaps and brings sharp staccato to the beat. After that come the bassline, the melody, and the harmony – synths, keys, and everything else.
There are a lot of ways to bring music to life. The ingredients need balance, but the first thing they need is tempo.
The initial plan was to build a table filled with gears of different sizes. The user would be prompted to build their own composition with the gears, similar to a puzzle. When done, they could turn the main gear, moving all the gears in correlation with it. Each gear holds a certain element of the composition – a drum, cymbal, bassline, keys, synths and more – and the user can choose whether to add it to the recipe or not (editing a certain gear's notes is an option too).
Each gear's rotation speed will be measured by a rotary sensor that plays the gear's part accordingly – all parts will be synced to the same master gear.
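To make the sync idea concrete, here is a minimal sketch (in plain Java, since Processing is Java-based) of how a raw encoder count could be quantized to a step index so every gear stays locked to the master gear's position. The constants and names are my own assumptions, not the project's actual code:

```java
// Sketch: quantizing a rotary encoder reading to a sequencer step.
// ENCODER_STEPS (pulses per full turn) and NUM_STEPS are assumed values.
public class StepMapper {
    static final int ENCODER_STEPS = 1024; // hypothetical pulses per revolution
    static final int NUM_STEPS = 32;       // steps in one loop of the sequence

    // Map a raw encoder count to a step index 0..31, wrapping negatives too.
    static int stepFor(int encoderCount) {
        int wrapped = ((encoderCount % ENCODER_STEPS) + ENCODER_STEPS) % ENCODER_STEPS;
        return wrapped * NUM_STEPS / ENCODER_STEPS;
    }

    public static void main(String[] args) {
        System.out.println(stepFor(0));    // first step
        System.out.println(stepFor(512));  // half a turn lands halfway through
        System.out.println(stepFor(-32));  // turning backwards wraps around
    }
}
```

Because every gear's step index is derived from the same master position, the parts cannot drift apart – the gear itself is the clock.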
Another element of the project is the visual projection. A visualization of each gear's instrument pattern will be projected onto it. The user can change samples and patterns through an interface incorporated into the project.
The thought process led to incorporating gears of different sizes that would create polyrhythms in the track. This idea was eventually left behind.
The initial technical specs included a Kinect controller (later dropped from the plan), Max/MSP, Arduino, and a limited set of buttons to control the array of gears; that plan changed later in the process.
Technical / production process:
Starting with physical exploration of laser-cut acrylic sheets, I built a small mockup of the mechanism. This small prototype had a rotary encoder that picked up the current position of the gears and transmitted it via the Arduino's serial port to Processing.
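On the Processing side, whatever arrives on the serial port has to be turned back into encoder counts. As a hedged sketch – the actual wire format isn't documented here, so I'm assuming simple newline-terminated integers:

```java
// Sketch: parsing encoder readings as an Arduino might send them over
// serial, e.g. "512\n". The wire format is an assumption for illustration.
public class EncoderParser {
    // Returns the encoder count, or -1 for a garbled or partial line
    // (serial reads often split a number across two buffers).
    static int parseReading(String line) {
        if (line == null) return -1;
        try {
            return Integer.parseInt(line.trim());
        } catch (NumberFormatException e) {
            return -1; // ignore and wait for the next complete line
        }
    }

    public static void main(String[] args) {
        System.out.println(parseReading("512\n")); // a clean reading
        System.out.println(parseReading("5"));     // still a valid number
        System.out.println(parseReading("ga rb")); // noise is rejected
    }
}
```

Rejecting malformed lines instead of crashing matters here: a live installation's serial stream will occasionally deliver half a message.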
First sketch of mechanical gear, cut on laser cutter
Processing was the main programming language and the place where all the visual and musical instrumentation data was stored. For producing and sequencing the sound I used Max/MSP. In Max, I built a main sequencer patch that received the rotary encoder data from the Arduino via OSC signals transmitted from Processing. It translated that data into a 32-step sequencer that looked like this:
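The core of a 32-step sequencer like the one the Max patch implements can be sketched in a few lines of plain Java. The instrument names and pattern data below are illustrative, not the project's actual patch logic:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: a 32-step sequencer core. Each instrument owns a 32-slot row
// of on/off notes; the current step index is driven by the gear's encoder.
public class StepSequencer {
    static final int NUM_STEPS = 32;
    boolean[][] patterns;   // one 32-step row per instrument
    String[] instruments;

    StepSequencer(String[] instruments) {
        this.instruments = instruments;
        this.patterns = new boolean[instruments.length][NUM_STEPS];
    }

    void setNote(int instrument, int step, boolean on) {
        patterns[instrument][step % NUM_STEPS] = on;
    }

    // Which instruments fire on a given step. In the real piece this is
    // where a sample or synth note would be sent to the audio engine.
    List<String> triggersAt(int step) {
        List<String> out = new ArrayList<>();
        for (int i = 0; i < instruments.length; i++) {
            if (patterns[i][step % NUM_STEPS]) out.add(instruments[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        StepSequencer seq = new StepSequencer(new String[]{"kick", "snare"});
        for (int s = 0; s < NUM_STEPS; s += 4) seq.setNote(0, s, true); // kick on quarters
        seq.setNote(1, 8, true);                                       // one snare hit
        System.out.println(seq.triggersAt(0));  // [kick]
        System.out.println(seq.triggersAt(8));  // [kick, snare]
    }
}
```

Keeping the pattern data in one grid like this also makes the projection side easy: the same booleans that trigger sound can drive what is drawn on each gear.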
The instruments used were different drum samples and two synthesizers built in Max/MSP.
After getting the audio to work, I started building a working demo that incorporated a projection of the instrument's pattern onto the gear and the ability to edit that pattern. I cut a thick piece of white acrylic sheet and mounted two gears on a plank of wood.
The next step was to scale up and produce a large prototype of all the gears. I designed the final version of the table in Illustrator and laser cut it from birch wood. I added an Adafruit Trellis controller, mounted on the bottom part of the table, programmed to scroll between gears, add/remove notes, and assign patterns from 12 banked presets. I also incorporated a soft potentiometer to handle pitch shifting of the notes.
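For the soft potentiometer, the usual approach is to map the analog reading onto a musical range. A minimal sketch, assuming a standard 10-bit Arduino reading (0–1023) and a ±12-semitone range – both assumptions, not documented specs of the project:

```java
// Sketch: mapping a soft potentiometer's 10-bit analog reading to a
// pitch shift in semitones. The +/-12 semitone range is assumed.
public class PitchMap {
    static int toSemitones(int analogValue) {
        // Linear map 0..1023 -> -12..+12; the strip's midpoint sits near 0.
        return Math.round((analogValue / 1023.0f) * 24f) - 12;
    }

    public static void main(String[] args) {
        System.out.println(toSemitones(0));     // -12, full shift down
        System.out.println(toSemitones(512));   //   0, roughly centered
        System.out.println(toSemitones(1023));  // +12, full shift up
    }
}
```

Quantizing to whole semitones (rather than passing the raw value through) keeps the shifted notes in tune with the rest of the composition.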
Adding the final gears to the table:
The visual representation of each instrument was designed with the instrument's tone and the shape of its sound in mind – for instance, the bass drum would be round, the snare sharp, and so on. Since the synthesizers can change their tone and pitch, layers of shapes were stacked and patterned on the gear to display the selected note.
Musical Gear Table was presented at the 2017 ITP Winter Show. Below you can find a couple of demo videos from users at the show.