Live Image Processing – 1st week

As I picked up my phone to start recording videos for the 2nd week's class, I wanted to make use of its slow-motion feature. I like this effect and find it interesting how it transforms scenes that might come across as banal, showing them in a different light – much like blowing a picture up to a large scale, which to me brings out a different context in the image.

Another thing I wanted to try was displacing and distorting the image in an analog way. I held a magnifying glass over the phone's lens and moved it as I shot some videos. This gave the digital image a distortion with an analog look and feel.

Most of the footage was shot in dark places or at night, focusing on the high contrast between deep darks and bright lights.

Some videos can be found here:

https://tinyurl.com/y92zsj56


Project Development Studio – Polynote

My dream is to create a space where people can make audio/visual objects that communicate and talk to each other, and where they can rapidly change perspectives in the midst of the creation and viewing process. I hope to create 360° scenes, or fragments of scenes, that would live in varied places and could contribute to, change, or disrupt the order of those specific places.

My vision is a table, a chair, a sofa, a bench, or any other space for that matter, with an imaginary ecosystem living atop it. This ecosystem would be composed of lifelike organisms that have lives of their own. They move, make sound, and interact with each other in symbiosis. Viewers of this ecosystem would be able to contribute their own instances to it through hand gestures. Viewers could also change their points of view of the ecosystem, which would change its visuals and sounds accordingly.

My goal is to create an augmented reality multiplayer app for the creation and viewing of coral-reef-like ecosystems that are both animated and sound-generating. Each ecosystem will have its own distinct features, created by the viewers' hand gestures along with random features dictated by the platform. These ecosystems would be like audio/visual compositions in space that could be viewed and heard from numerous points of view.


Polynote3D

Polynote is a multiplayer musical experience generated by random notation shared between multiple players on a webpage. Each player in the session is randomly assigned a different instrument upon joining and is given a random set of instructions for building their part.
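
A rough sketch of that assignment logic in JavaScript – the instrument and instruction lists here are hypothetical placeholders, not the project's actual sets:

    // made-up pools of instruments and instructions
    const instruments = ['kick', 'snare', 'hihat', 'bass', 'keys', 'pad'];
    const instructions = [
      'play only on beats 1 and 3',
      'build a rising line over four bars',
      'rest every other bar',
    ];

    // give a joining player an instrument no one else holds yet,
    // plus one random instruction to build their part from
    function assignPlayer(takenInstruments) {
      const free = instruments.filter((i) => !takenInstruments.has(i));
      const instrument = free[Math.floor(Math.random() * free.length)];
      const instruction = instructions[Math.floor(Math.random() * instructions.length)];
      return { instrument, instruction };
    }

    console.log(assignPlayer(new Set(['kick'])));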

Inspiration:

https://www.are.na/itay-niv/development-studio

R&D:

First stage: To familiarize myself with the technical side of things, I started with something simple. I knew I wanted this experiment to live on the internet, so JavaScript was my main tool. I built a really simple experiment made out of an HTML table that played sounds with a Web Audio library called Tone.js. Each sound played in a loop as soon as its cell was pressed and stopped when it was deactivated:
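
The essence of that experiment, sketched with Tone.js – the cell id and note are placeholders, and older Tone.js versions use .toMaster() where newer ones use .toDestination():

    // one looped note that a table cell toggles on and off
    const synth = new Tone.Synth().toDestination();

    const loop = new Tone.Loop((time) => {
      synth.triggerAttackRelease('C4', '8n', time);
    }, '4n');

    document.querySelector('#cell-1').addEventListener('click', async () => {
      await Tone.start(); // browsers unlock audio only after a user gesture
      if (Tone.Transport.state !== 'started') Tone.Transport.start();
      if (loop.state === 'started') {
        loop.stop();
      } else {
        loop.start();
      }
    });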

Second stage: After getting a basic sound playing in the browser, I started building a mechanism that played sequenced notes through time – a basic sequencer.
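
Roughly along these lines, using Tone.Sequence – the eight-step pattern is an arbitrary example:

    // a simple 8-step sequence; null entries are rests
    const synth = new Tone.Synth().toDestination();

    const seq = new Tone.Sequence(
      (time, note) => {
        if (note !== null) synth.triggerAttackRelease(note, '16n', time);
      },
      ['C4', 'E4', null, 'G4', null, 'E4', 'C4', null],
      '8n' // one step per eighth note
    );

    seq.start(0);
    Tone.Transport.bpm.value = 110;
    Tone.Transport.start();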

Adding notation + Polynote2D

  • Adding a server that keeps everyone on the same page.
  • Adding random notation for different players and restricting the interaction to a specific time frame (a minimal server sketch follows below).
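
A minimal sketch of that server, assuming Express plus socket.io for the realtime layer (socket.io is my stand-in guess here, and the part count and time window are placeholders):

    const express = require('express');
    const http = require('http');
    const socketio = require('socket.io');

    const app = express();
    const server = http.createServer(app);
    const io = socketio(server);

    app.use(express.static('public')); // serves the client page

    io.on('connection', (socket) => {
      // hand the new player a random part index (six parts is a placeholder)
      socket.emit('assignment', { part: Math.floor(Math.random() * 6) });

      // restrict interaction to a fixed window – a hypothetical 30 seconds
      socket.emit('window', { closesAt: Date.now() + 30000 });

      // keep everyone on the same page about who is in the session
      io.emit('players', io.engine.clientsCount);
    });

    server.listen(3000);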

Third stage:

Translating to 3D: 

For this part I started refactoring my 2D code in Three.js. A lot of things in the architecture of the code changed in the transition.
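
The Three.js starting point looks roughly like this – the cube is just a stand-in for one of Polynote's note objects:

    // assuming Three.js is loaded on the page as the global THREE
    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(
      75, window.innerWidth / window.innerHeight, 0.1, 1000
    );
    camera.position.z = 5;

    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    // one cube standing in for a note object in the scene
    const cube = new THREE.Mesh(
      new THREE.BoxGeometry(1, 1, 1),
      new THREE.MeshNormalMaterial()
    );
    scene.add(cube);

    (function animate() {
      requestAnimationFrame(animate);
      cube.rotation.y += 0.01;
      renderer.render(scene, camera);
    })();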

Stack:

  • Node.js
  • Express
  • Three.js
  • Tone.js
  • HTML / CSS


Musical Gear Table [PCOMP ICM Final]

Musical Gear Table is my PCOMP / ICM final for the 2017 fall semester at ITP, Tisch School of the Arts. It is a musical / visual experience which draws inspiration from physical / mechanical music boxes and gives them a digital interpretation.

The user assigns different musical patterns to six different instruments through digital inputs and rotates the physical gear to produce sound. The digital layer is projected onto the physical gears and gives the musical patterns a unique visualization.


Creative Process:

I have always had an attraction to music and its structure. Drums always come first – the hi-hat that counts the tempo, the kick drum that lays the beat down, and the snare that snaps and brings the sharp staccato to the beat. Of course, after that come the bassline, the melody and the harmony – synths, keys, and everything else.

There are a lot of ways to bring music to life; the ingredients need balance, but the first thing they need is a tempo.

The initial plan was to build a table filled with gears of different sizes. The user would be prompted to build their own composition with the gears, similar to a puzzle. When done, they could turn the main gear, thus moving all the other gears in correlation to it. Each gear holds a certain element of the composition – a drum, a cymbal, a bassline, keys, synths and more – and the user can choose whether to add it to the recipe or not (editing a certain gear's notes is an option too).

Each gear's rotation speed would be measured by a rotary sensor that would play the gear's part accordingly – all parts would be synced to the same master gear.

Another element of this project is the visual projection. A visualization of its instrument's pattern is projected onto each gear. The user can change samples and patterns through an interface incorporated into the project.

At one point the thought process led to incorporating gears of different sizes, which would create polyrhythms in the track. This idea was eventually left behind.

The initial technical specs included a Kinect controller (which was later dropped from the plan), Max/MSP, an Arduino, and a limited set of buttons that were to control the array of gears; that plan changed later in the process.


Technical / production process:

Starting with physical exploration of laser-cut acrylic sheets, I built a small mockup of the mechanism. This small prototype had a rotary encoder in it that picked up the current position of the gears and transmitted it via the Arduino's serial port to Processing.

First sketch of the mechanical gear, cut on a laser cutter

Processing was the main programming language and the place where all the visual and musical instrumentation data was stored. For producing and sequencing the sound I used Max/MSP. In Max, I built a main sequencer patch that received the rotary encoder data from the Arduino via OSC signals transmitted from Processing. It translated that data into a 32-step sequencer that looked like this:

The instruments used were different drum samples and two synthesizers built in Max/MSP.
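
The translation itself happened in Processing and Max, but the core mapping is simple enough to sketch in JavaScript for illustration – the encoder resolution here is an assumption, and playStep is a hypothetical stand-in for sending the step over OSC:

    const TICKS_PER_REV = 1024; // hypothetical encoder resolution
    const STEPS = 32;

    let lastStep = -1;

    function onEncoder(ticks) {
      // wrap the raw tick count into one revolution, then quantize to a step
      const angle = ((ticks % TICKS_PER_REV) + TICKS_PER_REV) % TICKS_PER_REV;
      const step = Math.floor(angle / (TICKS_PER_REV / STEPS));
      if (step !== lastStep) {
        lastStep = step;
        playStep(step); // e.g. forward the step index as an OSC message
      }
    }

    function playStep(step) {
      console.log('step', step);
    }

This way the gear's physical speed directly sets the tempo: the faster the gear turns, the faster the sequencer advances through its 32 steps.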

After getting the audio part to work, I started to build a working demo that incorporated a projection of the instrument's pattern on the gear and the ability to edit that pattern. I cut a thick piece of white acrylic sheet and mounted two gears on a plank of wood.

The next step was to scale up and produce a large prototype of all the gears. I designed the final version of the table in Illustrator and laser cut it from birch wood. In this process I added an Adafruit Trellis controller, mounted on the bottom part of the table, programmed to scroll between gears, add/remove notes, and assign patterns from 12 different banked presets. I also incorporated a soft potentiometer to handle pitch shifting of the notes.


Adding the final gears to the table:

The visual representation of each instrument was designed with the instrument's tone and the shape of its sound in mind – for instance, the bass drum would be round, the snare sharp, and so on. Synthesizers would have the ability to change their tone and pitch, so layers of shapes were stacked and patterned on the gear to display the selected note.

————————————————————-

Musical Gear Table was presented at the 2017 ITP Winter Show. Below you can find a couple of demo videos from users at the show.

Shadow Party – Halloween Trick [PCOMP Midterm]

‘Shadow Party’ is an interactive shadow play made especially for Halloween. For our Intro to Physical Computing midterm, Hadar Ben-Tzur and I were paired together to create a Halloween trick. After a few ideas were tossed in the air, we chose ‘Shadow Party’ as our project concept. The user is invited to control the dance movements by interacting with three different knobs and to explore the surprising elements in the scene.

A look inside
 
First sketches:
 
As a prototype, we laser cut a skeleton figure and created moving joints, in order to explore the movement and shadows using servo motors and LED lights.
 

Later in the process we designed a 3D model as a mockup before fabricating; this helped us understand proportions and dimensions.
 

3D model