week 10 – final project development

Co-created with Zoe Margolis

For our final project, Zoe and I would like to create a device that lets dancers control visualizations that accompany their performance with movement. One of the inspirations we explored was the work of Lisa Jamhoury. When we started thinking about such a control device we first considered tracking sensors and software such as PoseNet or Kinect, but we decided to go a simpler way and use accelerometers. There is something very beautiful about having the piece of technology directly on your body so that you can physically feel a connection to the control device. I also personally love signal processing and time series analysis, and the seeming anonymity that comes with it, i.e. no camera tracking, no body posture or face recognition, just acceleration. Maybe it’s something about having worked with earthquakes for almost 10 years that draws me to accelerometers; after all, earthquake records are produced using triaxial accelerometers.

The user of this device is a performer, and they will have two accelerometers attached to their wrists. At a later point we might consider introducing more sensors, but for now we are starting simple. The wires will run underneath a tight bodysuit to the Arduino and a battery that will be housed in a mic belt. The Arduino will send the signal via Bluetooth to a computer, which will create visuals from the processed signal in either p5 or TouchDesigner.

Our work plan for building a prototype, along with our progress so far, is as follows:

  1. Fabricate the circuit using accelerometers, Arduino, and a battery.
    So far we have been able to set up one LIS3DH triple-axis accelerometer with an Arduino and read the x, y, z accelerations using I2C synchronous communication. We have not yet figured out how to connect the second accelerometer. The data is currently output as one string per reading. For the two accelerometers, for example, the output will be a time stamp followed by the accelerations from each accelerometer: “t,x1,y1,z1,x2,y2,z2” or “1,-0.84,4.92,8.67,-0.84,4.92,8.67”.
  2. Send the data from accelerometers over Bluetooth to p5.
    We have been able to set up the Arduino as a peripheral device and receive one string via Bluetooth in p5. However, at the moment we are having trouble receiving updated data over Bluetooth.
  3. Gather accelerometer data along with movement ‘labels’ to build a ML algorithm to identify moves.
    We have been able to gather data for two simple and distinct right-hand moves as a start: an arm raise and a circular ‘door opening’. To produce labeled time series data that has both accelerations and move ‘labels’, we added two buttons to the circuit, each representing a move. The following video documents some of the data gathering process.

    The output file is a .csv file that contains the following data: “t, x1, y1, z1, acc1 move1, acc1 move2, x2, y2, z2, acc2 move1, acc2 move2” or “1129,-9.32,19.58,9.92,0,1,-9.32,19.58,9.92,0,0”. So far we have gathered ~4,000 data points for the right wrist, which is enough to start experimenting with an ML model. The triple-axis accelerometer data is shown below, where you can see clear peaks when the moves were made.

    records of triaxis accelerometer

  4. Train a machine learning model to recognize the moves in real-time.
    This will be one of the main focus areas over the next two weeks. We are going to try to build a Long Short-Term Memory (LSTM) network using TensorFlow, because LSTMs tend to work well with sequential data.
  4. Create a p5 or TouchDesigner visualization, where certain moves trigger changes.
    This part of the project hasn’t been started, but we did find some visual inspiration in MoMA’s Wolfgang Tillmans exhibit below. I think we can recreate a similar flowing feeling using Perlin noise in p5.
    picture by Wolfgang Tillmans in MoMA
  6. Create a more permanent version of the prototype to be worn for a Winter Show performance.
    This part of the project hasn’t been started yet.
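Ahead of step 4, the labeled .csv from step 3 needs to be cut into fixed-length sequences for an LSTM. Below is a minimal Python sketch of that windowing, assuming a single-sensor row layout of “t,x,y,z,move1,move2” (a simplification of our two-sensor format) and an arbitrary window length of 50 samples:

```python
# Sketch: turn labeled accelerometer .csv rows into fixed-length windows
# for a sequence model like an LSTM. The single-sensor field layout
# (t, x, y, z, move1, move2) and the 50-sample window are assumptions,
# simplified from our two-sensor format.

def parse_row(line):
    """Parse one csv line like '1129,-9.32,19.58,9.92,0,1'."""
    t, x, y, z, m1, m2 = line.strip().split(",")
    return float(x), float(y), float(z), int(m1), int(m2)

def make_windows(rows, length=50, step=25):
    """Slide a window over the series; label each window by whichever
    move button was pressed most often inside it (0 = no move)."""
    windows = []
    for start in range(0, len(rows) - length + 1, step):
        chunk = rows[start:start + length]
        xyz = [(x, y, z) for x, y, z, _, _ in chunk]
        m1 = sum(r[3] for r in chunk)
        m2 = sum(r[4] for r in chunk)
        label = 0 if m1 == m2 == 0 else (1 if m1 >= m2 else 2)
        windows.append((xyz, label))
    return windows
```

The real pipeline would feed the (xyz, label) windows into TensorFlow; the majority-vote labeling is just one option.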

week 9 – bluetooth and accelerometer lab

This week our ultimate goal was to connect an external accelerometer to the Arduino and transmit the X, Y, Z accelerations over Bluetooth. We did not get all the way there, but we had partial success on both tasks.

Accelerometer

For the accelerometer we used the LIS3DH triple-axis accelerometer and followed an online tutorial and an example in the Adafruit LIS3DH library (link). We were successful in getting and plotting the signal, but a few questions remain:

!questions/comments!

  • What is the difference between I2C and SPI wiring? Which one is better for our application?
  • Is this the right accelerometer for us to use or are there better ones?
  • How many accelerometers can we use with one Arduino? Is it possible to use multiple accelerometers with I2C wiring?

Bluetooth

Bluetooth was certainly more of a struggle, even following the Bluetooth lab. We didn’t quite understand how to send three float numbers (the x, y, z accelerations) using characteristics. I was also confused about how to select a UUID and how to set characteristics.

We were able to turn on the Arduino LED using the p5.ble.js library and also read a button output from a pin. However, we stopped short of sending accelerometer readings via Bluetooth, since we didn’t know how. Many questions remain.
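One idea we may try (our assumption, not something from the lab): rather than one characteristic per float, pack x, y, z into a single 12-byte characteristic value and split it back apart on the p5 side. Python's struct module shows the byte layout:

```python
# Sketch of one possible BLE payload layout (our assumption, not from
# the lab): pack the three float32 accelerations into one 12-byte
# little-endian characteristic value, then unpack on the other end.
import struct

def pack_xyz(x, y, z):
    """Pack three accelerations as little-endian float32: 3 * 4 = 12 bytes."""
    return struct.pack("<fff", x, y, z)

def unpack_xyz(payload):
    """Recover the three float32 values from a 12-byte payload."""
    return struct.unpack("<fff", payload)
```

On the Arduino side, the same 12 bytes would be written to the characteristic as three consecutive little-endian floats.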

!questions/comments!

  • A very important note: Bluetooth does not connect from the p5.js editor. We learned that the hard way. We were only able to connect when we made a local server with Visual Studio Code.
  • The Arduino code from the p5.ble.js website was not working (we could not find the peripheral Arduino). But when we tried the example ArduinoBLE sketches from the Arduino library, they worked.
  • What characteristic would the LIS3DH readings be?
  • What is the difference between read, write and notify?

 

week 8 – color rain

Co-created with Bosung Kim

When Bosung and I first met we quickly converged on the idea of trying to convey the feeling of water through pixel manipulation. We sought to recreate the visual of water ripples from a drop, and were inspired by a YouTube video that used an equation to visualize the z-axis of a circular wave pattern as a function of time and space (unfortunately, I lost the link to the video):

equation used for the simulation

Once we figured out the mechanics of the equation we were mesmerized by the pattern. But the question remained: what in the image do we actually have to change to create a color ripple effect? After some experimentation we ended up remapping the z-values from the equation onto the color range (0-255) and assigning the few values within the raindrop radius to either the R, G, or B values of the pixels (randomly chosen). We also varied the radius of the waves for each drop.
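Since the link to the original equation is lost, here is a rough sketch of the remapping step, with a generic damped circular wave standing in for the real wave function:

```python
# Sketch of the color-ripple remapping. The wave function here is a
# generic damped sinusoid standing in for the equation from the video
# (whose link we lost); the key step is mapping z onto 0-255.
import math
import random

def wave_z(r, t, wavelength=30.0, speed=2.0, damping=0.02):
    """Height (roughly in [-1, 1]) of a circular wave at distance r
    from the drop center, at time t."""
    return math.exp(-damping * r) * math.sin(2 * math.pi * (r / wavelength - speed * t))

def z_to_color(z):
    """Remap z in [-1, 1] onto the color range [0, 255]."""
    return int((z + 1) / 2 * 255)

def ripple_channel():
    """Randomly pick which channel (R, G, or B) a drop will tint."""
    return random.choice("RGB")
```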

Ultimately, our work uses the color rain to give new life to an existing image: a desaturated desert, void of water and color.

After several minutes of color rain, the desert takes on a new life:

The full experience can be found here [link].


 

The three words that we think describe the image are: contrast, tranquility, and mesmerizing.

Attribution: The desert image was taken by Santiago Manuel De la Colina, and accessed on pexels.com. The music is from …


week 8 – asynchronous communication

This week the lab was a review of serial communication, since we did a lot of asynchronous communication in the midterm (extended blog coming soon). There were still a few useful things that I picked up in this lab:

    • When we were doing two-way communication for the midterm we implemented the ‘handshake’ communication protocol. However, it was very difficult to debug what the Arduino was seeing, since we couldn’t use the Serial Monitor. In this lab I learned a useful trick: send the read byte directly back to p5 and print it using either console.log or the canvas. This is where the byte() function came in handy. In Arduino, as soon as you read the byte, you can send it back:
      int inByte = Serial.read();
      Serial.write(inByte);
      Question: the p5 reference for byte() says “Converts a number, string representation of a number, or boolean to its byte representation. A byte can be only a whole number between -128 and 127”. However, we were printing 255 on the screen using byte(). How does that work?
    • The graphing function was pretty cool and useful to monitor the signal. I will save that for later. Below is an example of potentiometer output:
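On the byte() question above, my best guess is that it is the signed vs. unsigned reading of the same 8 bits; here is a quick sketch of the two interpretations (in Python, but the arithmetic is language-independent):

```python
# The same 8 bits can read as 255 (unsigned) or -1 (signed two's
# complement) -- my guess at why byte() talks about -128..127 while
# we saw 255 on screen. Sketch of both interpretations:

def to_signed_byte(n):
    """Interpret the low 8 bits of n as a signed (two's complement) byte."""
    n &= 0xFF
    return n - 256 if n > 127 else n

def to_unsigned_byte(n):
    """Interpret the low 8 bits of n as an unsigned byte."""
    return n & 0xFF
```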

Here are some of the pictures of the circuits I built in this lab.

stop motion: “A Spark of Passion” and “Intro to Mice”

Co-created with Kat Kitay

This week in Stop Motion production we have two experimental shorts: ‘a spark of passion’ and ‘intro to mice’.

Spark of Passion

In our first stop motion we really wanted to use inanimate objects and give them a human character, give them a story. We thought we would use objects that we interact with every day, and what better items to use than the ones in our PComp kit! We landed on the battery and battery connector, since together, the two make things move and light up. The theme of passion immediately came up, and the following video is the result.

Here are the actors and the video set-up:

ITP- 93856 001: Intro to Mice

The second stop motion video was an experiment with Pixilation, which proved to be much harder than we thought. We wanted to create the effect of a human moving like another creature. In this story, the snake professor gives an overview of Intro to Mice, slithering around the class to interact with ‘invisible’ snake students.

 

week 5 lab – DC motors

This week I was only able to get through one lab, since we were actively working on the midterm. I successfully powered up a DC motor using a 9V power supply and controlled it with a potentiometer. Here is the setup:

set up for a DC motor controlled by a potentiometer

The video shows the variable speed of the motor: 
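For reference, the potentiometer-to-motor scaling is just the integer remap from the 10-bit analog reading (0-1023) to the 8-bit PWM range (0-255); here is that math written out in Python, mirroring Arduino's map() for non-negative inputs:

```python
# Sketch of the scaling that drives the motor speed: the 10-bit analog
# reading (0-1023) from the potentiometer is remapped to the 8-bit PWM
# range (0-255), the same integer math as Arduino's map() function
# (for non-negative inputs; Arduino truncates toward zero).

def arduino_map(x, in_min, in_max, out_min, out_max):
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min
```

In the sketch itself this is just analogWrite(motorPin, map(analogRead(potPin), 0, 1023, 0, 255)).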

!remaining questions!

Most of my questions from this week are about diodes:

  • What do the numbers on the diode mean, and what is the difference between different diodes?
  • How do we know the direction of the diode, and which way should it be pointing when connected to the collector and emitter?
  • Which way does the electricity flow, or more specifically, how does the diode prevent the wrong flow of electricity?

your evening news

Co-created with Sam De Arms

When we watch cartoons as kids, we often experience them as playful, funny, and very entertaining. The beloved ’90s cartoon ‘Hey Arnold!’, for example, tells the story of a nine-year-old kid who has a football-shaped head and goes on fun adventures to help his friends with personal problems.

But if we rewatch our childhood cartoons as adults, we realize that they often explore very serious and often troubling themes. Take Arnold: he constantly gets bullied by a girl at school, and he is raised by his grandparents, not knowing what happened to his parents.

During the creation of this project my co-creator Sam De Armes and I discussed the different cartoons we watched as children: Soviet Vinni Puh, Courage the Cowardly Dog, CatDog, Ну, погоди! and others.

We also realized that the news we watch often has a similar but reverse effect: you watch it intending to hear information on serious topics, but sometimes it feels absurd or even comical. We decided to juxtapose these two ideas through the use of synthetic media.

“Your evening news” is a video collage that uses a combination of cartoons, news footage, and AI-generated images.

Process

We started by getting a sample of cartoons we watched as children, collectively gathering 1.5 hours of footage.

recording of videos to create training dataset

Then we used custom scripts to generate ~5000 image frames from the videos to create a training set.

We used RunwayML to train a StyleGAN on the dataset. We did not know what this was going to produce, but we were happy with the overall results. In the video below you can see the resemblance between the esoteric shapes and the original cartoons; in particular, we found Vinni Puh prominently featured in a lot of the generated images.

Finally, we worked in Premiere Pro to create the video using original and green-screened video footage and AI-generated cartoons.

screenshot of working in Premiere on the project

week 5 – functions as units of labour

This week I really wanted to experiment with recursive functions to make snowflakes. I quickly discovered that while snowflakes have a pattern, the patterns do not have a recursive relationship. Here are a few examples of my attempts to make a recursive snowflake [link to code]:

Eventually, I decided to pivot and make a recursive plant that grows when there is more sunshine :). The sun comes out as the user moves the mouse in the vertical direction [link to code]:
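For anyone curious about the recursion itself, the plant reduces to each branch spawning two shorter child branches until a depth limit. Here is a stripped-down sketch of that geometry (the angles and 0.7 scale factor are arbitrary stand-ins; the real version draws in p5):

```python
# Stripped-down sketch of the recursive plant geometry: each branch
# spawns two shorter child branches until a depth limit. The fan angles
# and the 0.7 scale factor are arbitrary choices; the real version
# draws these segments in p5.
import math

def grow(x, y, angle, length, depth, segments):
    """Append (x1, y1, x2, y2) line segments for a recursive branch."""
    if depth == 0 or length < 1:
        return
    x2 = x + length * math.cos(angle)
    y2 = y - length * math.sin(angle)  # y grows downward on a canvas
    segments.append((x, y, x2, y2))
    for spread in (-0.4, 0.4):  # two children, fanned left and right
        grow(x2, y2, angle + spread, length * 0.7, depth - 1, segments)

segments = []
grow(200, 400, math.pi / 2, 80, 5, segments)  # trunk pointing straight up
```

Tying growth to sunshine is then just a matter of scaling the depth or length with the mouse position.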


week 4 – patterns

This week I worked on creating patterns, sometimes with unexpected results. After some playing around I arrived at two evolving patterns:

1) Childhood reimagined

When I was a child I loved drawing a specific pattern using grid paper. It was very soothing and took a lot of effort, patience and time.

I wanted to recreate this pattern using computing, which surprisingly also took a lot of effort. While I was coding, I made several mistakes which resulted in unexpected and interesting patterns. Here are some of the ‘mistake’ results:

Finally I arrived at the right pattern:

When I made mistakes in the code, I quite liked some of the patterns, so I decided to randomize several parameters and see what happens. Here is the result [link to code]:

 

2) Mars and Jupiter’s solar rave

I got inspired by an animation of Mars and Jupiter’s orbits around the sun that I saw on someone’s Twitter.

I found it very hypnotic and beautiful, reminding me of a stellar dance. I wondered what different orbits and speeds would look like. Voila, welcome to the solar rave! [link to code]
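Under the hood, the rave is just parametric circles: each planet's angle advances at its own rate. Here is a sketch of the position math, using rounded published orbital radii (AU) and periods (Earth days):

```python
# Sketch of the orbit math behind the 'solar rave': each planet is a
# point on a circle whose angle advances with its own period. Radii
# are in AU and periods in Earth days (rounded published values);
# the rave comes from exaggerating these numbers.
import math

def orbit_position(radius_au, period_days, t_days):
    """Position (x, y) on a circular orbit at time t."""
    theta = 2 * math.pi * t_days / period_days
    return radius_au * math.cos(theta), radius_au * math.sin(theta)

mars = orbit_position(1.52, 687.0, 343.5)     # half a Mars year in
jupiter = orbit_position(5.20, 4333.0, 0.0)   # Jupiter at its start
```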

week 3 lab – can you guess the tune?

This week we played with analog input and output (or Pulse Width Modulation), testing the concepts on a speaker and a Servo motor. I also tried to do something ‘more creative’ with the speaker, so I created a little game called:

Can you guess the song in less than 10 notes?

With the help of Henry’s composition skills and Alexandra’s, Dre’s, and Alfonsette’s song-guessing abilities, I was able to test out the game:

Hurray! Collectively the team guessed all the songs, although I wonder how to prompt users to press the button just once, as opposed to continuing to hold it. Oh, and here is the setup of the circuit:

guess the tune game circuit

I was able to complete this week’s labs, although many questions remain. And my microcontroller broke at the end :(. Here is a lab highlight: the potentiometer controlling the Servo motor.
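A side note for composing more tunes: the note frequencies passed to tone() can be computed instead of hard-coded, using the standard equal-temperament formula (A4 = MIDI note 69 = 440 Hz):

```python
# Note frequencies for Arduino's tone() don't have to be hard-coded:
# equal temperament derives every pitch from a MIDI note number, with
# A4 = MIDI 69 = 440 Hz as the reference.

def midi_to_freq(note):
    """Frequency in Hz of a MIDI note number under equal temperament."""
    return 440.0 * 2 ** ((note - 69) / 12)

a4 = midi_to_freq(69)        # 440.0 Hz
middle_c = midi_to_freq(60)  # ~261.63 Hz
```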

 

!remaining questions!

  1. What does a transistor actually do and how does it work? What do the different pins mean?
  2. What is the speaker setup below? Which direction does the electricity flow in the speaker?
  3. How does one connect an audio jack?
  4. What does a capacitor do and when do we need to use it?
  5. A potentiometer doesn’t have a pulldown resistor even though it is a variable resistor. Why not?
  6. sizeOf(melody)/sizeOf(melody[0]) didn’t work when I used it to determine the length of the notes array. Why not?