Final Blog Post for Final Project by Ryan Yuan

Project Name: Interferit

For this project I worked alone, and the final work is a new interface for a musical instrument.

Based on the research I had done before production started, I knew the project would be related to music. The keyboard MIDI player I had found and the website of the International Conference on New Interfaces for Musical Expression (NIME) gave me a lot of inspiration for making a musical instrument. Professor Eric had mentioned in class that conductive tape could be an interesting way of building interaction, so I decided to make a musical instrument whose interaction is based entirely on conductive tape. The concept is to build a circuit out of the tape: each tape key connects to an input pin on the Arduino board, while the trigger (the wire on the glove) connects to ground. Presumably with the Arduino's internal pull-up resistors enabled, a touched key pulls its pin to 0, which is why the Processing code below treats a sensor value of 0 as a key press.

After settling on the way of interaction, I started thinking about how the interface should look. I am very interested in Japanese history, and I had recently been playing a Japanese game set in the Warring States period, so I wanted the interface to relate to that history. Every family in the Warring States period had its own family crest, and these crests all look different from one another and each carries its own meaning. The Oda and Tokugawa families are the two most famous families of that period, as they are the two that united the whole country, and I like them very much, so I decided to adapt their family crests for my project. This is why my final interface is a combination of the two crests.

The idea of the whole project is not only a musical instrument; it is also about connection. For the physical part, the two family crests joined into one body stand for the actual connection between the two families in history. For the virtual part, I wanted to visualize the playing: when a user plays the instrument, effects appear on the screen, and these effects are passed on in some way to the next user, creating a connection with others. I thought of a water-ripple effect, since in real life a ripple, once triggered, lasts for some time and spreads outward. In Processing, the ripples are realized through pixel manipulation: when a ripple is triggered at one pixel, its value is propagated to the neighboring pixels, which makes it look like it is spreading. I also needed a way to control where on the screen the ripples are triggered, so I thought of color tracking with camera capture: if an object has a certain RGB value, and I set a condition to track only the pixels close to that color, the result looks as if I am doing object tracking. So I got a traditional Japanese ghost mask, which is red, not only to fit the Japanese style and make the tracking work, but also to fit the concept: in the vision of the camera, we are the ghosts, and we bring interference to the pixels.
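
Since the ripple rule is buried inside the full project code at the end of this post, here is a minimal, self-contained Processing sketch of just that algorithm, with a mouse click standing in for a tape key; the canvas size and dampening value here are placeholder choices rather than the project's exact settings.

float[][] current;
float[][] previous;
float dampening = 0.99;

void setup() {
  size(400, 300);
  current = new float[width][height];
  previous = new float[width][height];
}

void mousePressed() {
  // Disturb one cell; the update rule in draw() spreads it outward.
  previous[constrain(mouseX, 1, width - 2)][constrain(mouseY, 1, height - 2)] = 500;
}

void draw() {
  loadPixels();
  for (int i = 1; i < width - 1; i++) {
    for (int j = 1; j < height - 1; j++) {
      // Average of the four neighbours in the previous frame, minus the
      // cell's own old value, then damped so the wave slowly dies out.
      current[i][j] = (previous[i-1][j] + previous[i+1][j] +
                       previous[i][j-1] + previous[i][j+1]) / 2 - current[i][j];
      current[i][j] *= dampening;
      pixels[i + j * width] = color(current[i][j]);
    }
  }
  updatePixels();
  // Swap buffers: this frame's state becomes next frame's "previous".
  float[][] temp = previous;
  previous = current;
  current = temp;
}

The full version below replaces the mouse click with the tape keys and seeds the ripple at the position tracked from the red mask.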

As for why I named the project Interferit: it is a blend of interfere and inherit. We interfere with the pixels, and this interference, the water ripples, is passed on to the next user on the screen, which is the inherit part.

For the production, I started with the physical component, the instrument itself. I found pictures of the two family crests online, but they contained so many gaps that laser cutting them directly would have produced a lot of scattered parts, so I first had to join the pieces together. I struggled with this step: I wanted to connect the parts in Illustrator, but I barely know how to draw in it, so I wasted hours and got nowhere. I finally finished the job in Photoshop and then imported the file into Illustrator for the laser cut. I wanted to bend the crests so the physical part would look three-dimensional rather than two flat pictures to tap on, which fits the idea of an interface designed by me. Since the pieces had to be bent in different directions, wood would not work, so I cut the crests from acrylic board. I used the heat gun in the fab lab to do the bending: I heated the area that needed to bend, and once the plastic was hot enough it softened and I could adjust the angle until it looked the way I wanted. I then joined the two crests with AB glue to finish the physical structure.

Next I stuck conductive tape onto the structure to create the keys for the interaction. I first laid down thirty pieces of tape to make thirty keys, each connected by a wire to the breadboard, and I borrowed an Arduino Mega to make sure I had enough input pins. But with so many wires it was hard to keep them attached to the tape: wires often fell off or made poor contact, and because of the conductivity and resistance of the tape, some keys were not sensitive. In the end only nineteen keys survived. It was also very hard to keep the wiring tidy, and it took me a long time to figure out which key corresponded to which pin before I could write the code. For the trigger, I got two rubber gloves, which are easy to wear even though they are hard to take off, and the wires taped onto them do not fall off easily. I used the glove wires to touch the tape directly rather than connecting tape to tape, again because of the resistance problem.

For the coding part, the basic idea is that each input pin triggers a one-shot sound file. There are eleven notes from C4 to C6, plus three keys for the drum set and bass, and two keys for switching modes, between a Japanese koto style and a futuristic electronic one. The water-ripple effect is realized through pixel manipulation combined with computer-vision color tracking. The last part of this documentation is the code.
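
The per-key pattern is easy to lose among the repetition in the full listing, so here it is in isolation: a boolean latch per key makes the sound fire once per touch instead of retriggering on every frame. This is a distilled sketch in which the keyboard stands in for a tape key; the sample name is just one of the project's files.

import processing.sound.*;

SoundFile note;
boolean ready = true;  // true means "released, armed to fire"

void setup() {
  size(200, 200);
  note = new SoundFile(this, "kotoc.wav");  // any one-shot sample
}

void draw() {
  // keyPressed stands in for sensorValues[i] == 0 (tape key touched).
  if (keyPressed && ready) {
    ready = false;  // latch so it cannot fire again while held
    note.play();    // one-shot playback
  }
  if (!keyPressed) {
    ready = true;   // releasing the key re-arms it
  }
}

In the full code the same pattern repeats for each surviving key, each with its own down boolean and its own sensorValues index.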

For the reflection: the design of the project fits the idea of a new interface, but the interaction is not sensitive enough because of the difficulty of building a reliable circuit out of conductive tape. It is therefore hard to really play the instrument, and people also had trouble understanding the function of the mask and the meaning of the ripples on the screen, so the delivery of the concept was not as clear as I had imagined. This project was a first trial at a new interface for a musical instrument, and next time I need to reconsider a better way of interaction. As for the meaning of the project, I wanted to show people a new kind of instrument interface and let them play with it, and some users may also read the concept of connection with others from the Processing image.


Code for Processing:

import processing.serial.*;
import processing.sound.*;
import processing.video.*;

ThreadsSystem ts;  // particle threads; the ThreadsSystem class lives in a separate sketch tab (not included in this post)
SoundFile c1;
SoundFile c2;
SoundFile c3;
SoundFile e1;
SoundFile e2;
SoundFile f1;
SoundFile f2;
SoundFile b1;
SoundFile b2;
SoundFile a1;
SoundFile a2;
SoundFile taiko;
SoundFile rim;
SoundFile tamb;
SoundFile gong;
SoundFile hintkoto;
SoundFile hintpeak;
Capture video;
PFont t;

int cols = 200;//water ripples
int rows = 200;
float[][] current;
float[][] previous;
boolean downc1 = true;
boolean downc2 = true;
boolean downc3 = true;
boolean downe1 = true;
boolean downe2 = true;
boolean downf1 = true;
boolean downf2 = true;
boolean downb1 = true;
boolean downb2 = true;
boolean downa1 = true;
boolean downa2 = true;
boolean downtaiko = true;
boolean downtamb = true;
boolean downrim = true;
boolean title = true;
boolean downtitle = true;
boolean koto = true;
boolean peak = false;
boolean downleft = true;
boolean downright = true;

float dampening = 0.999;

color trackColor; //tracking head
float threshold = 25;
float havgX;
float havgY;

String myString = null;//serial communication
Serial myPort;
int NUM_OF_VALUES = 34; /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues; /** this array stores values from Arduino **/

void setup() {
  fullScreen();
  //size(800, 600);

  ts = new ThreadsSystem();  // threads

  cols = width;  // run the ripple simulation at full screen resolution
  rows = height;
  current = new float[cols][rows];
  previous = new float[cols][rows];

  setupSerial();  // serial communication

  String[] cameras = Capture.list();
  printArray(cameras);
  video = new Capture(this, cameras[11]);  // camera index is machine-specific
  video.start();
  trackColor = color(255, 0, 0);  // track the red ghost mask

  // Load the koto sound set.
  c1 = new SoundFile(this, "kotoc.wav");
  e1 = new SoundFile(this, "kotoE.wav");
  f1 = new SoundFile(this, "kotoF.wav");
  b1 = new SoundFile(this, "kotoB.wav");
  a1 = new SoundFile(this, "kotoA.wav");
  c2 = new SoundFile(this, "kotoC2.wav");
  e2 = new SoundFile(this, "kotoE2.wav");
  f2 = new SoundFile(this, "kotoF2.wav");
  b2 = new SoundFile(this, "kotoB2.wav");
  a2 = new SoundFile(this, "kotoA2.wav");
  c3 = new SoundFile(this, "kotoC3.wav");
  rim = new SoundFile(this, "rim.wav");
  tamb = new SoundFile(this, "tamb.wav");
  taiko = new SoundFile(this, "taiko.wav");
  gong = new SoundFile(this, "gong.wav");
  hintpeak = new SoundFile(this, "hintpeak.wav");
  hintkoto = new SoundFile(this, "hintkoto.wav");
}

void captureEvent(Capture video) {
  video.read();
}


void draw() {
  background(0);

  ///// serial communication /////
  updateSerial();
  //printArray(sensorValues);

  // Ripples are seeded around the tracked mask position; constrain
  // keeps the 10-pixel disturbance inside the arrays.
  int fX = constrain(floor(havgX), 0, cols - 10);
  int fY = constrain(floor(havgY), 0, rows - 10);

  // Left mode key: switch to the electronic "peak" sound set.
  if (sensorValues[2] == 0) {
    if (downleft) {
      downleft = false;
      if (koto) {
        koto = false;
        peak = true;
        // Reloading the files inside draw() is slow, but it only
        // happens once per mode switch.
        c1 = new SoundFile(this, "pc1.wav");
        e1 = new SoundFile(this, "pe1.wav");
        f1 = new SoundFile(this, "pf1.wav");
        b1 = new SoundFile(this, "pb1.wav");
        a1 = new SoundFile(this, "pa1.wav");
        c2 = new SoundFile(this, "pc2.wav");
        e2 = new SoundFile(this, "pe2.wav");
        f2 = new SoundFile(this, "pf2.wav");
        b2 = new SoundFile(this, "pb2.wav");
        a2 = new SoundFile(this, "pa2.wav");
        c3 = new SoundFile(this, "pc3.wav");
        rim = new SoundFile(this, "Snare.wav");
        tamb = new SoundFile(this, "HH Big.wav");
        taiko = new SoundFile(this, "Kick drum 80s mastered.wav");
      }
      hintpeak.play();
    }
  }
  if (sensorValues[2] != 0 && !downleft) {
    downleft = true;
  }

  // Right mode key: switch back to the koto sound set.
  if (sensorValues[4] == 0) {
    if (downright) {
      downright = false;
      if (peak) {
        koto = true;
        peak = false;
        c1 = new SoundFile(this, "kotoc.wav");
        e1 = new SoundFile(this, "kotoE.wav");
        f1 = new SoundFile(this, "kotoF.wav");
        b1 = new SoundFile(this, "kotoB.wav");
        a1 = new SoundFile(this, "kotoA.wav");
        c2 = new SoundFile(this, "kotoC2.wav");
        e2 = new SoundFile(this, "kotoE2.wav");
        f2 = new SoundFile(this, "kotoF2.wav");
        b2 = new SoundFile(this, "kotoB2.wav");
        a2 = new SoundFile(this, "kotoA2.wav");
        c3 = new SoundFile(this, "kotoC3.wav");
        rim = new SoundFile(this, "rim.wav");
        tamb = new SoundFile(this, "tamb.wav");
        taiko = new SoundFile(this, "taiko.wav");
        gong = new SoundFile(this, "gong.wav");
      }
      hintkoto.play();
    }
  }
  if (sensorValues[4] != 0 && !downright) {
    downright = true;
  }

  // Title key: toggle the title screen, one toggle per touch.
  if (sensorValues[0] == 0) {
    if (downtitle) {
      downtitle = false;
      title = !title;
    }
  }
  if (sensorValues[0] != 0 && !downtitle) {
    downtitle = true;
  }

  // Note and drum keys: every key uses the same latch pattern, so its
  // sample fires once per touch and a ripple is seeded at the tracked
  // mask position.
  if (sensorValues[19] == 0) {  // c1
    if (downc1) {
      downc1 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      c1.play();
    }
  }
  if (sensorValues[19] != 0 && !downc1) {
    downc1 = true;
  }

  if (sensorValues[26] == 0) {  // e1
    if (downe1) {
      downe1 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      e1.play();
    }
  }
  if (sensorValues[26] != 0 && !downe1) {
    downe1 = true;
  }

  if (sensorValues[31] == 0) {  // f1
    if (downf1) {
      downf1 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      f1.play();
    }
  }
  if (sensorValues[31] != 0 && !downf1) {
    downf1 = true;
  }

  if (sensorValues[20] == 0) {  // b1
    if (downb1) {
      downb1 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      b1.play();
    }
  }
  if (sensorValues[20] != 0 && !downb1) {
    downb1 = true;
  }

  if (sensorValues[9] == 0) {  // a1
    if (downa1) {
      downa1 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      a1.play();
    }
  }
  if (sensorValues[9] != 0 && !downa1) {
    downa1 = true;
  }

  if (sensorValues[15] == 0) {  // c3
    if (downc3) {
      downc3 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      c3.play();
    }
  }
  if (sensorValues[15] != 0 && !downc3) {
    downc3 = true;
  }

  if (sensorValues[23] == 0) {  // c2
    if (downc2) {
      downc2 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      c2.play();
    }
  }
  if (sensorValues[23] != 0 && !downc2) {
    downc2 = true;
  }

  if (sensorValues[16] == 0) {  // e2
    if (downe2) {
      downe2 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      e2.play();
    }
  }
  if (sensorValues[16] != 0 && !downe2) {
    downe2 = true;
  }

  if (sensorValues[11] == 0) {  // f2
    if (downf2) {
      downf2 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      f2.play();
    }
  }
  if (sensorValues[11] != 0 && !downf2) {
    downf2 = true;
  }

  if (sensorValues[12] == 0) {  // b2
    if (downb2) {
      downb2 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      b2.play();
    }
  }
  if (sensorValues[12] != 0 && !downb2) {
    downb2 = true;
  }

  if (sensorValues[17] == 0) {  // a2
    if (downa2) {
      downa2 = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      a2.play();
    }
  }
  if (sensorValues[17] != 0 && !downa2) {
    downa2 = true;
  }

  if (sensorValues[7] == 0) {  // rim
    if (downrim) {
      downrim = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      rim.play();
    }
  }
  if (sensorValues[7] != 0 && !downrim) {
    downrim = true;
  }

  if (sensorValues[8] == 0) {  // taiko
    if (downtaiko) {
      downtaiko = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      taiko.play();
    }
  }
  if (sensorValues[8] != 0 && !downtaiko) {
    downtaiko = true;
  }

  if (sensorValues[28] == 0) {  // tamb
    if (downtamb) {
      downtamb = false;
      for (int i = 0; i < 10; i++) {
        current[fX+i][fY+i] = random(255, 500);
      }
      tamb.play();
    }
  }
  if (sensorValues[28] != 0 && !downtamb) {
    downtamb = true;
  }
  //// water ripples ////
  loadPixels();
  for (int i = 1; i < cols - 1; i++) {
    for (int j = 1; j < rows - 1; j++) {
      // Each cell becomes the average of its four neighbours in the
      // previous frame minus its own old value, then gets damped.
      current[i][j] = (previous[i-1][j] +
                       previous[i+1][j] +
                       previous[i][j+1] +
                       previous[i][j-1]) / 2 -
                       current[i][j];
      current[i][j] = current[i][j] * dampening;
      int index = i + j * cols;
      pixels[index] = color(current[i][j]);
    }
  }
  updatePixels();
  // Swap the buffers so this frame becomes next frame's "previous".
  float[][] temp = previous;
  previous = current;
  current = temp;

  //// drawing threads ////
  ts.addThreads();
  ts.run();

  //// head tracking ////
  video.loadPixels();
  threshold = 80;

  float avgX = 0;
  float avgY = 0;
  int count = 0;

  // Walk through every camera pixel and compare it to the tracked color.
  for (int x = 0; x < video.width; x++) {
    for (int y = 0; y < video.height; y++) {
      int loc = x + y * video.width;
      color currentColor = video.pixels[loc];
      float r1 = red(currentColor);
      float g1 = green(currentColor);
      float b1 = blue(currentColor);
      float r2 = red(trackColor);
      float g2 = green(trackColor);
      float b2 = blue(trackColor);

      // Squared color distance (cheaper than taking the square root).
      float d = distSq(r1, g1, b1, r2, g2, b2);

      float hX = map(x, 0, video.width, 0, width);
      float hY = map(y, 0, video.height, 0, height);

      if (d < threshold*threshold) {
        stroke(255, 0, 0);
        strokeWeight(1);
        point(hX, hY);
        avgX += x;
        avgY += y;
        count++;
      }
    }
  }

  // Only update the tracked position when at least one pixel matched.
  if (count > 0) {
    avgX = avgX / count;
    avgY = avgY / count;

    // Map from camera coordinates to screen coordinates.
    havgX = map(avgX, 0, video.width, 0, width);
    havgY = map(avgY, 0, video.height, 0, height);

    // Label the tracked position with the current mode.
    fill(255, 0, 0, 100);
    noStroke();
    textSize(50);
    if (koto) {
      text("koto", havgX, havgY);
    }
    if (peak) {
      text("peak", havgX, havgY);
    }
  }

  if (title) {
    textSize(200);
    fill(255, 0, 0);
    text("Interferit", width*0.3, height/2);
    textSize(40);
    text("Put on the mask and the claws, and use your enchanted vessel to interfere with the world of pixels!", width*0.03, height*0.7);
  }
  if (!title) {
    fill(0);
  }
}

float distSq(float x1, float y1, float z1, float x2, float y2, float z2) {
  // Squared distance between two RGB colors treated as 3D points.
  return (x2-x1)*(x2-x1) + (y2-y1)*(y2-y1) + (z2-z1)*(z2-z1);
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----"
  // and replace PORT_INDEX above with the index number of the port.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil(10);  // 10 = '\n', linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}

void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil(10);  // 10 = '\n', linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i = 0; i < serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Puppet – Eric Shen – Eric

Project name: Puppet

CONCEPTION AND DESIGN

        Originally, our understanding of the interaction between the users and our project focused only on showing our theme of social expectation. To better explain the theme, we wanted the user to play the forces in society that make people behave in certain ways and impose social expectations on others, while the puppet represents the people who are controlled and who meet those expectations. The initial interaction we thought of, therefore, was to let users use the keyboard and mouse to pose a simulated image of the puppet in Processing. After that, the real puppet on the stage would first strike several random poses and finally settle on the pose the user set. To make the real puppet move with Arduino, we chose four servo motors to control the puppet's legs and arms. Our criteria for the motors were that things can be attached to them easily and that they can rotate to a precise angle within a certain range. We considered stepper motors at one point, but they are hard to attach things to, and each stepper motor needs its own power supply: driving several of them takes a lot of power, and a mistake in the circuit could be dangerous. For these reasons we gave up on stepper motors. Finally, to make users resonate with our somewhat sad theme, we needed a puppet that is not funny or childish, and after a long search we decided on this particular puppet.

The Puppet

When selecting the material for the stage, we first thought of laser-cutting a delicate box. Yet we also needed to fit all the components, including the Arduino, the puppet, and the servo motors, inside it, which meant the stage would be quite large; laser-cutting it would have used too much material in the fabrication room. We therefore used a carton box as both the stage and the container for the components.
We also used 3D printing to make the parts that connect each servo to the string pulling the puppet more stable. We first built them from cardboard, but those bent too easily and could not withstand the tension of the strings.

With cardboard

With 3D printing

The basic concept

FABRICATION AND PRODUCTION

        According to our original plan, one of the most important steps was to make the real puppet move in step with the digital puppet in Processing. After my partner and I both finished the Arduino and Processing code, we started to test how the data from Processing could be sent to the Arduino. At first I thought I needed to figure out how to map the data I had in Processing onto the servo angles; then I realized I could simply create four new variables in Processing that stand for the four angles and transfer them directly to Arduino to drive the servo motors, which turned out to be a success.
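
Since our Arduino code is only posted as an image below, here is a rough sketch of the Processing side of that handoff, assuming a comma-separated serial format; the variable names are placeholders rather than our actual code. On the Arduino side the four values can be parsed with Serial.parseInt() and written to the servos.

import processing.serial.*;

Serial myPort;
// Placeholder pose values; the real sketch computes these from the
// digital puppet that the user poses on screen.
int leftArm = 90, rightArm = 90, leftLeg = 90, rightLeg = 90;

void setup() {
  size(400, 400);
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 9600);  // pick your Arduino's port
}

void draw() {
  // One comma-separated line per frame, terminated by a newline.
  myPort.write(leftArm + "," + rightArm + "," + leftLeg + "," + rightLeg + "\n");
}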
After we got over this most significant technical problem, we sought advice from the fellows. They pointed out that it might be hard for users to perceive and understand our theme of social expectation through such a simple interaction. In addition, we presented almost exactly the same thing in both Arduino and Processing, which was not of much use and even a bit redundant. They asked a question we couldn't answer: why would I interact with the computer to make the puppet move, which doesn't make sense, instead of just controlling it physically? That way the interaction would be more interesting and easier to understand. After getting their suggestions, we reflected on our project. When I thought of the definition I gave to a successful interactive project in my previous research, this final project actually contradicted that very definition, because the interaction was mundane and users would already know exactly how their input influences the puppet; the project was more a display of things than an interactive piece. After due consideration, we decided to also make the curtain controlled by the Arduino so that users can interact with it, and to render the puppet in Processing in black and white and project it on the stage, so it can be read as the shadow of the real puppet.

        During the user test session, the technical basis of our project completely changed. After we explained the theme, Professor Marcela said it was intriguing and plausible, but that with our original plan we couldn't present it logically. Her first suggestion was similar to the fellows': we should not display almost exactly the same thing in both Arduino and Processing, and we should use the cross to interact with the project instead of merely the keyboard and mouse; that way the project makes more sense and the interaction is more interesting and perceivable. Besides, she put forward an interesting idea: we could use the webcam to capture the user's face and place it on the face of the puppet, showing users that they are also being controlled while trying to control others, which makes the logic of the theme clear. Another useful piece of advice from this session was to add a voice for the puppet, with some lines to make the theme clearer.
We had been transferring data from Processing to Arduino, but now we needed to switch to transferring data from Arduino to Processing. The sensor a fellow recommended was the accelerometer, and some weird things happened after I applied it to the project. When I tested two servo motors at a time against the x-axis or y-axis, they worked fine, but when I tested all four servos together, the code ran well for a while and then the Arduino Uno died and could no longer connect to the computer. This happened one day before the presentation. Professor Marcela and Tristan both came to help and examined the code and the circuit; both were fine. After we spent a long time together trying to find the problem, they suggested I either swap in another accelerometer or switch to tilt switch sensors. Even after I changed every component in the circuit it still failed to run normally, so I eventually gave up on the accelerometer and used two tilt switches to control the movement of the arms and the legs respectively. The logic is that if the left arm rises, the right arm falls, and vice versa. Though a tilt switch only gives a digital output, it brings stability to the servo rotation because the angle of each rotation is fixed.
Another difficulty was mapping the transferred data in Processing to the right range so that the movement of the digital legs and arms matched the real puppet; after a lot of testing and calculation, we made it work. A further problem was that the animation in Processing was not smooth enough: the legs and arms would jump to a position. Tristan then introduced me to a function called lerp(), which solved the problem, and I applied the same method to control the movement of the strings.
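
lerp() moves a value a fixed fraction of the remaining distance each frame, which is what smooths out the jumps. A minimal sketch of the idea, using the same 0.22 factor as the armLerp in our code below; everything else here is arbitrary:

float angle = 0;     // value currently drawn
float target = 90;   // where it should end up
float amount = 0.22; // fraction of the remaining distance covered per frame

void setup() {
  size(200, 200);
}

void draw() {
  background(255);
  // Each frame covers 22% of what is left, so the motion starts fast
  // and eases out instead of snapping to the target.
  angle = lerp(angle, target, amount);
  translate(width/2, height/2);
  rotate(radians(angle));
  rect(0, -5, 80, 10);
}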

The outlook

The instruction

The explanation

CONCLUSIONS:

        The goal of our project is to show users the theme of social expectation. In society there is a phenomenon where people try to impose their social expectations on others, yet while they give out these expectations, they themselves are also being controlled and are meeting others' expectations of them. In my preparatory research and analysis, my definition of a successful interactive project was that the interaction between users and the project should be straightforward, so that users can tell how their actions affect the project; that the project should offer several forms of interaction instead of merely one; and that the project should have a specific meaning. From my perspective, our project mostly aligns with that definition. The audience can tell the logic behind the movement of the puppet from tilting the cross, and the meaning of our project, social expectation, is clear and has an explainable logic. Where it falls short is that the project only contains one form of interaction, tilting the cross, if we don't count the process of taking a selfie. The users' interaction is to hold and tilt the cross, trying to figure out how it controls the puppet, while listening to the background music and the monologue of the puppet. The only thing we didn't expect is that the audience would neglect our projection on the wall because they focused so much on the puppet inside the box. If we had more time, we would make the instructions clearer. We would also probably make the whole interaction longer, so users have time to reflect on what is going on and figure out the theme by themselves, and we would project the Processing animation inside the stage, so the audience can see what is happening in Processing and so Arduino and Processing are integrated better. From building this project I learned that things won't always go as you expect, just like what happened with the accelerometer; what you can do is be patient and either find what is going wrong or find an alternative.

The Whole Process

Code for Arduino


Code for Processing 

import processing.sound.*;
import processing.video.*;
import processing.serial.*;

SoundFile sound;   // the puppet's voice
SoundFile sound1;  // background music
Capture cam;
PImage cutout = new PImage(160, 190);

String myString = null;
Serial myPort;
int NUM_OF_VALUES = 2;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;      /** this array stores values from Arduino **/

PImage background;
PImage body;
PImage arml;
PImage armr;
PImage stringlr;
PImage stringar;
PImage stringal;
PImage legl;
PImage stringll;
PImage legr;
float yal=100;
float yll=0;
float yar=0;
float ylr=0;
float leftangle=PI/4;
float rightangle=-PI/4;
float leftleg = 570;
float rightleg = 570;
float armLerp = 0.22;
float legLerp = 0.22;
float pointleftx =-110;
float pointlefty =148;
PImage body2;
boolean playSound = true;
void setup() {
  size(850, 920);
  setupSerial();
  cam = new Capture(this, 640, 480);
  cam.start(); 
  background = loadImage("background.png");
  body=loadImage("body.png");
  arml=loadImage("arml.png");
  stringal=loadImage("stringal.png");
  armr=loadImage("armr.png");
  legl=loadImage("legl.png");
  stringll=loadImage("stringll.png");
  legr=loadImage("legr.png");
  stringar=loadImage("stringar.png");
  stringlr=loadImage("stringlr.png");
  body2 =loadImage("body2.png");
  sound = new SoundFile(this, "voice.mp3");
  sound1 = new SoundFile(this, "bgm.mp3");
  sound1.play();
  sound1.amp(0.3);
 
  
}


void draw() {
  updateSerial();
  printArray(sensorValues);
  if (millis()<15000) {
    if (cam.available()) { 
      cam.read();
    } 
    imageMode(CENTER);

    int xOffset = 220;
    int yOffset = 40;

    for (int x=0; x<cutout.width; x++) {
      for (int y=0; y<cutout.height; y++) {
        color c = cam.get(x+xOffset, y+yOffset);
        cutout.set(x, y, c);
      }
    }

    background(0);
    image(cutout, width/2, height/2);

    fill(255);
    textSize(30);
    textAlign(CENTER);
    text("Place your face in the square", width/2, height-100);
    text(15 - (millis()/1000), width/2, height-50);
  } else { 
    if (!sound.isPlaying()) {
      // restart the voice line whenever it finishes playing
      sound.play();
    }
    imageMode(CORNER);
    image(background, 0, 0, width, height);
    image(legl, 325, leftleg, 140, 280);  
    image(legr, 435, rightleg, 85, 270);
    image(body, 0, 0, width, height);
    if (millis()<43000) {
      image(body, 0, 0, width, height);
    } else {
      image(cutout, 355, 95);
      image(body2, 0, 0, width, height);
 
      sound.amp(0);
    }
    arml();
    armr();
    //stringarmleft();
    image(stringal, 255, yal, 30, 470);
    image(stringll, 350, yll, 40, 600);
    image(stringar, 605, yar, 30, 475);
    image(stringlr, 475, ylr, 40, 600);

    int a = sensorValues[0];  // tilt value for the arms
    int b = sensorValues[1];  // tilt value for the legs
    float targetleftangle = PI/4 + radians(a/2);
    float targetrightangle = -PI/4 + radians(a/2);
    float targetleftleg = 570 + b*1.6;
    float targetrightleg = 570 - b*1.6;

    leftangle = lerp(leftangle, targetleftangle, armLerp);
    rightangle = lerp(rightangle, targetrightangle, armLerp);
    leftleg = lerp(leftleg, targetleftleg, legLerp);
    rightleg = lerp(rightleg, targetrightleg, legLerp);

    // The string endpoints follow the limbs they pull.
    float targetpointr = -100 - a*1.1;
    float targetpointl = -120 + a*1.1;
    float targetpointr1 = -50 + b*1.3;
    float targetpointr2 = -50 - b*1.3;
    yal = lerp(yal, targetpointr, armLerp);
    yar = lerp(yar, targetpointl, armLerp);
    yll = lerp(yll, targetpointr1, legLerp);
    ylr = lerp(ylr, targetpointr2, legLerp);

    //delay(10);
  }
}

void arml() {
  pushMatrix();
  translate(375, 342);
  rotate(leftangle);
  image(arml, -145, -42, 190, 230);
  fill(255, 0, 0);
  noStroke();

  popMatrix();
}



void armr() {
  //fill(0);
  //ellipse(500,345,10,10);
  pushMatrix();
  translate(490, 345);
  rotate(rightangle);
  //rotate(millis()*PI/800);
  image(armr, -18, -30, 190, 200); 
  popMatrix();
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[ 11 ], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----" 
  // and replace PORT_INDEX above with the index number of the port.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Final Project – clover

My project: Cooperate! Partner!

Concept: I got the idea for this project from a conversation with my friend A. She is also an IMA student, and she was working on another project; she looked pale and complained about her partner. This was not the first time I had heard complaints about a partner. I had the same feeling when I did my first project in Communication Lab: it was a project meant to be done by two people, but only one person finished the whole thing. It made me feel worse to see other teams working happily and enjoying the process, splitting the work with both people engaged. It was not only friend A and me: when I carried out further research, I found that many IMA students have had bad experiences with group projects. Around the midterm and the final there are always quarrels between partners and complaints about partners. This made me think: could I use a game to improve the group-work experience and let people enjoy working with their teammates?

Then I began my research on what leads to the bad experience. I interviewed some students in the studio. Friend B: "My partner never listened to me. This makes me really angry. We could have done better if he had listened and adopted some of my suggestions." Friend C: "I always do most of the work; my partner does nothing. I feel it's unfair." Friend D: "I didn't participate as much as my partner expected because I really didn't know she needed me; I thought she was doing quite well on her own." To sum up, I found the following causes of bad teamwork experiences. First is the free-rider problem: only one person does the work while the other makes no effort, which makes the person doing all the work feel it's unfair. Second is the communication problem: partners lack communication and cooperation, each sticking to their own ideas. The first bad result of poor communication is that the project is often unsatisfying. The second is that people feel they are not understood by their partners, which often leads to quarrels and emotional breakdowns. The third is that if you don't tell your partner you need help, they may not know, and so they cannot offer it.

So I planned to design a game that forces people to engage and cooperate with their teammates while still enjoying the play. Idea 1 was to use an accelerometer to let user 1 rotate one piece of a puzzle and user 2 rotate another, and, by finding the right angles, bring the two pieces together to finish the puzzle. I planned to use a distance sensor to let each user move their piece up and down. By letting each player rotate their piece to the right angle and control its height, I make each of them engage in the game, which corresponds to participating and doing your own part of the job in teamwork. By letting them adjust height and angle together to form the puzzle, I make them adapt to their teammate's changes and cooperate, which corresponds to communication and cooperation in teamwork. I also made the players use both a hand and a leg, forcing them to lose balance so that they need to reach for their partner for help, which again represents the communication in a project.

(idea 1)

(idea 2)

Then I did user testing. I found that users could still keep their balance quite well while rotating with their hands and raising their legs at the same time, and they said the game was too easy and should be more difficult. In order to make them lose balance, enhancing the cooperation and communication, I changed to idea 2. This time, the users bend or straighten a knee to move their square left and right, using a flex sensor, and I kept the distance sensor so each user also has to control the height of the square at the same time.

To reinforce the idea that both players need to engage and control their own part, I made the squares harder to control. By making x and y drift by different amounts (x1 -= 30; x1 += 10; y1 += 23; y1 -= 10;), I force the user to hold a certain position for a longer time, which is physically harder, linking to the point that you need to keep good control of your own part when working on a project. This difficulty in keeping balance also forces players to reach for their partner, enhancing communication. I also set a threshold on the sensor value: only above this level does the square go higher; if you don't reach it, the square will not go up even if you raise your knee. I set this because, to make a good project, you need to pay effort and reach a certain level. I am transforming mental work and mental stress into physical movement and physical stress. I hope that by moving their bodies, players can relieve mental stress through exercise, and at the same time each partner can literally see the other stretching and really needing help and participation, letting both players feel the importance of teamwork and cooperation. If you don't participate, cooperate, and climb to the top of the canvas, your partner has to spend a lot of effort to reach you and complete the game (the project). Mental stress may be hard to see, but a sweating body is really obvious; I hope players sense that if they don't participate and cooperate, their partner has to keep raising a leg, which is genuinely painful.

I also didn't want to give only one absolute solution to the game, because every person and every team is different: their ways of doing a project differ, and so do their solutions. Different teams can find their own comfortable, easy approach to gathering the squares together. In fact, during the user test there turned out to be an easiest, energy-saving solution: meet at the bottom of the canvas, where nobody needs to raise a leg that high.

(the easiest solution)

I kept this, linking back to the range setting above. It means that if you find the easiest way of doing a project, you don't have to spend as much energy to reach the same goal of gathering together. But players need to explore to find this solution, just as in a real project: there are many ways to complete the work, and you either explore to find the easiest one or pay a lot more effort to reach the same goal.
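
Distilled from the full code at the end of this post, the drift-plus-threshold mechanic for one square looks roughly like the sketch below; mouseY stands in for the distance-sensor reading, and the window numbers are the ones used for the blue square.

float y2 = 300;  // the square's vertical position

void setup() {
  size(650, 650);
}

void draw() {
  background(255);
  // mouseY stands in for the distance sensor (roughly 40-216 in the project).
  int sensorValue = int(map(mouseY, 0, height, 40, 216));
  // The square only moves one way while the reading stays inside the
  // "magic range"; otherwise it drifts back, so the player has to hold
  // the pose to make progress.
  if (sensorValue < 120 && sensorValue > 90) {
    y2 += 23;
  }
  if (sensorValue < 90) {
    y2 -= 10;
  }
  y2 = constrain(y2, 0, height - 100);
  fill(0, 0, 255);
  rect(300, y2, 100, 100);
}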

The sensors I used and the code

(the flex sensor on the knee, controlling the square's left-right movement)

(the distance sensor sensing the distance to the ground)

(taped up to get accurate readings)

User test: too many cables, easy to trip over, and ugly. So I taped the cables together to keep users from tripping and falling, while also making it look better.

User test: they said the Arduino and the cables were ugly, so I later cut a box to hide the Arduino and breadboard and make it look better.

(this equipment makes it easier for the user to wear the sensor on the knee)

The layout: I set the background to pure white with no extra design, because when I prototyped backgrounds with cool effects, users said they were confusing and made it hard to focus on the movement of the squares. I also made the squares red and blue, which are easy to recognize. By adding the sentence "come together", the game works like a word puzzle, letting users know they need to come together to form the phrase. The blue square, controlled by the left player, starts in the left corner so the player clearly understands which square is his; the red square starts in the bottom right corner so the right player knows he is playing the red one. Also, following user-test feedback that the earlier version, where both squares started in the top left corner, was too easy, I increased the distance between them so the players have to move more to gather together.

(before: start here)

(now: start here)

Also, according to user-test feedback, players suggested I set up rules so they know the knee controls left and right, which square is whose, and that you may only touch your partner to keep balance. So I added an instruction before the game. They also asked about the meaning of the game and suggested I add an explanation at the end. I made both of them colorful to add to the visual feeling.

I also added a piece of relaxing music when the game is finished, because: 1. some users said they needed music to relax after the difficult, energy-consuming body stretching; 2. they needed a clear sign to tell them they had finished the game and could cheer.

(so I added this cheerful, happy music at the end, along with the explanation)

User test: some users wore thick trousers that affected the distance sensor. I 3D-printed this pole and added it to the leg equipment so the distance sensor reads the distance accurately.

Also, after the user test in class, I changed the text screens into videos to make them more fun and more connected with the game.

The introduction video:

At the beginning, I used black and white to make the game look mysterious. I also hid my face to raise players' interest in playing the game and carry the interaction into the game itself (shot with Photo Booth; cited in the references).

The link for the video: https://drive.google.com/open?id=18HRZsgneyYCVyAYbskO4H3m__SwozWQc

(now)

(before)

The ending video:

At the end, I made the video colorful to cheer players up and help them relax. I hope players feel happy that the world suddenly becomes colorful when they finish the game, leaving them satisfied with their completion of it.

The link to the video: https://drive.google.com/open?id=1P5Yadwg5hcV5hcGjtE5GGbe-GT3kq4PJ

(now)

(before)

During the user test, as shown in the pictures, I changed some of the design to make the sensors work better, to let users know how to play without my explanation, and to convey the meaning behind the project. But there were also problems I didn't solve. Take the users' words, for example. Leah and Eric (together): "I like this game because the physical interaction is funny and the concept is good. But I wonder, could the game be more difficult and have more levels?" Ryan: "It is a fun game, but it's really difficult to play." The first failure is that I couldn't find a difficulty level that lets every user find the game fun. During the user test, teams who had worked together before completed the game more easily, but players who had just formed a pair found it difficult and energy-consuming. Considering the concept, I want everyone to notice the importance of group work without spending that much energy; however, if I lower the difficulty, the game becomes less playful for other teams. The balance between conveying the concept and keeping the game fun and playful is a part I didn't do well. Second, my project can't really change the teamwork situation. By playing the game, players have fun and may become aware of the problems through the concept, but the game may not help much when they actually work on a project. It raises awareness but doesn't solve the problem.

I hope that by playing this game, players become better teammates and have better experiences in group work. I hope to let those who have no group-work experience learn what group work is and how to do it well, and to achieve this through a fun game. Even though it can't solve the problem, I still hope students come to feel that group work matters and that being a good partner matters. My definition of interaction is to let players willingly and enjoyably keep responding to the different stages of my project, making the process continuous. The interaction in my project consists of two parts: the interaction between the players, and the interaction between the players and the project. The players interact with each other to finish the game. But in version 1 of my project the interaction was not that good: I used plain text for the instruction and the ending, the process was not continuous, and the boring words were not connected with the players, so they were unwilling to interact further. To strengthen the interaction between my project and the players, I made the videos to create a mysterious atmosphere, so that players are willing to participate and give responses back and forth. If I had more time, I would break the project into different levels. For example, level 1 would let each user learn to control their own square, which not only makes the game easier for some players but also reinforces the idea that you need to control and do your own part of the work. Level 2 would then let them come together, reinforcing cooperation and communication. I could also vary the levels by changing the shapes of the squares or adding an accelerometer to make the game harder, so that all kinds of players can enjoy it. I still hope that through this game there will be fewer quarrels between partners and that every student can enjoy teamwork. I also learned that it is difficult to make a fun game and convey a serious concept at the same time. Next time I make a project, I need to think about how it responds to the player's different interactions: not only making the players enjoy playing, but also designing good responses that connect back to the serious concept.

References:

Ben Fry and Casey Reas (2001). Processing example: Bounce [Source code]. https://processing.org/examples/bounce.html

Ben Fry and Casey Reas (2001). Processing example: Collision [Source code]. https://processing.org/examples/bounce.html

Ben Fry and Casey Reas (2001). Processing example: Constrain [Source code]. https://processing.org/examples/bounce.html

“Photo Booth Apps.” Simple Booth, 2019, www.simplebooth.com/products/apps.

Code
// IMA NYU Shanghai
// Interaction Lab
// For receiving multiple values from Arduino to Processing

/*
 * Based on the readStringUntil() example by Tom Igoe
 * https://processing.org/reference/libraries/serial/Serial_readStringUntil_.html
 */

import processing.serial.*;
import processing.sound.*;
import processing.video.*;
Movie myMovie1;
Movie myMovie2;
String myString = null;
Serial myPort;


int NUM_OF_VALUES = 4;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;   

int x = 0;
int y = 0;
PImage img1;
PImage img2;
float x1=0;
float y1=650;
float x2=650;
float y2=300;
SoundFile sound;

float easing = 0.0000000000001;  // effectively zero: the squares actually move through the keyed steps in draw(), not this easing

boolean stage1 = false;
boolean stage2 = false; 
boolean stage3 = true;
boolean stage4 = false;
boolean soundIsPlaying=false;

long gameStartTime = 0;

void setup() {
  size(650, 650);

  setupSerial();
  img1 = loadImage("red.png");
  img2 = loadImage("blue.png");
  background(255);
  sound = new SoundFile(this, "cello-f1.aif");
  myMovie1 = new Movie(this, "introduction.mov");
  myMovie1.play();
  myMovie2 = new Movie(this, "ending.mov");
  myMovie2.play();
}


void draw() {
  updateSerial();
  printArray(sensorValues);

  background(255);
  //rect(x, y, 25,25);
  //ellipse(25,80,50,50);
  //rect(100, 100, 100, 100);
  //x1= map(sensorValues[1],74,102,0,width);

  float targetX2 = constrain(map(sensorValues[0], 97, 110, 0, width), 0, width-100);
  float dX2 = targetX2 - x2;
  x2 += dX2 * easing;
  x2 = constrain(x2, 0, width-100);

  float targetX1 = constrain(map(sensorValues[1], 0, 50, 0, width), 0, width-100);
  float dX1 = targetX1 - x1;
  x1 += dX1 * easing;
  x1 = constrain(x1, 0, width-100);

  
  float targetY2 = constrain(map(sensorValues[2], 40, 216, 0, height), 0, height-100);
  float dY2 = targetY2 - y2;
  y2 += dY2 * easing;
  y2 = constrain(y2, 0, height-100);
  //println(sensorValues[2]);
  //println(x2);
  //println(sensorValues[0]);
  float targetY1 = constrain(map(sensorValues[2], 40, 216, 0, height), 0, height-100);
  float dY1 = targetY1 - y1;
  y1 += dY1 * easing;
  y1 = constrain(y1, 0, height-100);
  
  if (stage3 == true) {
    // The text rules that used to be drawn here were replaced by the
    // introduction video.
    if (myMovie1.available()) {
      myMovie1.read();
    }
    image(myMovie1, 0, 0, width, height);  // show the intro video
    if (mousePressed) {
      stage1 = true;
      stage3 = false;
      gameStartTime = millis();
    }
  }

  if (stage1 == true) {
    image(img1, x1, y1);
    image(img2, x2, y2);

    if (checkcollision(x1, y1, x2, y2, 30) && millis() - gameStartTime > 15*1000) {
      
      stage1 = false;
      stage2 = true;
    }
  }

  if (stage2 == true) {
    if (soundIsPlaying == false) {
      sound.play();
      soundIsPlaying = true;
    }
    background(255);
    // The text explanation that used to be drawn here was replaced by
    // the ending video.
    if (myMovie2.available()) {
      myMovie2.read();
    }
    image(myMovie2, 0, 0, width, height);  // show the ending video
  }



  // Keyed movement: each sensor reading pushes its square one way while
  // inside a set range and the other way outside it.
  if (sensorValues[0] > 40) {
    x2 += 10;
  }
  if (sensorValues[0] < 40) {
    x2 -= 30;
  }
  if (sensorValues[1] < 5) {
    x1 -= 30;
  }
  if (sensorValues[1] > 5) {
    x1 += 10;
  }
  if (sensorValues[2] < 120 && sensorValues[2] > 90) {
    y2 += 23;
  }
  if (sensorValues[2] < 90) {
    y2 -= 10;
  }
  if (sensorValues[3] < 150) {
    y1 -= 10;
  }
  if (sensorValues[3] > 150) {
    y1 += 10;
  }
}

boolean checkcollision(float x1, float y1, float x2, float y2, float d) {
  // The squares are 100 px wide, so the game ends when one square sits
  // exactly beside the other at the same height. The keyed steps above
  // are whole numbers, so this exact comparison is reachable.
  // (The tolerance parameter d is unused.)
  return x1 - x2 == 100 && y2 - y1 == 0;
}

// Leftover from idea 1 (accelerometer angles); no longer called.
boolean checkangle(float angle1, float angle2) {
  return angle1 == 20 && angle2 == 30;
}



void setupSerial() {
  //printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[ 1 ], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----" 
  // and replace PORT_INDEX above with the index number of the port.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Puppet – Leah Bian – Eric

Project name: Puppet

Conception and Design

The final goal of our project was decided at the beginning: to show the theme of "social expectation". The original thought was to let the user play the role of the various forces in society that make us behave in certain ways, with the puppet as the representation of ourselves, who are being controlled. Therefore, the original plan was to let the user set a pose for the puppet in Processing; the data of the pose would then be sent to the Arduino part, and the real puppet on the miniature stage would first strike several random poses and finally stay at the pose the user set. In the first week of the process, we started to prepare the materials with this plan in mind. The most important part of our project is the puppet. We searched for one that is not so funny or childish, to keep our theme distinctive, and finally decided to buy a vintage puppet with thirty years of history. We expected the final stage to be quite large; if we used laser cutting to build it, the materials might be insufficient, so we decided to use cartons instead. To add some dramatic atmosphere to the stage, we bought some velvet, expecting to stick it to the stage surface, and we bought a mini tracker lamp to attach to the top of the stage. For the Arduino part, we decided to use four servos connected with strings to rotate the arms and legs of the puppet. To make the connections more stable, we 3D-printed some components and stuck them to the servos with hot glue. We also used some red velvet to make the stage curtain; since sewing it requires professional skills, we sent the velvet to a tailor shop and got a satisfying result.


Fabrication and Production

To create the image of the puppet in Processing, I first tried to draw a cartoon version of the puppet in code, but I gave up since it was too complicated and the result might still not be satisfying given the limitations of drawing by code. Instead, I drew the image in a digital painting app named Procreate. I could draw different parts of the puppet's body on different layers of the canvas, so we could load the layers into Processing as separate images and rotate them individually. We first chose keyboard and mouse interaction to let the user control the movement of the digital puppet, and we finished that code. However, when we shared our thoughts with the IMA fellows, they pointed out that it would be hard for users to see our theme of social expectation in such a simple process. Besides, it may not make sense to control the puppet through Processing instead of controlling it directly, and since the digital puppet and the physical puppet are presented to the user at the same time, they look a bit competitive. From our own perspective, we also felt that the interaction in our project was weak and the theme seemed vague. Therefore, we modified the plan: we would make the stage curtain automatic, using a motor to wind the string connected to the front of the curtain and open it. Besides, I changed the color of the image in Processing to a black-and-white tone, so that, cast on the wall by the projector, it would look like a huge shadow hanging over the real puppet.
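
In Processing terms, the layered-image trick works like this: each body part is exported as its own transparent PNG and rotated about its joint between pushMatrix() and popMatrix(). This is only a small sketch of the idea, with a placeholder file name and pivot, not our actual code.

PImage arm;
float angle = 0;

void setup() {
  size(400, 400);
  arm = loadImage("arm.png");  // placeholder for one exported layer
}

void draw() {
  background(255);
  angle += 0.01;
  pushMatrix();
  translate(200, 150);   // move the origin to the shoulder joint
  rotate(angle);         // rotate the whole layer about that joint
  image(arm, -20, -10);  // offset so the joint sits at the origin
  popMatrix();
}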

However, our plan changed again after user testing. Professor Marcela also pointed out that our theme seemed vague to her, and we shared our worries with her. She gave us several valuable suggestions. She suggested that we use the cross, which is part of the real puppet, to let the user control the movement of the puppet directly. She also suggested that we use a webcam to capture the user’s face and place it on the head of the digital puppet, so the logic would be clear that the user is actually also being controlled. In addition, we received a suggestion to add a voice for the puppet, to let it say something. These suggestions were invaluable to us, and we started to change almost everything about our project after the user test.

First of all, we asked the fellows which sensor we could use to control the movement of the puppet directly. They suggested the accelerometer: the angle at which the puppet’s arms and legs rise and fall would change with the angle at which the user leans the cross. In addition, since it is hard to capture the user’s face while they are moving, Professor Eric suggested that we take a picture of the user at the beginning. He helped us with the coding, and we made it feel like the process of taking a selfie. I wrote a script and recorded my voice to be the puppet’s voice. The lines include, “What should I do?”, “What if they will be disappointed at me?”, “What if I cannot fit in?”. The last sentence is, “Hey friend, do you know, you are just like me.” After this last sentence, an image of the user’s face on the head of the digital puppet is shown, so that we could convey the logic that while we are controlling others, we are also being controlled. There were some problems with the Arduino part, though. The night before the presentation, we were testing the accelerometer, hoping that everything would work well, but we could not even find the port connected to the computer. Besides, in our previous testing we had found the accelerometer quite unstable and sensitive, which made it hard to control the movement of the real puppet. Professor Marcela suggested that we replace the accelerometer with tilt sensors, which are more stable. We took this advice and changed the code again. A tilt sensor functions like a button: when it is tilted, a certain behavior is triggered. In our case, we used two tilt sensors to control the movement of the arms and the legs respectively, and the logic is mirrored: as the left arm rises, the right arm falls, and vice versa. Since a tilt sensor only reads on or off, it was also easier for us to send the data to Processing. The digital image in Processing changes with the real puppet, following its poses. After we got everything done, I made a poster with the instructions and an explanation of our project’s theme.

Conclusions

Our project aligns well with my definition of interaction. In my preparatory research and analysis, I wrote my personal definition of a successful interactive experience: the process of interaction should be clear to the users, so that they get a basic sense of what to do to interact with the device; various means of expression can be involved, such as visuals and audio; and the experience can be thought-provoking, reflecting facts of real life. My partner and I had created a small game-like device for our midterm project, so this time we decided to create an artistic one as a challenge. Our project is aimed at those who intentionally or compulsively cater to the social roles imposed on them by forces in society. We showed the logic that while we are controlling others, we are also being controlled. In fact, it is hard to convey a theme in an interactive art installation, and it was hard for us to find the delicate balance where we could trigger the user’s thoughts without making everything too heavy. The visual effect of our project is satisfying, and we also used music and voice to add more means of expression. The user’s interaction with our project is direct and clear: instead of pressing cold buttons on a keyboard, they hold the cross, listen to the monologue of the puppet, and thus build an invisible bond of empathy with the real puppet. After the final presentation, we received several more valuable suggestions. If we had more time, we would probably try to make the whole interactive process longer, with more means of interaction, so that users would have more time to think deeply about the theme. There are many ways to show our theme, but the results could be entirely different; we are given possibilities but may also get lost. The most important thing I have learned from this experience is to always be clear about what I am trying to convey and what the goals are from the beginning. Without a clear theme in mind, we are likely to lose direction, and the final work could become a mixture of disordered ideas.

Video of the whole interactive experience:

Arduino Code:

#include <Servo.h>

// create servo objects to control the four servos
Servo myservo1;  // one arm servo
Servo myservo2;  // the other arm servo (mirrored)
Servo myservo3;  // one leg servo
Servo myservo4;  // the other leg servo (mirrored)

int angleArm = 0;
int angleLeg = 0;
const int tiltPin1 = 2;  // tilt sensor for the arms
const int tiltPin2 = 4;  // tilt sensor for the legs
int tiltState1 = 0;
int tiltState2 = 0;

void setup() {
  Serial.begin(9600);
  myservo1.attach(3);
  myservo2.attach(5);
  myservo3.attach(6);
  myservo4.attach(9);
  pinMode(tiltPin1, INPUT);
  pinMode(tiltPin2, INPUT);
}

void loop() {
  // reasonable delay
  delay(250);

  tiltState1 = digitalRead(tiltPin1);
  tiltState2 = digitalRead(tiltPin2);

  // each tilt sensor works like a switch: leaning the cross one way
  // raises a limb, leaning it the other way lowers it
  if (tiltState1 == HIGH) {
    angleArm = 90;
  } else {
    angleArm = -90;
  }

  if (tiltState2 == HIGH) {
    angleLeg = 30;
  } else {
    angleLeg = -30;
  }

  // Serial.println(angleArm);
  // Serial.println(angleLeg);

  // mirrored motion: as one arm (or leg) rises, the other falls
  myservo1.write(90 + angleArm);
  myservo2.write(90 - angleArm);
  myservo3.write(90 + angleLeg);
  myservo4.write(90 - angleLeg);

  Serial.print(angleArm);
  Serial.print(","); // put comma between sensor values
  Serial.print(angleLeg);
  Serial.println(); // add linefeed after sending the last sensor value
  delay(100);
}


Processing Code:

import processing.sound.*;
import processing.video.*;
import processing.serial.*;

SoundFile sound;   // the puppet's recorded voice
SoundFile sound1;  // background music
Capture cam;
PImage cutout = new PImage(160, 190);  // the user's face, cut out of the webcam frame
String myString = null;
Serial myPort;
int NUM_OF_VALUES = 2;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;      /** this array stores values from Arduino **/

PImage background;
PImage body;
PImage body2;
PImage arml;
PImage armr;
PImage legl;
PImage legr;
PImage stringal;
PImage stringar;
PImage stringll;
PImage stringlr;
float yal = 100;  // y positions of the four string images
float yar = 0;
float yll = 0;
float ylr = 0;
float leftangle = PI/4;    // current rotation angles of the arms
float rightangle = -PI/4;
float leftleg = 570;       // current y positions of the legs
float rightleg = 570;
float armLerp = 0.22;      // easing factors for smooth movement
float legLerp = 0.22;
float pointleftx = -110;
float pointlefty = 148;
boolean playSound = true;
void setup() {
  size(850, 920);
  setupSerial();
  cam = new Capture(this, 640, 480);
  cam.start(); 
  background = loadImage("background.png");
  body=loadImage("body.png");
  arml=loadImage("arml.png");
  stringal=loadImage("stringal.png");
  armr=loadImage("armr.png");
  legl=loadImage("legl.png");
  stringll=loadImage("stringll.png");
  legr=loadImage("legr.png");
  stringar=loadImage("stringar.png");
  stringlr=loadImage("stringlr.png");
  body2 =loadImage("body2.png");
  sound = new SoundFile(this, "voice.mp3");
  sound1 = new SoundFile(this, "bgm.mp3");
  sound1.play();
  sound1.amp(0.3);  
}

void draw() {
  updateSerial();
  printArray(sensorValues);
  if (millis()<15000) {
    if (cam.available()) { 
      cam.read();
    } 
    imageMode(CENTER);

    int xOffset = 220;
    int yOffset = 40;

    for (int x=0; x<cutout.width; x++) {
      for (int y=0; y<cutout.height; y++) {
        color c = cam.get(x+xOffset, y+yOffset);
        cutout.set(x, y, c);
      }
    }
    background(0);
    image(cutout, width/2, height/2);
    fill(255);
    textSize(30);
    textAlign(CENTER);
    text("Place your face in the square", width/2, height-100);
    text(15 - (millis()/1000), width/2, height-50);
  } else { 
    if (!sound.isPlaying()) {
      // play the sound
      sound.play();
    } 
    imageMode(CORNER);
    image(background, 0, 0, width, height);
    image(legl, 325, leftleg, 140, 280);  
    image(legr, 435, rightleg, 85, 270);
    image(body, 0, 0, width, height);
    if (millis()<43000) {
      image(body, 0, 0, width, height);
    } else {
      image(cutout, 355, 95);
      image(body2, 0, 0, width, height);
      sound.amp(0);  // mute the voice once the user's face is revealed
    }
    arml();
    armr();
    //stringarmleft();
    image(stringal, 255, yal, 30, 470);
    image(stringll, 350, yll, 40, 600);
    image(stringar, 605, yar, 30, 475);
    image(stringlr, 475, ylr, 40, 600);
    // map the two Arduino values onto target angles and positions,
    // then ease toward them with lerp() so the movement looks smooth
    int a = sensorValues[0];
    int b = sensorValues[1];
    float targetleftangle = PI/4 + radians(a/2);
    float targetrightangle = -PI/4 + radians(a/2);
    float targetleftleg = 570 + b*1.6;
    float targetrightleg = 570 - b*1.6;
    leftangle = lerp(leftangle, targetleftangle, armLerp);
    rightangle = lerp(rightangle, targetrightangle, armLerp);
    leftleg = lerp(leftleg, targetleftleg, legLerp);
    rightleg = lerp(rightleg, targetrightleg, legLerp);

    // the string images follow the limbs they are attached to
    float targetpointr = -100 - a*1.1;
    float targetpointl = -120 + a*1.1;
    float targetpointr1 = -50 + b*1.3;
    float targetpointr2 = -50 - b*1.3;
    yal = lerp(yal, targetpointr, armLerp);
    yar = lerp(yar, targetpointl, armLerp);
    yll = lerp(yll, targetpointr1, legLerp);
    ylr = lerp(ylr, targetpointr2, legLerp);
  }
}

void arml() {
  pushMatrix();
  translate(375, 342);
  rotate(leftangle);
  image(arml, -145, -42, 190, 230);
  fill(255, 0, 0);
  noStroke();
  popMatrix();
}



void armr() {
  pushMatrix();
  translate(490, 345);
  rotate(rightangle);
  image(armr, -18, -30, 190, 200); 
  popMatrix();
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[ 11 ], 9600);
  myPort.clear();
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}

void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Floating keys – Fay Li – Eric

CONCEPTION 

Our project was like a physical version of Piano Tiles or Guitar Hero. There were four boxes acting as four piano keys, each with a sensor inside. The player follows the moving tiles on the screen and puts a hand into the corresponding box to reach a high score and complete the Jingle Bells melody. Compared to simply tapping a screen, we wanted players to feel more engaged with the game through the physical process of interaction. A minimal Processing sketch of the core mechanic is shown below.

DESIGN, FABRICATION, AND PRODUCTION:

Our original design for the physical part was to put four small boxes inside one big box, making all the circuits and keys an integral whole. But this was rejected by the TA the first time we went to the fab lab, because it would use more than ten boards of wood, which was a lot. We then decided to make only four small boxes on a big platform (built from two boards), which would take six boards altogether, and the idea was rejected again, as six was still too many. Therefore, we had to give up the idea of using wood boards for the platform. We found a box in the cardboard room, cut it to the proper size, and glued the boxes onto it, which worked well even though it looked a bit weird.

We left a small hole at the back of each box so the wires could come out in a more organized way. I also added some decorations to the tops of the boxes to show the theme of music and add some Christmas vibes to the project.

We also ran into some problems with the Arduino when building the circuit, even though it was supposed to be the easiest part. When we connected the Arduino to the computer, it didn’t show up in the port list, the green light turned off, and the board became extremely hot. We later found out that we had misconnected some of the wires. The same mistake occurred several times throughout our production, even right before the presentation, and each time it took us quite a while to track down.

The coding was the hardest part of the project. We were able to complete the code with help from professors, TAs, and many friends; special thanks to my partner James, who put a lot of effort into it. The project wasn’t fully functioning during the user test because of the same wiring mistake mentioned above, which we hadn’t figured out in time, so we used four keys on the keyboard in place of the sensors to test the code on its own. We didn’t get many useful suggestions during the session, but Eric helped change the code to make it work better.

Later we added more details to the game interface, such as a title screen, a time bar, and a highest-score record. The game could be restarted as well; a small state machine is enough to handle that, as in the sketch below.

CONCLUSIONS:

Though we still ran into some problems on the day of our presentation, the project overall interacted with the players well. They interacted with it by placing their hands into the right boxes to complete the melody, which aligned with my definition of interaction as a reciprocal, cyclical process between the user and the machine. It also reached the goal of making users feel more engaged with the game through physical interaction. One problem was that once the melody had been played all the way through, the array reached its end and the game would crash (Eric helped fix it after the presentation). Given more time, we could improve the project by adopting the suggestions we received during the presentation, such as making it a two-player game, making a vertical version so that the keys can really “float”, or adding more songs. The project experience was very interesting and meaningful; as a team we overcame many difficulties and ended up with a good outcome. Hopefully, the attempt to add more interaction and make a physical version of a game can bring more fun and engagement to the players.