Recitation 9: Media Controller by Haiyan Zhang

In this recitation, work individually to create a Processing sketch that controls media (images or video) by manipulating that media’s attributes using a physical controller made with Arduino. Reflect on this week’s classes to guide you in your coding process, and think about how you can incorporate interactivity and computation into this week’s exercise.

I chose a screenshot from one of my favorite documentaries, Life is Fruity. (image attached below)

import processing.serial.*;
Serial myPort;
int valueFromArduino;

PImage pimage;

void setup() {
  size(1440, 824);
  background(0);
  pimage = loadImage("pimage.png");
  printArray(Serial.list());
  // this prints out the list of all available serial ports on your computer.

  myPort = new Serial(this, Serial.list()[ 5 ], 9600);
}


void draw() {
  image(pimage, 0, 0, width, height);
  // read the value from the Arduino
  while (myPort.available() > 0) {
    valueFromArduino = myPort.read();
  }
  if (valueFromArduino == 1) {
    filter(BLUR, 10);
  }
  println(valueFromArduino); // this prints out the values from Arduino
}

[code] int button = 6;

void setup() {
  // put your setup code here, to run once:
  pinMode(button, INPUT); // declare the button pin as an input
  Serial.begin(9600);
}

void loop() {
  // put your main code here, to run repeatedly:
  int buttonState = digitalRead(button);
  Serial.write(buttonState);

  // too fast communication might cause some latency in Processing;
  // this delay resolves the issue.
  delay(10);
}
[/code]

Above are the Processing sketch and the Arduino sketch for this exercise, respectively. Basically, I used a button as the physical interaction to trigger the BLUR filter over the original image in Processing. The video below shows the effect.

Document your work on your blog. Use this week’s reading, Computer Vision for Artists and Designers, to inspire you to write a reflection about the ways technology was used in your project.

The documentary itself is about two elderly people in the countryside sharing their life stories with an audience interested in, and eager for, the encouraging power and wisdom of their life experiences. By the end of the documentary, as a viewer, I had witnessed death as well as life. I therefore chose this image, in which the two of them are holding each other, showing trust and love. I wanted to trigger the blur effect over the image to mimic sight through tears, giving a blurred sense of this fruit of life.

Strike!-Barry Wang-Inmi

Finally, this semester’s journey has come to an end. With the final project done, I am writing this blog post to record the process of the whole project, and at the same time to reflect on and synthesize our achievements and the flaws to be improved.

Part I. CONCEPTION AND DESIGN

When we started thinking about our final project, we decided that the experience we wanted to create should involve body interaction rather than just pressing keys and buttons. Since Joseph and I are both game players, we decided to build a game with body-interaction control. We chose a retro aircraft shooter because it is classic and easy to understand. We thought it would be more interesting, while adding some difficulty, if users could open up their arms and control the aircraft with their bodies by tilting left and right, just like the way an aircraft banks. Thus, we decided to use accelerometers to detect the user’s motion and map it into the game. We also felt that a body-interaction game should completely do away with touching the computer, so that users don’t have to press keys or move the mouse while wearing devices with long cables. Therefore, we made all the menus and selections in our game body-controlled too.

On the material side, since we modeled the game on a retro one, we also wanted to recreate an arcade cabinet to bring some retro feeling. So we designed a box that fits the test computer and laser-cut it out of wood planks. To be frank, using a projector might simply have been a better choice, since interaction involving the body should be displayed on a big screen; using a computer feels unbalanced, with the user staring at a small screen while standing at a distance from it. We did not think about this point, and it is definitely worth improving. As for the wearable devices, there were several choices: gloves, bands, even clothes. We finally decided on bands, worn just the way people wear a watch. On one hand, a light band does not add any unnecessary weight to the device; on the other, a band does the most to ensure our sensor reads correct values. Since most people wear a watch in the same way, that also makes the motion much easier to detect.

Part II. FABRICATION AND PRODUCTION

In the process, we achieved a lot of things we had never done before, but I also have to say there were difficulties completely beyond our imagination. I would like to start with the successes.

First are the basic game elements and aircraft control. The basic components of the game include sound effects, images, and game logic. The sound effects and images were relatively easy to handle; the game logic, which includes collision detection, scoring and levels, and multiple kinds of enemies, took some time. For collision detection, we set a radius for each enemy aircraft and adjusted it until it was as accurate as possible. For the level system, we let the aircraft adjust its bullet images and counts according to the current score. None of this is difficult, but it is indeed time-consuming.

For the control part, we used serial communication between Arduino and Processing. One problem is that the accelerometer is quite sensitive, which made the aircraft flicker on the screen. Also, using an accelerometer means that when the user makes a severe movement, the acceleration value might be too big for the map function. Thus, we defined a threshold: when the accelerometer value changes minutely, no signal is sent, and when it changes drastically, we let the on-screen value change slowly and gradually until it reaches the point the user wants. All these mechanisms work invisibly behind the screen, but it took a lot of time and effort to tune and choose the threshold values so that the aircraft stays stable on screen and creates a better user experience.

Another important achievement is the body-controlled cursor system, which moves the cursor on the screen as the user moves an arm up and down. This important system makes our ideal of keyboard- and mouse-free control come true.
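The deadband-plus-easing stabilization described above can be reduced to a few lines. This is only an illustrative sketch in plain Java; the class name, deadband size, and easing factor are my own stand-ins, not the project's tuned values:

```java
// Sketch of the stabilization described above: tiny changes are ignored
// (a deadband), and large jumps are approached gradually (easing).
// DEADBAND and EASING are illustrative values, not the project's tuned ones.
public class Smoother {
    static final float DEADBAND = 2.0f; // ignore jitter smaller than this
    static final float EASING = 0.2f;   // fraction of the gap closed per update

    float position; // current on-screen position

    Smoother(float start) { position = start; }

    float update(float target) {
        float delta = target - position;
        if (Math.abs(delta) < DEADBAND) {
            return position;            // jitter: don't move at all
        }
        position += delta * EASING;     // big jump: glide toward the target
        return position;
    }

    public static void main(String[] args) {
        Smoother s = new Smoother(100f);
        System.out.println(s.update(101f)); // within deadband: stays at 100.0
        System.out.println(s.update(200f)); // eased: 100 + 100*0.2 = 120.0
    }
}
```

With values like these, sensor noise of a pixel or two leaves the aircraft perfectly still, while a sweep across the screen arrives over several frames instead of teleporting.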
To be specific, the mechanism is that the user moves the cursor with the body and stays on an item for three seconds, and boom, the item is selected. The latency, in our case, is three seconds, set to make sure the user only enters the item he or she actually wants. To give users clear feedback that the system is running, we use an indicator: an empty circle that completes a third of itself every second to show the cursor status. Once the circle completes, the system chooses the item the cursor is resting on. This mechanism is also user-friendly in that it avoids making users move and click while dragging long cables. Besides, we have the well-cut case, which was done completely by Joseph. I know it is not easy to make those finger joints between boards, especially for a non-rectangular box, but he did it, making the computer look like a small arcade machine. I would also like to express my gratitude to him here. I believe all of these achievements are things we can be proud of.
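The dwell-to-select mechanism can likewise be sketched as a small timer. The names here are illustrative; only the three-second dwell comes from the description above:

```java
// Sketch of the dwell-to-select mechanism: an item is chosen only after the
// cursor has stayed on it continuously for DWELL_MS milliseconds.
// The 3000 ms value mirrors the description above; the names are illustrative.
public class DwellSelector {
    static final long DWELL_MS = 3000;

    int hoveredItem = -1; // item currently under the cursor (-1 = none)
    long hoverStart = 0;  // when the cursor arrived on that item

    // Returns the selected item id, or -1 if the dwell is not complete yet.
    int update(int itemUnderCursor, long nowMs) {
        if (itemUnderCursor != hoveredItem) {
            hoveredItem = itemUnderCursor; // cursor moved: restart the timer
            hoverStart = nowMs;
            return -1;
        }
        if (hoveredItem >= 0 && nowMs - hoverStart >= DWELL_MS) {
            return hoveredItem;            // dwell complete: select
        }
        return -1;
    }

    // Fraction of the indicator circle to draw (0.0 to 1.0).
    float progress(long nowMs) {
        if (hoveredItem < 0) return 0f;
        return Math.min(1f, (nowMs - hoverStart) / (float) DWELL_MS);
    }

    public static void main(String[] args) {
        DwellSelector d = new DwellSelector();
        d.update(2, 0);                        // cursor arrives on item 2
        System.out.println(d.update(2, 3000)); // held 3 s: item 2 is selected
    }
}
```

The `progress` value is what the filling circle would render, completing one third of the ring per second.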

Next come the difficulties. The biggest one was I2C communication. We planned to have a two-player mode, which obviously requires two sensors. However, the two accelerometers have the same fixed I2C address, and it is basically impossible (or at least beyond my level) to talk to both without an I2C shield or multiplexer on an Arduino that has only one pair of SDA and SCL pins. We tried different approaches but made no progress, which is why we showed up at the user test session with only one sensor.

Speaking of the user test, the trouble with using one sensor immediately revealed itself. Users controlled the game with only one arm, which is not how we expected it to be used; one arm does not feel like an aircraft at all. Some also indicated that the instructions could be clearer and that new ways of playing could be added. Still, most of our feedback was positive: a lot of users thought it was cool and liked it.

User Test Session Videos

After the user test session, we made improvements accordingly. First, the sensor problem: we finally chose to use two Arduino boards with one sensor on each. To make use of the second sensor, we developed an advanced mode in which users also control the aircraft vertically. This mode is extremely difficult, since it requires coordination between arms that are not moving in the same way. But at least, with two sensors, users start using both arms, which creates that feeling of flying. Besides, we improved our instructions to make them as clear as possible. These improvements were quite effective, as a lot of users enjoyed our game at the IMA show.

User Playing During IMA Show

Part III. CONCLUSION

Reflecting back on our definition of interaction: we defined it as highly user-involving body interaction, and I think our project tried to align with it as much as possible. To make users concentrate and feel involved, we used a game as the carrier of the interaction; to make the interaction bodily, we modeled a motion-control experience. The users seemed to enjoy it. They played with their arms open, trying their best to hit every enemy plane. Some tried the advanced mode, alone or cooperating with friends; it is difficult, but they played with smiles on their faces. We developed a leaderboard, and users really competed with each other at the IMA show. Many took pictures or recorded videos of themselves enjoying the game, and the user who reached the highest score (a really high score) took a photo of the leaderboard. All of this made us happy.

But this is not a perfect project, and there are flaws we need to improve. Using two simple sensors is definitely not the best way; we will try to use a Kinect in future projects if possible. We will also try to optimize the code so the game runs more smoothly, and we can definitely make this game complete rather than merely playable, as it is now. Moreover, we could add more new elements to the game, or adapt the body-control concept to another application to create a whole new experience. On this point, though, I still want to say that being new does not have to mean a completely new way of controlling. Applying the body-control idea to this game is new because few people have tried it and few players have experienced it, and it is because it is new that so many players enjoyed our game. Keeping the idea of newness in mind is always a good and enlightening point in any interactive development process. We will persist in it and try to do better in the future.

Finally, to wrap the whole thing up: to make a game is to make gamers enjoy it. That is why we chose to make a game, and that is the ultimate goal this project wants to achieve. The most important thing I take from this project, and even from this class, is to always keep trying. A lot of our successes resulted from accidental trials. Keep moving, no matter what the failure is.

Thank you, to the best professor Inmi, my partner Joseph, all my classmates, and the people who are reading this page.

Final Project-clover

My project: Cooperate! Partner!

Concept: I got the idea for this project from a conversation with my friend A, who is also an IMA student working on another project. She looked pale and complained about her partner. This was not the first time I had heard complaints about partners; I had the same feeling when I did my first project in Communication Lab. It was a project meant to be done by two people, but only one person finished the whole thing. It made me feel even worse when I saw other teams working happily and enjoying the process of teamwork, splitting the work so that both people were engaged. It is not only my friend A and I who have felt this: when I carried out further research, I found that many IMA students have had bad experiences with group projects. Around midterms and finals there are always quarrels between partners and complaints about partners. This made me wonder: can I use a game to improve the group-work experience and let people enjoy working with their teammates?

I then began researching what leads to the bad experience by interviewing students in the studio. Friend B: “My partner never listened to me. This makes me really angry. We could have done better if he had listened and adopted some of my suggestions.” Friend C: “I always do most of the work; my partner does nothing. I feel it’s unfair.” Friend D: “I didn’t participate as much as my partner expected because I really didn’t know she needed me; I thought she was doing quite well on her own.”

To sum up, I found the following things lead to bad experiences in teamwork. First is the free-rider problem: only one person does the work while the other makes no effort, which makes the person doing all the work feel treated unfairly. Second is the communication problem: partners lack communication and cooperation, each sticking to their own ideas. The first bad result of poor communication is that the finished project is often unsatisfying. The second is that people feel they are not understood by their partners, which often leads to quarrels and emotional breakdowns. The third is that if you don’t tell your partner you need help, he or she may not know and cannot offer it.

So I planned to design a game that forces people to engage and cooperate with their teammates while still enjoying the play. Idea 1 was to use an accelerometer to let user 1 rotate one piece of a puzzle and user 2 rotate another, and then, by finding the right angles, bring the two pieces together to finish the puzzle. I planned to use a distance sensor to let each user move their piece up and down. By making each player rotate to find the appropriate angle and control the height of their piece, I wanted each of them to engage in the game, which corresponds to doing your own part of the job in teamwork. By making them adjust height and angle together to form the puzzle, I wanted them to adapt to their teammate’s changes and cooperate, which corresponds to the communication and cooperation in teamwork. I also had the players use both a hand and a leg, so that they would lose balance and need to reach for their partner for help, which again represents the communication in a project.

(idea 1)

(idea 2)

Then I did a user test. I found that users could still keep their balance quite well when they rotated with their hands and raised their legs at the same time. They also said the game was too easy and should be more difficult. In order to make them lose balance, and so enhance the cooperation and communication, I changed to idea 2. This time, I let the users bend or straighten their knees to control the left-and-right motion of their square using a flex sensor, and I kept the distance sensor so each user also has to control the height of the square at the same time.

To reinforce the idea that both players need to engage in the game and control their own part, I made the square more difficult to control. By adding different amounts to x and y (x1 -= 30; x1 += 10; y1 += 23; y1 -= 10;), I require the user to hold a certain position for a longer time, which makes the body harder to control, linking to the point that you need to keep good control of your own part when working on a project. This difficulty in keeping balance also forces players to reach for their partner for help, enhancing communication. I also set the sensor value to a certain level so that only above that level does the square go higher; if you don’t reach it, the square will not go up even if you raise your knee. I set this because, to make a good project, you need to put in effort and reach a certain level of creation. I am transforming mental work and mental stress into physical movement and bodily stress. I hope that by doing this, players can move their bodies and do some exercise to relieve mental stress, while at the same time partners get a stronger feeling that their teammate is stretching and really needs their help and participation, letting both players feel the importance of teamwork and cooperation. If you don’t cooperate, participate, and go to the top of the canvas, your partner has to spend a lot of effort to reach you and complete the game (project).

Mental stress may be hard to see, but a sweating body is really obvious. I hope players get the sense that if they don’t participate and cooperate, their partner has to keep raising a leg, which is really painful, and so feel the importance of participation and cooperation. I also didn’t want to give only one absolute solution to this game, because each person is different, each team is different, their ways of working are different, and their solutions are different. Different teams can find their own suitable, easy approach to coming together. But during the user test, there was in fact an easiest, energy-saving solution: gather at the bottom of the canvas, with no need to raise the legs that high.
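The threshold gate described above, where the square only rises once the sensor reading clears a set level, can be sketched in a few lines. The class name, threshold, and step sizes below are illustrative stand-ins, not the project's actual values:

```java
// Sketch of the gating described above: the square only rises when the
// sensor reading clears a set level; below it, raising the knee does nothing.
// THRESHOLD, RISE, and FALL are illustrative, not the project's values.
public class GatedSquare {
    static final int THRESHOLD = 150; // sensor level the player must reach
    static final int RISE = 10;       // pixels moved up per frame when above
    static final int FALL = 10;       // pixels moved down per frame otherwise

    int y; // vertical position; smaller y means higher on screen

    GatedSquare(int startY) { y = startY; }

    int update(int sensorValue) {
        if (sensorValue > THRESHOLD) {
            y -= RISE; // effort above the threshold: the square goes up
        } else {
            y += FALL; // anything less: the square sinks back down
        }
        return y;
    }

    public static void main(String[] args) {
        GatedSquare g = new GatedSquare(300);
        System.out.println(g.update(200)); // above threshold: 290 (moves up)
        System.out.println(g.update(100)); // below threshold: 300 (sinks back)
    }
}
```

Because the square sinks whenever the reading is at or below the threshold, a player who stops pushing cannot simply park it in place, which is the "you must keep paying effort" point made above.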

(the easiest solution)

I set this up to link back to the range setting above. It also means that if you find the easiest way of doing the project, you may not have to spend that much energy to reach that level, yet you can still achieve the same goal of gathering together. But players need to explore to find this solution. Just like doing a project: there are many ways to complete the work, but you need to explore to find the easiest one, or you may spend a lot more effort to reach the same goal.

The sensor I used and the code

(the flex sensor on the knee controlling the left and right of the square)

(the distance sensor sensing the distance to the ground)

(taped up to get accurate readings)

User test: too many cables, easy to trip over, and it looked ugly. So I taped the cables together to keep users from tripping and falling, while also making it look better.

User test: they said the Arduino and the cables were ugly, so I later cut a box to hide the Arduino and breadboard and make it look better.

(this equipment makes it easier for the user to put it on the knee)

The layout: I set the background to plain white with no other design. I had prototyped some backgrounds with cool effects, but users said they were confusing and made it hard to focus on the movement of the squares, so I switched to plain white to keep the focus on the squares. I also made the squares red and blue, which are easy to recognize. And by adding the sentence “come together,” the game works like a word puzzle, letting users know they need to come together to form the phrase. I placed the blue square, controlled by the left player, in the left corner so that player has a clear understanding of which square is his; the red square starts from the bottom-right corner so the player on the right knows he is playing the red one. Also, following user test feedback that the earlier version, where both squares started from the top-left corner, was too easy, I added distance between them to make the players move more to come together.

(before start here)

(now start here)

Also, according to the user test feedback, users suggested I set up rules so they would know that the knee controls left and right, which square belongs to whom, and that you may only touch your partner to keep balance; so I added an instruction screen before the game. They also asked about the meaning of the game and suggested I add an explanation at the end. I made both colorful to add visual effect (feeling).

I also added a piece of relaxing music when the game finishes, because 1. some users said they needed music to relax after the difficult, energy-consuming body stretching, and 2. they needed a clear, cheering sign that they had finished the game.

(so I added this cheerful and happy music at the end, along with the instruction).

User test: some users wore thick trousers that affected the distance sensor. I 3D-printed this pole and added it to the leg equipment so the distance sensor senses the distance accurately.

Also, after the user test in class, I changed the text into a video to make it fun and more connected with the game.

The introduction video:

At the beginning, I used black and white to make the game look mysterious. I also hid my face to raise players’ interest in playing, so the interaction continues into the game. (Made with Photo Booth; cited in the references.)

The link for the video: https://drive.google.com/open?id=18HRZsgneyYCVyAYbskO4H3m__SwozWQc

(now)

(before)

The ending video:

At the end, I made the video colorful to cheer the players up and help them relax. I hope players find it delightful that the world suddenly becomes colorful when they finish the game, making them satisfied with their completion of it.

The link to the video: https://drive.google.com/open?id=1P5Yadwg5hcV5hcGjtE5GGbe-GT3kq4PJ

(now)

(before)

During the user test, as shown in the pictures, I changed parts of the design to make the sensors work better, to let users know how to play without my explanation, and to convey the meaning behind the project. But there were also problems I didn’t solve. First, take the users’ words, for example. Leah and Eric (together): “I like this game because the physical interaction is funny and the concept is good. But I wonder, could the game be more difficult and have more levels?” Ryan: “It is a fun game, but it’s really difficult to play.” My first failure is that I couldn’t find an appropriate difficulty level that would let every user find the game fun and enjoy the challenge. During the user test, teams that had worked together before found it easier to complete the game, but players who had just met found it difficult and energy-consuming. Considering the concept, I want everyone to notice the importance of group work without spending that much energy; however, if I lower the difficulty, the game becomes less playful for other teams. The balance between conveying the concept and keeping the game fun and playable is a part I didn’t do well. Second, my project can’t really improve or change the teamwork situation. By playing, players have fun and may become aware of the problems through the concept, but the game may not help much when they are actually working on a project. The project raises awareness but doesn’t solve the problem.

I hope that by playing this game, players become better teammates and have better experiences in group work. I hope those who have no group-work experience can learn what group work is and how to do it well, and I hope to achieve this through a fun game. Even though it can’t solve the problem, I still hope students become aware that group work is important, and that being a good partner is important.

My definition of interaction is to let players willingly and enjoyably keep responding to different stages of my project, making the process continuous. The interaction in my project consists of two parts: the interaction between players, and the interaction between the players and the project. The players interact with each other to finish the game. In version 1 of my project, however, that interaction was not very good: I used plain text for the instructions and ending, so the process was not continuous, the boring text was not connected to the players, and they were unwilling to interact further with the project. To enhance the interaction between my project and the players, I made the videos to create a mysterious atmosphere, so they are willing to participate and give responses back and forth.

If I had more time, I would break the project into different levels. For example, level 1 would let each user learn to control their own square, which not only makes the game easier for some players but also reinforces the concept that you need to do and control your own work in a project. Then in level 2, they would come together, reinforcing cooperation and communication. I could also add further levels, change the shapes of the squares, or add an accelerometer to make the game more difficult, fitting the demands of different players so that everyone can enjoy it. I still hope that through this game, there will be fewer quarrels between partners and every student will enjoy teamwork.

I also learned that it is difficult to make a fun game and convey a serious concept at the same time. Next time I do a project, I need to think about how to make it respond to the player’s different interactions: not only making players enjoy the play, but also building good interaction between their responses and the serious concept.

References:

Ben Fry and Casey Reas (2001). Processing example Bounce [Source code]. https://processing.org/examples/bounce.html

Ben Fry and Casey Reas (2001). Processing example Collision [Source code]. https://processing.org/examples/bounce.html

Ben Fry and Casey Reas (2001). Processing example Constrain [Source code]. https://processing.org/examples/bounce.html

“Photo Booth Apps.” Simple Booth, 2019, www.simplebooth.com/products/apps.

Code
// IMA NYU Shanghai
// Interaction Lab
// For receiving multiple values from Arduino to Processing

/*
 * Based on the readStringUntil() example by Tom Igoe
 * https://processing.org/reference/libraries/serial/Serial_readStringUntil_.html
 */

import processing.serial.*;
import processing.sound.*;
import processing.video.*;
Movie myMovie1;
Movie myMovie2;
String myString = null;
Serial myPort;


int NUM_OF_VALUES = 4;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;   

int x = 0;
int y = 0;
PImage img1;
PImage img2;
float x1=0;
float y1=650;
float x2=650;
float y2=300;
SoundFile sound;

float easing = 0.0000000000001; // effectively zero: the eased motion below barely moves; the step-based if blocks in draw() drive the squares

boolean stage1 = false;
boolean stage2 = false; 
boolean stage3 = true;
boolean stage4 = false;
boolean soundIsPlaying=false;

long gameStartTime = 0;

void setup() {
  size(650, 650);

  setupSerial();
  img1 = loadImage("red.png");
  img2 = loadImage("blue.png");
  background(255);
  sound = new SoundFile(this, "cello-f1.aif");
  myMovie1 = new Movie(this, "introduction.mov");
  myMovie1.play();
  myMovie2 = new Movie(this, "ending.mov");
  myMovie2.play();
}


void draw() {
  updateSerial();
  printArray(sensorValues);

  background(255);
  //rect(x, y, 25,25);
  //ellipse(25,80,50,50);
  //rect(100, 100, 100, 100);
  //x1= map(sensorValues[1],74,102,0,width);

  float targetX2 = constrain(map(sensorValues[0], 97, 110, 0, width), 0, width-100);
  float dX2 = targetX2 - x2;
  x2 += dX2 * easing;
  x2 = constrain(x2, 0, width-100);

  float targetX1 = constrain(map(sensorValues[1], 0, 50, 0, width), 0, width-100);
  float dX1 = targetX1 - x1;
  x1 += dX1 * easing;
  x1 = constrain(x1, 0, width-100);

  
  float targetY2 = constrain(map(sensorValues[2], 40, 216, 0, height), 0, height-100);
  float dY2 = targetY2 - y2;
  y2 += dY2 * easing;
  y2 = constrain(y2, 0, height-100);
  //println(sensorValues[2]);
  //println(x2);
  //println(sensorValues[0]);
   float targetY1 = constrain(map(sensorValues[2], 40, 216, 0, height), 0, height-100);
  float dY1 = targetY1 - y1;
  y1 += dY1 * easing;
  y1 = constrain(y1, 0, height-100);
  
  if( stage3==true){
    //background(255);
    //textSize(20);
  //textAlign(CENTER, CENTER);
   //fill(255, 0, 0); 
   //fill(255, 128, 0);
 //text("Square Blue for left player, Square Red for Right Player",325,25);
 //text("Rule1:MOVE YOUR KNEE!\nTO FIGHURE OUT HOW TO MOVE",325,75);
 //text("TRY STRANGE MOVEMENT Using Your KNEE!",325,125);
  //fill(76, 153, 0);
  //text("Rule2:RAISE OR BEND THE LEG !",325,160);
  //text("TRY!",325,195);
  //fill(172, 0, 255);
  //text("Rule3:THESE ARE MAGIC SQUARES ONLY WORK IN MAGIC RANGE!",325,240);
  //text("FIND THAT RANGE TO CONTROL THEM!",325,280);
  //text("OR THEY WILL JOKE YOU WITH WIERD MOVEMENTS",325,320);
  //textSize(30);
  //fill(255, 0, 0);
  //text("Rule4:DON'T TOUCH ANY OBJECT!!!",325,380);
  //text("TIPS:FIND YOUR OWN PATTERN TO",325,443);
  //text("CONTROL THE SQUARES!!!",325,500);
  //text("MAKE IT MOVE THE WAY YOU WANT IT TO!",325,550);
  //text("DON'T LET IT'S MOVEMENT CONFUSE YOU",325,590);
  //text("CONTROL IT!",325,630);
//fill(255, 128, 0);
if (myMovie1.available()) {
    myMovie1.read();
  }
 if(mousePressed){
   stage1=true;
   stage3=false;
   gameStartTime = millis();
 }
}

  if (stage1 == true) {
    image(img1, x1, y1);
    image(img2, x2, y2);

    if (checkcollision(x1, y1, x2, y2, 30) && millis() - gameStartTime > 15*1000) {
      
      stage1 = false;
      stage2 = true;
    }
  }

  if (stage2 == true) {
     if(soundIsPlaying==false){
       
     sound.play();
     soundIsPlaying=true;
     }
    background(255);
  //textSize(20);
//textAlign(CENTER, CENTER);

   //fill(255, 0, 0); 
 //text("Teamwork: Facing difficulties and how to cooperate\n Sometimes difficult sometimes easy",325,25);

//fill(255, 128, 0);
//text("Overcome difficulties and enjoy the easy\ntry your //best and help your partner",325,95);
//fill(76, 153, 0);
//text("Don't let your partner do all the work \nDon't just stay there and he/she sweating all the time",320,165);
//fill(172, 0, 255);
//text("Don't think you can stay there doing nothing\nthe square will move randomly when you don't control",320,232);
 //fill(255, 0, 0);
//text("Why I design this game",320,283);
//fill(0, 128, 255);
//text("explore how to control up down left right\n= explore right way to control project",320,330);
//fill(172, 0, 255);
//text("limit to a certain range\n=find the way(right range) to do your project",320,390);
//fill(153, 0, 153);
//fill(76, 153, 0);
//text("ADJUST POSITION TO MEET=NEGOCIATE IN WORK\nStreching and hardness=difficulties in the project",320,460);

//fill(255, 128, 0);
//text("DON'T TOUCH OBJECT=IF YOU LOSE BALANCE\n ONLY CAN hold partners'hands=SUPPORT COOPERATION",320,535);
  //fill(255, 0, 0);
  //text("No rules find individual pattern=different situation individual faces",325,590);
//text("The stretch of Body=The strech of mind",320,625);
if (myMovie2.available()) {
myMovie2.read();
}
  }



  if (sensorValues[0]>40) {
    x2 += 10;
  }
  if (sensorValues[0]<40) {
    x2 -= 30;
  }
  if (sensorValues[1]<5) {
    x1 -= 30;
  }
  if (sensorValues[1]>5) {
    x1 += 10;
  }
  if (sensorValues[2] < 120 && sensorValues[2] > 90) {
    y2 += 23;
  }
  if (sensorValues[2] < 90) {
    y2 -= 10;
  }
  if (sensorValues[3] < 150) {
    y1 -= 10;
  }
  if (sensorValues[3] > 150) {
    y1 += 10;
  }
  //println("y2",y2);
}
  //println(x2, y2);
  //} if (sensor2>100) {
  //  y2 += 10;
  //}

boolean checkcollision(float x1, float y1, float x2, float y2, float d) {
  // the two squares "meet" when they are exactly 100 px apart horizontally
  // and level vertically (the d parameter is currently unused);
  // note that exact equality on floats is fragile, since the positions
  // move in steps of 10/23/30 px and may jump over the target values
  //return x2-x1 <=30 && y2-y1 <=30;
  return x1-x2==100 && y2-y1==0;
}

boolean checkangle(float angle1, float angle2) {
  // also an exact-equality check: only fires when both angles land
  // precisely on these values
  return angle1==20 && angle2==30;
}
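As an aside (my own suggestion, not part of the original sketch): exact float equality like `x1-x2==100` almost never fires once the positions drift, so a tolerance-based variant is usually safer. Below is a minimal standalone sketch in plain Java; the class name, the `EPSILON` value, and the sample coordinates are all assumptions of mine, not values from the project.

```java
// Tolerance-based versions of checkcollision/checkangle: instead of
// requiring an exact match, accept any value within EPSILON of the target.
public class ToleranceCheck {
    static final float EPSILON = 5.0f; // allowed slack in pixels/degrees, tune to taste

    static boolean checkCollision(float x1, float y1, float x2, float y2) {
        // the squares "meet" when they are ~100 px apart horizontally
        // and roughly level vertically, within EPSILON
        return Math.abs((x1 - x2) - 100) <= EPSILON
            && Math.abs(y2 - y1) <= EPSILON;
    }

    static boolean checkAngle(float angle1, float angle2) {
        return Math.abs(angle1 - 20) <= EPSILON
            && Math.abs(angle2 - 30) <= EPSILON;
    }

    public static void main(String[] args) {
        System.out.println(checkCollision(203, 50, 100, 52)); // within slack: true
        System.out.println(checkCollision(300, 50, 100, 52)); // too far apart: false
    }
}
```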



void setupSerial() {
  //printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[ 1 ], 9600);
  // WARNING!
  // If you get an error here, uncomment printArray(Serial.list()) above,
  // check the printed list of ports, find the port named
  // "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----",
  // and replace the index [ 1 ] with that port's index number.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}
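The comma-separated protocol that `updateSerial()` expects can be illustrated on its own. The sketch below is plain Java rather than Processing (so `String.split` and `Integer.parseInt` stand in for Processing's `split()` and `int()`), and the class name and sample values are my own, not from the project:

```java
// Sketch of the parsing step inside updateSerial(): a line such as
// "48,7,105,140\n" arriving over serial is trimmed, split on commas,
// and converted to ints only if the expected value count matches
// (a mismatched count usually means a partial or garbled line).
public class SerialParse {
    static final int NUM_OF_VALUES = 4;

    static int[] parseLine(String line) {
        if (line == null) return null;               // nothing received yet
        String[] parts = line.trim().split(",");
        if (parts.length != NUM_OF_VALUES) return null; // incomplete line, skip it
        int[] values = new int[NUM_OF_VALUES];
        for (int i = 0; i < NUM_OF_VALUES; i++) {
            values[i] = Integer.parseInt(parts[i].trim());
        }
        return values;
    }

    public static void main(String[] args) {
        int[] v = parseLine("48,7,105,140\n");
        System.out.println(java.util.Arrays.toString(v)); // [48, 7, 105, 140]
        System.out.println(parseLine("48,7"));            // null (too few values)
    }
}
```

Discarding lines with the wrong value count is what keeps `sensorValues` from being corrupted when `readStringUntil()` returns a partial line mid-transmission.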

Final Project Step 4: Final Blog Post by Min Jee (Lisa) Moon

Bullying Simulator – Min Jee (Lisa) Moon – Marcela Godoy

Because my project was more of a statement piece, there wasn't much physical interaction going on. To make the experience as immersive as possible anyway, I made the following choices:

  1. To give a first-hand experience of being bullied, I built the project to run on a phone, since victims of bullying typically receive the abusive messages on their phones.
  2. To recreate the terror inside the victim's mind, I added background sound and a ringtone. (Frankly, I got really sick of, and terrified by, the project's ringtone.)
  3. I collected real bullying text messages that I found on social media to make the situation realistic.
  4. I tried to make the project more code-heavy than physical, so that the user could interact with it the way they interact with their real phone.

For the fabrication,

Initial physical outline for my project

Rastering the laser cutting material

Finished laser cut material

Just like above, I intended the paper roll to fall right in front of the hole, into the trash bin, because I thought of the bullied as being treated like an "emotional trash bin": all the bad feelings of the bullies get dumped into the victim as if the victim were a trash bin.

However, during the user testing session, none of the users got the "emotional trash bin" idea. Based on that feedback, I realized I would have to change my physical part.

I then moved on to an idea suggested by the test users during the user testing session: having the messages (the trash) thrown at the user. As a result, I worked towards the paper roll shooting the trash towards the user.

The first prototype was the one using the DC motor. 

As you can see from the video, the stepper moves the push bar to feed the trash messages into the DC motor wheel.

However, as the video above shows, the DC motor would throw the trash message off to the front right. In addition, the DC motor was so loud that it made the entire table vibrate. As a result, I decided to move on to prototype 2.

The shooter inside the device
Laser cutting the wheel to hold on to the strip/hook behind the paper shooter working like a trigger
The device without the feeder stepper (just for the better view inside the photo)

This is the shooter inside prototype 2. The feeder from prototype 1 stayed the same: once it feeds the trash messages into the gun, the stepper inside the device pulls the trigger behind the gun to shoot the trash message towards the user. However, pulling on the material turned out to be too hard, so I moved on to the last idea.

For the last idea, I do not have an image or video, but the basic idea can be found here

Just like in the video linked above, when the feeder feeds the trash-message paper into the right place, the stepper wheel attached to the catapult keeps pulling on the material, and finally the servo motor pressing down on the catapult releases it, shooting the trash message up at the user through the hole.

However, the rubber band pulling the catapult was too weak, and the friction between the catapult's wooden boards was too strong, so the catapult could not launch the material effectively, defeating the physical part's purpose. As a result, I sadly had to focus my project entirely on the coded application part.

HERE is the link to the code bit (The demo of this code bit can be seen from the video at the beginning of this post)

CONCLUSIONS:

This project was motivated by several videos, including Amanda Todd's suicide video and the suicide notes of K, a 13-year-old Korean girl who took her own life after continuous bullying. I collected many of the real bullying texts that victims received in their Facebook messages, text messages, KakaoTalk (the Korean WeChat), etc. to give a realistic experience.

Because my main goal was to give the user a first-hand experience from the perspective of the bullied, I believe the project was half successful. However, contrary to my intention of putting users in the shoes of the bullied, users tended to view the project from a third-person point of view, and since the program could only run on my phone, I think the project still has room for improvement.

To improve this project, I would love to get approval from our college and publish a real application that sends the messages to the user's own phone, which would give the user a more hands-on experience.

I believe this is a very meaningful project after all, because bullying is an ongoing issue all over the world. When I asked NYU Shanghai students and instructors whether they knew whom to contact if they encountered someone bullying others, nobody could answer. When I asked whether anybody knew how it feels to be bullied, nobody had a clear idea either. I believe this shows ignorance towards issues like this. Raising awareness among the NYU students and instructors to whom I presented my project would satisfy my initial purpose.

Final Project- The Smell of Home-Ruben Mayorga

The Smell of Home was an idea that came to me while thinking of my childhood experiences in children's museums. I remember that back home, the kids' museum had an exhibition that showcased different smells and textures, and as a kid you had to identify the object you were smelling and touching. With this idea in mind, I tried to showcase the cultural experience of different countries and relate it to smell. The project finally came to life as a dinner table at which you had to smell a plate to have a memory recounted to you by the computer. The display simulated a dinner table because I wanted the experience to feel like dinner-table talk, with plates from all over the world, while listening to memories related to these smells.

The Smell of Home went through many changes during its creation. First, my original idea involved videos of people talking, projected onto the wall. However, various experts commented that it would be better to keep only the voice, with no video, to help the user imagine the speaker's face instead of actually looking at it. Second, the first display was supposed to be a rectangular table facing the wall, but this changed along with the switch from video to audio. Finally, the other important change was the number of chairs: the first idea had five chairs, one for each food, but because of the confusion this created, I changed it to a single movable chair so only one person would use it at a time.

The production of the project was a bit complicated: setting up the table and making it look like an actual dinner table, even with the cables around, was difficult, yet not impossible. I used laser-cut plates as decoration, with printed images of the foods and actual food inside for the smell. After that, I placed the Arduino inside a box to hide it, and put a wine bottle on top of the box as decoration, to make the user feel it was an actual dinner table and not just an IMA project. I used forks and knives from the cafeteria and a tablecloth to put on the finishing touch.

The code had its difficulties, because each sensor had to be attached to its own audio clip and triggered only when the user was actually smelling. The other issue came from the type of sensor I was using: it was different from the sensors we had used previously, so I had to use different code to make it work with the Arduino.
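The per-sensor triggering described above could be sketched roughly as follows. This is a hypothetical illustration, not Ruben's actual code: the class name, the threshold value, and the number of plates are all assumptions, and the real project would hook the trigger up to its audio playback.

```java
// Hypothetical sketch: each sensor index maps to one audio clip, and the
// clip is (re)started only on the rising edge, i.e. when the reading first
// crosses the threshold, rather than on every frame while the user is near.
public class SmellTrigger {
    static final int THRESHOLD = 300;    // assumed sensor cutoff for "smelling"
    boolean[] wasAbove = new boolean[5]; // one latch flag per plate (5 assumed)

    // returns true exactly once per "approach", when clip i should start
    boolean shouldPlay(int sensorIndex, int reading) {
        boolean above = reading > THRESHOLD;
        boolean trigger = above && !wasAbove[sensorIndex];
        wasAbove[sensorIndex] = above;   // remember state for the next frame
        return trigger;
    }

    public static void main(String[] args) {
        SmellTrigger t = new SmellTrigger();
        System.out.println(t.shouldPlay(0, 500)); // rising edge: true
        System.out.println(t.shouldPlay(0, 520)); // still above: false
        t.shouldPlay(0, 100);                     // falls below, latch resets
        System.out.println(t.shouldPlay(0, 400)); // rising edge again: true
    }
}
```

Without the latch, the audio would restart on every draw frame while the user's face stayed near the plate.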

User testing was a very useful way to learn about the project's problems, like the position of the sensors and the number of chairs in the display. It also helped me understand that the project needed a little more clarity about the way it functioned. Nevertheless, the project had only a few minor flaws that needed changing, and it was very well received by most participants for its uniqueness and creativity.

Finally, the project reflected all the important aspects of the class: it included laser-cut plates, Processing for the audio, and the Arduino for the sensors. Most importantly, it encompassed the interactivity of people and culture, which is the main idea behind both the project and the class: learning how to make something that fosters interactivity.

Thank you Eszter and Marcela for all the help 🙂