Final Project: Pirate Chase 2.0 – Audrey Samuel – Professor Rudi

For our Final Project we decided to improve upon Pirate Chase (our midterm project) and create a new and improved version of our game! Pirate Chase 2.0 is a boat racing game that requires two or more players to blow on their boats to reach the treasure chest. Players are allowed to blow on the other participants' boats and must avoid letting their own boats sink. For the Final Project, we added a few moving obstacles to make the game even harder. We gave our game a pirate theme because we wanted to see how competitive the participants would get when trying to capture the treasure. We drew our inspiration from the economic theory of the Tragedy of the Commons and wanted to analyze how far individuals would go to push their opponents away from reaching the treasure. In thinking about how participants would interact with our project, we decided to use a round baby pool (60 cm × 30 cm) as our “ocean” instead of the original rectangular box, as we thought it would be more interactive if individuals could move freely around the circle without being constrained by the four corners of a box. We also incorporated Processing into our game by setting up a countdown as well as graphics that show players when to start the game and when a player wins. We also used the Sound and Minim libraries to help us play the Pirates of the Caribbean theme song.
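
A minimal sketch of how the countdown and theme music could be set up in Processing with Minim (the audio file name, countdown length, and text sizes are placeholders rather than our exact assets):

import ddf.minim.*;

Minim minim;
AudioPlayer theme;
int startTime;
int countdown = 3;   // seconds before "GO!"

void setup() {
  size(800, 600);
  minim = new Minim(this);
  theme = minim.loadFile("pirates-theme.mp3");   // placeholder file name
  theme.loop();
  startTime = millis();
  textAlign(CENTER, CENTER);
  textSize(120);
}

void draw() {
  background(0);
  int remaining = countdown - (millis() - startTime) / 1000;
  if (remaining > 0) {
    text(remaining, width/2, height/2);   // counts 3, 2, 1 ...
  } else {
    text("GO!", width/2, height/2);
  }
}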


We included two obstacles, namely wave-making machines that produced waves in the water, making it harder for individuals to blow their boats directly towards the treasure. We also painted our boats blue, red, green, and yellow to improve on the design of our previous boats and make it clearer which boat belonged to which participant. We 3D printed wider boats, as people had complained earlier that the boats sank really fast. We still used the infrared sensor to detect when the boats reached the treasure chest. We thought of using a color sensor to detect which boat won and customize the screen to show the winner, but unfortunately we were not able to implement this fully.
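
A minimal sketch of the infrared check on the Arduino side, assuming a digital-output IR obstacle sensor at the treasure chest (the pin number and serial message are illustrative, not our exact code):

// Hypothetical setup: digital-output IR sensor on pin 7 detects a boat at the chest
const int IR_PIN = 7;

void setup() {
  Serial.begin(9600);
  pinMode(IR_PIN, INPUT);
}

void loop() {
  // many IR obstacle modules pull their output LOW when something is close
  if (digitalRead(IR_PIN) == LOW) {
    Serial.write('W');   // tell Processing that a boat has reached the treasure
  }
  delay(50);
}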

Painted Boats

The production process involved a lot of work, but we had fun throughout. We 3D printed four new boats and painted them in different colors. We also laser cut two boxes to support our “bridge,” on which we would place our obstacles. Initially we had planned to place the treasure chest and infrared sensor on the bridge as well, but we found that the pool was too small and the game would end sooner than we expected. We therefore fixed the infrared sensor and treasure chest separately on one end of the pool and put the obstacles in the middle of the pool to make it harder for participants to reach the treasure. During the user-testing session, we were told that users could not see the screen and play the game at the same time because we had placed the laptop screen off to the side. We therefore lifted the screen and placed it right behind the treasure chest so that participants could clearly see when the timer had gone off and when they had won the treasure. We were also told to hide the Arduino and breadboard, as they were pretty distracting. One thing we had to keep in mind was the obstacles falling into the water; using water and electronics together was quite scary, but we were able to pull it off in the end. At the end of the day, our ultimate aim was to see how competitive users would get over non-existent treasure, and it worked just as we had assumed: people became more competitive when they were told there was an end goal to meet. Applying this to the Tragedy of the Commons, individuals are more likely to compete and drive their opponents away in an attempt to obtain limited resources, which in our case was the treasure chest.

In conclusion, the main aim of our project was to showcase the “Tragedy of the Commons” concept (first introduced by American ecologist Garrett Hardin) through a fun and interactive game. Through the game we hoped to see how participants would react in a situation where limited resources exist in a specific area, which in our case was the treasure chest. Most importantly, we wanted our project to be as interactive as possible. Referring back to my definition of interaction, I had originally seen interaction as a form of communication; I later expanded this definition to say that it is not only a form of communication, but also a way of blending technology and human abilities together in the most natural way, without undervaluing the capabilities of humans, in order to fulfill a greater aim. By allowing students to blow on the boats, I hoped to show how important human interaction is in using electronics.

The participants who took part in the game loved the idea of blowing on the boats and said they had a lot of fun competing against their friends. If we had more time, we would get a bigger pool so that all four boats could be used in the game instead of two. We would also include stable obstacles such as shark fins to make the game even harder. On the Processing side of things, we would add more audio and visuals to show who won, perhaps by including the sound of coins to represent treasure. We had to keep revising our project, making sure our boats floated (our first ones had not), and by merging human interaction and computer interaction, our project ultimately aimed to show that these two things can work together in harmony. Bret Victor encourages us not to restrict interaction to the use of a single finger on a touch screen, which is why we decided to incorporate blowing and physical movement into our project. This allows individuals to truly feel that they are not being undervalued, with the computer doing all the work, but rather sets an equal balance between person and machine. Ultimately, by the end of the class I have learnt what interaction truly means and how we can incorporate both human and computer interaction into a project to help users learn something useful while also having fun. (Find below a video of our project during the IMA End of Semester Show.)

Echo – Andrew Xie – Eric

CONCEPTION AND DESIGN:

When I first conceived the final project, my partner and I decided to make a project about decompression. We think that modern people live under great pressure and in sub-health, so we wanted a way to let people living in the city release that pressure. There are many ways to release pressure, though, and we kept thinking about how to give users an effective decompression experience in a short time. We envisioned letting users decompress by squeezing instant noodles, but it was neither attractive enough nor environmentally friendly. So we decided to have users interact with our project through sound. The original idea was to take Christmas as the theme and let users blow the snow off a Christmas tree with their voices to form the concept of decompression, but that felt like a deliberate imposition of a concept. In the end, we decided to make a purely interactive game. In the Echo project, we used a sound sensor and laser cutting; these two simple elements can bring great benefits. The reason for giving up the instant noodles is that we could not find a sensor that matched them, and they would cause waste and could not be reused. Violent squeezing and slapping are also forms of potential violence, while voices overcome these problems better. We also gave up an infrared sensor: we had wanted users to trigger the game at a certain distance, but in fact it was unnecessary. Finally, I also used paper cutting, because during user testing we found that although many people wanted to interact with our project, they were too embarrassed to scream in public, so I designed a phone-shaped mask to help users solve this problem.


FABRICATION AND PRODUCTION:

For the whole project, we did not laser-cut a cover for the computer, in order to save materials. Our success lies in the fact that we successfully measured and defined the values of the sound sensor. We compared a microphone module with 3-pin and 4-pin sound sensors, and finally decided on the 4-pin sound sensor, because only it sits between sensitivity and dullness. We found that defining the values was a very challenging task: the sound has to be classified as high or low, and the values need to be tested many times, because setting the high value too high makes it difficult to reach the high standard and also lowers the low standard. Our game rule is to put pressure on users through five rounds of play and then show a random screaming interface to let users instantly decompress.

The biggest lesson we gained from user testing is that we should provide a mask for users to protect their privacy, because not everyone can scream in public. To avoid embarrassment and protect their privacy, I designed a phone-shaped mask. Another useful piece of feedback was that we should help users understand the range of high and low in the initial interface, so that users can get feedback immediately after the game starts. As a result, users are often attracted by the phone mask I made and are willing to wear it, which improves our user experience and is very efficient. Another modification we made is adding a free scrolling interface so that users can release their own pressure at will. But we also gave up the LED because of some technical problems; our idea was to give users feedback through the change of the LED color, red for the wrong range and green for the right one.
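
As an illustration of the threshold idea (the pin and the cut-off values below are placeholders, not our final calibration), the reading and the high/low classification could look like this on the Arduino side:

// Hypothetical sketch: 4-pin sound sensor with its analog output on A0
const int SOUND_PIN = A0;
const int LOW_THRESHOLD = 200;    // placeholder: above background noise
const int HIGH_THRESHOLD = 600;   // placeholder: a loud shout

void setup() {
  Serial.begin(9600);
}

void loop() {
  int level = analogRead(SOUND_PIN);   // 0-1023
  if (level > HIGH_THRESHOLD) {
    Serial.println("HIGH");            // loud enough to count as "high"
  } else if (level > LOW_THRESHOLD) {
    Serial.println("LOW");             // soft but audible
  }
  delay(50);
}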


CONCLUSIONS:

I don’t think my definition of interaction has changed much from the original one, but I have added some concepts: my definition of interaction is now the interaction between people, and between people and machines, serving people and creating value. I think the final project realized its value; users interacted with the screen through sound, and it really helped them release pressure in a short time. Some people think this project is a little strange, but I think this is Interaction Lab: what we need to do is not conform to some universal values, but create something that can bring value, and create art through physical interaction.

In the end, I let the user interact through volume. If I had more time, I would design some encouraging screens in the game interface to keep the user going, because I found that few players can keep at it. Adding some decorations to the game interface and connecting the LED into the program is also what I would do if I had more time. What I gained most from doing the project is that the design of a project should come not from a temporary interest but from long-term design. The concept is very important for a project; we need to make meaningful projects for groups that deserve special attention. I think designing concepts, caring about social problems, and imagining the future are more important than the project itself, rather than making something meaningless. Prototyping is also very important: in my opinion, the key is to constantly adjust through user testing and through feedback gathered from investigation. People first is the most important principle.

Finally, when we make a project, why should anyone care? I think designing a project for a certain group is the best way to help that group solve a problem, because every project we design is meant to solve a problem and to express and transmit the value of care, and interaction is an effective way to close the gap between different groups. Just like our project, which uses voice to attract more people to participate and pay attention to the problem of mental stress: even if the problem cannot be solved, the project is a form of art that expresses our values and people’s hearts.

Recitation 9 – Connor S.

To control some form of media in Processing using a physical component from Arduino, I decided to work with a potentiometer to change the tint of an image. For this exercise, I thought it would be appropriate to alter the Arduino logo, which I was able to access easily via a link on their site. I wrote my Processing code to read the potentiometer values sent over serial from Arduino and change the tint of the logo depending on the position of the dial:

Arduino Code:

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensor1 = analogRead(A0);
  int theMappedValue = map(sensor1, 0, 1023, 0, 255);

  // keep this format
  Serial.write(theMappedValue);
  //Serial.println(); // add linefeed after sending the last sensor value

  // too fast communication might cause some latency in Processing
  // this delay resolves the issue.
  delay(100);
}

Processing Code:

import processing.serial.*;

PImage photo;
Serial myPort;
int valueFromArduino;

void setup() {
  size(300, 250);
  photo = loadImage("http://www.arduino.cc/arduino_logo.png");
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  while (myPort.available() > 0) {
    valueFromArduino = myPort.read();
  }
  println(valueFromArduino);
  // set the tint before drawing so it applies to this frame's image
  if (valueFromArduino < 150) {
    tint(255, 0, 0);
  } else {
    tint(0, 255, 255);
  }
  image(photo, 0, 0);
}

Documentation:

From Computer Vision for Artists and Designers, I thought the Messa di Voce interactive software by Golan Levin and Zachary Lieberman (2003) was noteworthy because it presents an interaction involving physical movement in space and the use of one’s voice to affect images on a projection. I found it particularly interesting how the software uses vision algorithms and tracks audio signals to produce a corresponding image. This installation exemplifies how computers can receive information from the physical world to produce a physical response. While changing the tint of an image with a potentiometer is nowhere near as complex, I found it interesting to experience first-hand how computer programs can react to physical influences to facilitate an interactive experience.

Space Piglet off Balance – Cathy Wang – Marcela

After several discussions with our professor, we decided to make a simple game with a different game experience. Inspired by somatosensory games, we planned to create a game that needs the whole body to control. We had thought of using a camera to capture our movements, but we felt it would be too similar to existing somatosensory games. Eventually, we chose a balance board that is usually used for working out. Players need to use the whole body, especially the lower half, to control the board. Our project thus became a combination of game and exercise: working out while playing.
At the user testing session, our project was still at an early stage. All we had were a blue background, a moving brown rectangle, and a piglet controlled by us. We got lots of praise for our idea, especially for using a balance board. At the same time, our game was too simple and had few instructions. In other words, users were confused at first and easily got bored once they mastered the game. We also had no audio feedback and no score or time counting. It seemed like we had the pieces of a game, but we had not yet made it a true game. Another, more conceptual, problem was what the game is about. We cannot choose a game’s elements at random; everything should appear for a reason. We also needed to build a connection between the physical tool and the virtual image. So, we changed the shape of the brown rectangle into an ellipse with the same image as our balance board to create an echo. We also built a scenario for the game: a piglet flying in space needs to stay on its flying machine to keep safe and avoid meteorites. In this way, we have a reason why the piglet is moving and a relationship between the balance board and the game. We also changed the background to a universe scene and added music. Originally, we wanted to use a picture as the background; however, after doing that, our game became too slow to play, so we made one ourselves.
In the final presentation, we got a different inspiration for our project from a classmate’s comment. He said people use the balance board to help little kids practice their body coordination and help their legs and knee joints develop better, but the kids may get bored and refuse to do it. Our project transforms this “boring” equipment into a fun game, which could have real practical value. At the IMA Final Show, we found that our instructions were still not clear enough. Some users thought they were supposed to control the ellipse (with the same image as the balance board) instead of the piglet. Therefore, we may need to clarify how the game works or adjust the corresponding mapping.
We believe a fun game needs to create an interactive experience different from other games. By adding an element of working out, we combine kinematic movement with a visual game. We believe interaction starts at the physical level: users are more willing to engage when more physical participation is involved. Although we still need to improve many details, I believe we succeeded in most respects, judging from users’ reactions. Our project gives the game a different way to play while turning a work-out tool into a fun game.


// IMA NYU Shanghai
// Interaction Lab
// For receiving multiple values from Arduino to Processing

/*
 * Based on the readStringUntil() example by Tom Igoe
 * https://processing.org/reference/libraries/serial/Serial_readStringUntil_.html
 */

import processing.serial.*;
import processing.sound.*;
SoundFile sound;
String myString = null;
Serial myPort;

int NUM_OF_VALUES = 3;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;      /** this array stores values from Arduino **/
PImage image;            // piglet sprite
PImage image1;           // large rock sprite
PImage image2;           // small rock sprite
float x=800;             // piglet position
float y=300;
float speedX;            // piglet speed, mapped from the balance-board readings
float speedY;
int state=0;
float a=800;             // disc (flying machine) position
float b=300;
float xspeed=2;          // disc speed
float yspeed=1;
int score = 0;
float e=700;             // large rock position
float f=700;
float xspeeed = 10;      // large rock speed
float yspeeed = 6;
float xspeeeed = 5;      // small rock speed
float yspeeeed = 9;
float g=200;             // small rock position
float h=700;
int round;               // frame counter used to speed up the disc

long gameoverTime = 0;
long falloffTime = 0;
boolean isPigOnDisc = true;
boolean isPigRock=true;
PImage bgImage;


boolean start=false;


void setup() {
  fullScreen();
  //size(1200, 900);
  //size(1200, 800)
  // bgImage = loadImage("Uni.jpg");
  image = loadImage("piglet.png");
  image1 = loadImage("rockL.png");
  image2=  loadImage("rockS.png");
  setupSerial();
  sound= new SoundFile(this, "super-mario-bros-theme-song.mp3");
  sound.loop();
}


void draw() {
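  // One frame: read the serial values, draw the rotating star field, move the
  // disc and the two rocks, move the piglet from the balance-board readings,
  // then check for collisions and game-over conditions.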
  updateSerial();

  round++;

  if (start==true) {
    background(0);
    fill(#F5DE2C);
    pushMatrix();
    translate(width*0.2, height*0.5);
    rotate(frameCount / 200.0);
    star(0, 0, 5, 15, 3); 
    popMatrix();
    
    pushMatrix();
    translate(width*0.4, height*0.7);
    rotate(frameCount / 200.0);
    star(0, 0, 10, 15, 5); 
    popMatrix();
    pushMatrix();
    translate(width*0.7, height*0.4);
    rotate(frameCount / 200.0);
    star(0, 0, 10, 15, 5);
    popMatrix();

    pushMatrix();
    translate(width*0.5, height*0.5);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();

    pushMatrix();
    translate(width*0.2, height*0.2);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();
    
     pushMatrix();
    translate(width*0.5, height*0.5);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();

    pushMatrix();
    translate(width*0.15, height*0.8);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();

    pushMatrix();
    translate(width*0.8, height*0.5);
    rotate(frameCount / -100.0);
    star(0, 0, 5, 15, 5); 
    popMatrix();
    
    pushMatrix();
    translate(width*0.5, height*0.1);
    rotate(frameCount / -100.0);
    star(0, 0, 5, 15, 5); 
    popMatrix();

    pushMatrix();
    translate(width*0.5, height*0.3);
    rotate(frameCount / -100.0);
    star(0, 0, 5, 15, 5); 
    popMatrix();

    pushMatrix();
    translate(width*0.1, height*0.1);
    rotate(frameCount / -100.0);
    star(0, 0, 5, 15, 5); 
    popMatrix();
    
    pushMatrix();
    translate(width*0.6, height*0.5);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();
    
    pushMatrix();
    translate(width*0.8, height*0.2);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();




  if (dist(x, y, a, b) < 300) {   // piglet is close enough to count as being on the disc
    isPigOnDisc = true;
  }
  // background(#98D2F7);
  // image(bgImage, 0, 0, width, height);
  if (millis() % 1000 < 10) {
    // check if the file is already playing
    score += 10;
  }

  textAlign(LEFT);
  textSize(30);
  //text("millis(): " + millis(), width-200, height/2-50);
  //text("millis() % 1000: " + millis()%1000, width-200, height/2+50);
  fill(255, 0, 0);
  text("Score: " + score, 50, 50);
  //println(millis());

  //background(#98D2F7);

  fill(#F2CE58);
  noStroke();
  ellipse(a, b, 500, 500);
  fill(#4D4C4C);
  noStroke();
  ellipse(a, b, 300, 300);
  fill(#F2CE58);
  noStroke();
  ellipse(a, b, 100, 100);

  a= a+xspeed;
  b= b+yspeed;
  if (round==500) {
    xspeed = 4;
    yspeed = 2;
  }

  if (a > width-250 || a <250) {
    xspeed = -xspeed;
  }
  if (b > height-250 || b <250) {
    yspeed = -yspeed;
  }

  image(image1, e, f, 135, 100);
  e= e+1.5*xspeeed;
  f= f+yspeeed;
  if (e > width || e <1 ) {
    xspeeed = -xspeeed;
  }
  if (f > height || f <0) {
    yspeeed = -yspeeed;
  }
  if ( dist(a, b, e, f)<300) {
    xspeeed = -xspeeed;
    yspeeed = -yspeeed;
  }

  image(image2, g, h, 135, 100);
  g= g+1*xspeeeed;
  h= h+yspeeeed;
  if (g > width || g <1 ) {
    xspeeeed = -xspeeeed;
  }
  if (h > height || h <0) {
    yspeeeed = -yspeeeed;
  }
  if ( dist(g, h, a, b)<300) {
    xspeeeed = -xspeeeed;
    yspeeeed = -yspeeeed;
  }

  //if ( f >= b -500 && f<=b+500) {
  //  yspeeed = -yspeeed;
  //}
  // only move the piglet when the reading is in a plausible range
  if (sensorValues[0] > -100 && sensorValues[0] < width+100) {
    speedX = map(sensorValues[0], 100, 400, -75, 75);
    x = x + speedX;
    x = constrain(x, 0, width-80);
    speedY = map(sensorValues[1], 100, 400, -75, 75);
    y = y - speedY;
    y = constrain(y, 0, height-135);
  }
 }
image(image, x, y, 180, 200);



if (start==false) {
  fill(50);
  textSize(30);
  String s= "Stand on the balance board and press ENTER to start!                              Try to keep Piglet on the board and stay away from rocks!";
  
  fill(50);
  text(s, 500, 500, 1000, 100);
}
if (dist(x,y,e,f)<5) { //pig and rock
 background(0);
  fill(255);
  textSize(64);
  text("YOU DIED", width/2-100, height/2);
  score = 0;
    
 }
   
  
  //background(0);
  //fill(255);
  //textSize(64);
  //text("YOU DIED", width/2-100, height/2);
  //score = 0;
  //long gameoverTime = millis();
  //while (millis() - gameoverTime < 5000) {
  //  println("gameover");
  //  delay(10);
  //}
  //start = true;
  //x = width/2;
  //y = height/2;
  //println("Start Again!");
  ////delay(1000);
  ////start = false;
//}
if (dist(x, y, a, b)>350) { //when pig falls off the disc
  if (isPigOnDisc == true) {
    isPigOnDisc = false;
    falloffTime = millis();
  }
  if (millis() - falloffTime > 3000) {
    if (start == true) {
      gameoverTime = millis();
    }
    background(0);
    fill(255);
    textSize(64);
    text("YOU DIED", width/2-100, height/2);
    score = 0;
    start = false;

    if (millis() - gameoverTime > 5000) {

      start = true;
      x=500;
      y=500;
      a=500;
      b=500;
      e=700;
      f=700;
      g=200;
      h=700;
    }


    //println("Start Again!");
  }
}
}

//void gameOver() {
//  background(0);
//  fill(255);
//  textSize(64);
//  text("YOU DIED", width/2-100, height/2);
//  x=500; //pig
//  y=200; //pig
//  a=650; //disc
//  b=300; //disc
//  start = false;
//  score = 0;
//  e=100; //shark
//  f=100; //shark
//  delay(1000);
//}

void star(float x, float y, float radius1, float radius2, int npoints) {
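  // Draws an npoints-pointed star centered at (x, y), alternating between the
  // outer radius (radius2) and the inner radius (radius1).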
  float angle = TWO_PI / npoints;
  float halfAngle = angle/2.0;
  beginShape();
  for (float a = 0; a < TWO_PI; a += angle) {
    float sx = x + cos(a) * radius2;
    float sy = y + sin(a) * radius2;
    vertex(sx, sy);
    sx = x + cos(a+halfAngle) * radius1;
    sy = y + sin(a+halfAngle) * radius1;
    vertex(sx, sy);
  }
  endShape(CLOSE);
}

void keyPressed() {
  if (key == ENTER) {
    start = true;
  } else {
    start = false;
  }
}
void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----" 
  // and replace PORT_INDEX above with the index number of the port.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Our Planet – You (Ian) Xu – Inmi Lee

CONCEPTION AND DESIGN:

  • Research on Social Awareness, Inclusiveness, and Interaction

To do this project, we researched the current issue of climate change and humanity’s massive impact on it. We referred to the NASA website about the facts of climate change. NASA has already collected significant evidence showing that humans are causing severe climate change, and carbon dioxide is a fundamental cause of it. It also notes that reversing desertification and growing forests are ways to ease climate change for a while. Therefore, Vivien and I think it is significant to address this issue in an interactive way that could raise the public’s awareness. As described in the documentation before, combining Vivien’s and my definitions of “interaction,” we believe that an interactive project with open inclusiveness and significance to society is essential. The most significant research projects we referred to are bomb and teamLab.

  • Core Theme

Therefore, in this project, we want our users to get information about the accelerating pace of climate change and to guide them through a reflection on themselves: In what ways do I contribute to climate change? How could I make an effort to mitigate it?

  • Game-like Art Project – ending without winning

Our project is a game-like art project that asks the user to plant trees. In return, they can virtually compensate for their effect on climate change. However, since it is a universal consensus that humans cannot stop causing climate change unless we stop all our lives and movements, meaning death, humans must keep making an effort in the cause of environmentalism. Once they stop, the climate situation will start to get worse again. To convey this message to the users and make them aware of it, we intentionally designed the ending as a tragedy, an explosion only, meaning that there is no “winning” situation. Again, our project is NOT a game in the conventional sense but an interactive art project that engages the public in the conversation about climate change.

  • Collaborative instead of Competitive

We humans, and all creatures on the globe, are one entity. Facing climate change, we do not compete for resources between groups; rather, it is a process that requires all humans’ collaborative efforts. So, in our project, we also made it collaborative instead of competitive. Up to four users can shovel dirt to plant the same tree together at the same time to accelerate the tree’s growing speed, which results in a more positive effect in reducing the increasing carbon dioxide. As more users join in, the trees grow faster and the planet lasts longer.

  • Materials

We initially intended to use four masks to detect (or appear to detect) the users’ breath, four shovels with sensors to detect their movement, and a computer screen to show the virtual tree graphics to the users. Below is a rough sketch.

sketch for project

  • Abandon mask

However, after presenting our project to the class, we collected feedback about concerns with the masks. Even though applying masks sounds reasonable for our project, it might not make sense to users. Also, since multiple users would experience our project, it is complicated and not environmentally friendly to change the mask every time. Wearing masks may also affect the users’ comfort. Considering all these factors, we abandoned the masks and replaced them with a straightforward instruction on the screen that “human breath consumes O2 and produces CO2.”

  • Authentic experience with dirt

We first tried pairing the shoveling gesture alone with the tree-growing animation. However, for two reasons, we decided to have some solid material to shovel instead of only asking users to mime the shoveling movement. First, since users’ behaviors and movements are not predictable, we could not think of a way to use specific sensors to accurately count the number of times a user shovels. Second, it is very dull to shovel nothing. By using dirt, we can detect the movement of the shovel by detecting the changing weight of the soil, and users also have a more authentic experience of planting the tree.

  • GUI

To give users a sense of their progress in “planting trees,” we designed a GUI for users to keep track of everything. We use multiple signals to alert users during their interaction: two bars for the levels of oxygen and carbon dioxide, a changing background color, growing trees, simple text, a notification sound, and an explosion animation. Each of these components is designed to orient the user in the project as quickly as possible.
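
A minimal sketch of how the two gauge bars might be drawn in Processing (the variable names o2Level and co2Level and the bar geometry are our illustration here, not the exact code we used):

float o2Level = 80;    // hypothetical values in the range 0-100
float co2Level = 35;

void setup() {
  size(800, 600);
}

void draw() {
  background(30, 60, 90);
  drawBar("O2", o2Level, 40, color(80, 200, 120));
  drawBar("CO2", co2Level, 90, color(200, 80, 80));
}

// draw one labeled bar whose filled width follows the 0-100 level
void drawBar(String label, float level, float y, color c) {
  fill(255);
  text(label, 20, y + 15);
  noFill();
  stroke(255);
  rect(70, y, 300, 20);
  noStroke();
  fill(c);
  rect(70, y, map(level, 0, 100, 0, 300), 20);
}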

FABRICATION AND PRODUCTION:

  • Programming and Weight sensor

By referring to our course materials and online references, we did not run into any unsolved programming issues. We applied OOP, functions, media, and so on in our program. This is the first version of our program.

To detect the movement of “planting trees,” we decided to use a weight sensor. However, when I first got it from the equipment office, I had no idea how it worked.

weight sensor

Then I looked up the producer’s official page for this sensor, which explains all the information about it in detail. The only problem I met was that the sensor I got was broken, but I successfully repaired it myself. As instructed, it needs to work with a weight platform; the sketch is below.

platform

However, the page that details the platform is no longer available. We first intended to design one ourselves and 3D print it, but without detailed information it was too hard to design within a few days. We then consulted the lab assistant and received the suggestion to hang the sensor in the air instead of putting it on the ground. Therefore, in the final presentation, we took advantage of the table and fixed the sensor so that it hung from it.
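
A minimal sketch of the Arduino side, assuming the module’s amplified output is wired to A0 and the raw reading is simply forwarded to Processing over serial (the pin and timing are placeholders, not our exact calibration):

// Hypothetical sketch: weight sensor module (amplified analog output) on A0
const int WEIGHT_PIN = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(WEIGHT_PIN);   // grows as more dirt is added to the box
  Serial.println(raw);                // Processing parses this with readStringUntil('\n')
  delay(100);
}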

  • User testing: user-friendly redesign

During the user testing session, we received a lot of feedback about problems that made our project less friendly to users. First, the program was not stable. Sometimes, when the box was just swinging in the air, the program would register it as if the user had put sand/dirt into the box. This misinformation could mislead the user into shoveling sand out of the box instead of into it. So, we changed the algorithm to sense only larger-scale changes in weight and allowed the tree to grow only a little each time, regardless of how much weight is added to the box. We also planned to add a fake tree in the middle of the box to signal the user to put dirt into the big box. Second, users did not get timely feedback once they shoveled the sand into the box, so we added a notification sound to notify them. We also mirrored our small computer screen to a bigger monitor and later changed the size of some graphics in the GUI to make sure users could easily keep track of their progress. Also, since we were putting the sand into different containers, some users were misled into thinking it was a competition. However, we intended it as a cooperative job, not a competition, so we carefully relocated the containers and shovels to make the setup at least look collaborative. Last but not least, some users also found it hard to build a logical connection between sand and planting trees. To avoid confusion, we changed the sand to dirt so that the correlation between them would be straightforward enough. Through this process, I also learned how important it is to have users test our project and to collect their feedback to avoid a fixed mindset, so that we can fix design flaws and make the project more accessible and friendly to users.
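
As an illustration of that change (the threshold value, growth increment, and variable names are placeholders, not our exact code), the check could look like this in Processing; mouseY stands in for the serial reading so the sketch runs on its own:

int lastWeight = 0;          // reading from the previous frame
int weightThreshold = 50;    // placeholder: ignore small swings of the hanging box
float treeHeight = 0;        // grows a fixed amount per detected shovel event

void setup() {
  size(400, 400);
}

void draw() {
  background(200);
  // in the real project currentWeight would come from the serial port
  int currentWeight = mouseY;
  if (currentWeight - lastWeight > weightThreshold) {
    treeHeight += 5;                       // only a small, fixed growth per event
    lastWeight = currentWeight;
  }
  fill(60, 160, 60);
  rect(width/2 - 10, height - treeHeight, 20, treeHeight);   // crude "tree"
}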

  • Fabrication

Our fabrication process included drilling holes in the box, 3D printing a tree model, and building the whole project into the venue in advance. Vivien devoted a lot to this process!

3D print, sensor bonding, venue setting

Final project presentation:

user interaction

CONCLUSIONS:

The most important goal of our project is to raise the public’s awareness of and reflection on the larger issue of climate change. We want it to be an interactive project that addresses social problems, is fun to interact with, and is inclusive of many users, both those directly interacting with it and others observing it. These criteria also correspond to our understanding and definition of “interaction.” First, I think it is indeed a fun and inclusive experience for the users who interact with it: they make an effort to shovel the dirt and get constant feedback from the program. For those who are observing the interaction, it is also fun to view the whole process, which leaves a strong impression on them as well. The only pity is that the layout of the containers, dirt, shovels, and monitor may not have been ideal for making sure everyone had a comfortable place to view and move. And the users did interact with our project in the way we designed during the final presentation.

Regarding social awareness, we had some successes and some failures. According to our discussion after the demonstration, participants successfully got the basic information about humans’ severe impact on climate change and the possible actions humans can take to address it. However, we had a debate about the ending of the project. There is only one ending: O2 drops to 0 and the planet explodes. Some argue that this sends a negative message that whatever you do, it will fail in the end unless four users keep shoveling without stopping. We fully understand this pessimistic reading. However, regarding the real-world situation, we still think this reflects how severe the climate change issue is and that it does require humans’ continuous effort, or else it will cause ecological disasters. I believe this is an issue worth a full seminar discussion. We are open to any interpretation of the design of “no winning, always failure,” since that is not a flaw but part of the special design of the project.

Regardless of this debate, we think there are some improvements we could make in the future, based on other feedback from the presentation. First, as we kept running out of dirt, we could make the box work like an hourglass so that the soil could be recycled. Second, we could redesign the layout of the project, use a larger projection, and have all the dirt directly on the ground if we had a bigger venue, so that all the users and observers would have a better experience. Third, if necessary, we could add a winning situation that can hardly be achieved (this still needs more discussion, as I stated before). I also learned a lot from the process of designing and building this project, both technically and theoretically. I gained many skills in programming, problem-solving, fabrication, crafting, and so on. I learned how to make a project better fit its audience by testing and listening to feedback. This experience also deepened my understanding of interaction: it can be so flexible that it involves many characteristics. By addressing the climate change issue, I also reflected on myself and on my understanding of the issue, and it pushed me to think further and explore it.

Climate change is happening. I believe our project addresses this issue meaningfully through an innovative way of interaction. It is art, meaning that the audience may have various understandings of their own. However, the presentation of the issue and their authentic experience interacting with our project make our core theme memorable to them. Big or small, I believe we are making an impact.

Code: link to GitHub

Code: link to Google Drive

Works Cited for documentation:

“The Causes of Climate Change.” NASA. https://climate.nasa.gov/causes/

“Climate Change: How Do We Know?” NASA. https://climate.nasa.gov/evidence/

“Weight Sensor Module SKU SEN0160.” DFRobot. https://wiki.dfrobot.com/Weight_Sensor_Module_SKU_SEN0160

“Borderless World.” teamLab. https://borderless.team-lab.cn/shanghai/en/

Works Cited for programming:

KenStock. “Isolated Tree On White Background.” Pngtree. https://pngtree.com/freepng/isolated-tree-on-white-background_3584976.html

NicoSTAR8. “Explosion Inlay Explode Fire.” Pixabay. https://pixabay.com/videos/explosion-inlay-explode-fire-16640/

“success sounds (18).” Soundsnap. https://www.soundsnap.com/tags/success

“explosion.” Storyblocks. https://www.audioblocks.com/stock-audio/strong-explosion-blast-rg0bzhnhuphk0wxs3l0.html