Fnl Prjct: Reflection by ChangZhen from Inmi’s Session

Project Title: Won Color el Stage

A description of the project can be found in my previous article; there is no big deviation from the initial design. https://wp.nyu.edu/shanghai-ima-documentation/foundations/interaction-lab/cx592/fnl-prjct-essay-by-changzhen-from-inmis-session/

1. The Strategy

Three players must cooperate as much as they compete to win the game. After all, a mixed color implies input from at least two colors.

If only one player inputs too much, the score goes to one of her opponents. For example, if red alone inputs, the mix is judged purple instead of orange, so the point is counted for the purple player rather than for her.

If all three input too much, the score decreases for everyone; the larger each player's input, the worse her decrease.
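
To make the rule concrete, here is a minimal sketch of the judging logic. It assumes m1, m2, and m3 are the red, yellow, and blue players' inputs normalized to 0–1; the thresholds are the ones used in the full Processing code below.

// minimal sketch of the judging rule; m1, m2, m3 are assumed to be
// the red, yellow, and blue players' normalized inputs (0..1)
float m1 = 0.6, m2 = 0.3, m3 = 0.0; // hypothetical frame: red and yellow input, blue idle

if (m1 + m2 + m3 < 2.4) { // moderate total input: someone scores
  if (m1 + m2 > 3*m3 && m2 > 0.1) println("judged orange: red player scores");
  if (m2 + m3 > 3*m1 && m3 > 0.1) println("judged green: yellow player scores");
  if (m3 + m1 > 3*m2 && m1 > 0.1) println("judged purple: blue player scores");
} else { // everyone inputs too much
  println("overload: every score drops, in proportion to each player's input");
}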

2. Digital Design

The visuals are displayed on a 1920×1080 (16:9) TV screen. The image is composed of the MikuMikuDance video in the center, the three primary colors with each party's score on the left, the mixed color on the right, and the match status on the top.

3. Physical Design

The circuit is hidden in a triangular prism. Holes are laser-cut into each side face to hold an ultrasonic distance sensor and an LED indicating that player's input color. Each sensor reads distances from 0 to 30 cm on its side, and the brightness of the LED changes in response. The prism is made of shiny, reflective material, because it is designed to be a stage. Near each corner of its top is pasted a small paper with the anime idol printed on it.

To my surprise, the ultrasonic distance sensor requires a library and connects to a digital pin on the Arduino. The sketch is symmetrical for each player.

4. Objections and Suggestions to This Project

Marcella first suggested I get rid of the video and let the event happen entirely on the physical stage, where the idol who holds the stage would be identified by colored stage lights. Second, she pointed out that if all three players watch the screen from the same direction, the triangular prism design loses its meaning. Third, the input method might be better if it sensed the movements of the players dancing. As for the first, I regret not taking this advice, but I had limited time to make adjustments. As for the second, I think VR glasses instead of a one-sided TV would also let players play face to face, forming a triangle. Her third idea is awesome.

Christina Bowllan and her friend objected to the video content. By negligence, I didn't take feminists like them into account. If I had known that people who know little about anime, comics, and games would not accept the artistic appearance of such anime female characters dancing, I would have changed the content, just as I had dropped the idea of mixing colors to draw flags because it was political. I might have chosen landscape videos instead.

Professors suggested I reveal each party's score so that players know it's a match and can see where they stand more clearly. I made that adjustment on the IMA Festival day. Following their suggestion, I also dismantled the paper tent above the stage to make it easier on the eyes.

5. Conclusions

Marcella's and the other professors' suggestions remind me that each component of a project should have a focused purpose that accounts for what the project is about and how it is played.

The feminists' objections remind me that I should consider when and where a project is appropriate. It is intended for everyone in the context of this Interaction Lab class, so I should make it acceptable to all.

The core of this project is its strategy, which mirrors real-life situations: victory doesn't come to a single person; it comes to groups that cooperate and compete. So the strategy can be applied beyond this idol-dancing theme to create other colorful, intellectual projects.

Arduino Code

// include the distance sensor library; the sensors hook up to digital pins 2, 4, and 7
#include "Ultrasonic.h"

Ultrasonic s1(2);
Ultrasonic s2(4);
Ultrasonic s3(7);

void setup() {
  Serial.begin(9600);
}

void loop() {
  // read each distance in centimeters
  int d1 = s1.MeasureInCentimeters();
  int d2 = s2.MeasureInCentimeters();
  int d3 = s3.MeasureInCentimeters();

  // restrict the range to within 30 cm
  if (d1 > 30) {
    d1 = 30;
  }
  if (d2 > 30) {
    d2 = 30;
  }
  if (d3 > 30) {
    d3 = 30;
  }

  // control the brightness of each color LED (PWM pins 9-11) according to the distance
  analogWrite(9, 255-d1*255/30);
  analogWrite(10, 255-d2*255/30);
  analogWrite(11, 255-d3*255/30);

  // send the values to Processing over serial
  Serial.print(d1);
  Serial.print(",");
  Serial.print(d2);
  Serial.print(",");
  Serial.print(d3);
  Serial.println();
}

Processing Code

import processing.serial.*;
import processing.video.*;
import processing.sound.*;

// create instances; each movie shows the dancing idol of that color; BGM is the background music

Movie orange;
Movie green;
Movie purple;
SoundFile BGM;

String myString = null;
Serial myPort;

// 3, because each serial line carries one value per player

int NUM_OF_VALUES = 3;
int[] sensorValues;

// mixed color

color c;

// red, yel, blu are the incremental scores counted with millis(); Ora, Gre, Pur are the accumulated scores of each player

long red = 0;
long yel = 0;
long blu = 0;
long Ora = 0;
long Gre = 0;
long Pur = 0;

// play BGM only once

boolean play = true;

void setup() {
  // TV screen size
  size(1920, 1080);
  noStroke();
  background(0);
  setupSerial();

  // load the media files
  orange = new Movie(this, "orange.mp4");
  green = new Movie(this, "green.mp4");
  purple = new Movie(this, "purple.mp4");
  BGM = new SoundFile(this, "BGM.mp3");
  textSize(128);
  fill(255);
  text("mix", 1655, 450);
}

void draw() {
  updateSerial();
  printArray(sensorValues);

  // m1, m2, m3 are multipliers (each player's input amount) derived from distance
  float m1 = 1-float(sensorValues[0])/30;
  float m2 = 1-float(sensorValues[1])/30;
  float m3 = 1-float(sensorValues[2])/30;

  // play BGM only once
  if (play == true) {
    BGM.play();
    play = false;
  }

  // display each player's color and the mix on screen as rectangles
  fill(255, 255-255*m1, 255-245*m1);
  rect(0, 0, 288, 360);
  fill(255, 255-10*m2, 255-255*m2);
  rect(0, 360, 288, 360);
  fill(255-255*m3, 255-105*m3, 255);
  rect(0, 720, 288, 360);
  c = color(255-255*m3, 255-(255*m1+10*m2+105*m3)*255/370, 255-(245*m1+255*m2)*255/500);
  fill(c);
  rect(1632, 540, 288, 360);

  // judge the mixed color and who earns points at the moment;
  // whoever has the highest accumulated score gets her idol's video played
  if (m1+m2+m3 < 2.4) {
    if (m1+m2 > 3*m3 && m2 > 0.1) {
      red = millis()-Ora-Gre-Pur;
      Ora += red;
    }
    if (m2+m3 > 3*m1 && m3 > 0.1) {
      yel = millis()-Ora-Gre-Pur;
      Gre += yel;
    }
    if (m3+m1 > 3*m2 && m1 > 0.1) {
      blu = millis()-Ora-Gre-Pur;
      Pur += blu;
    }
  }
  if (m1+m2+m3 > 2.4) {
    Ora -= 100*(m1)/(m1+m2+m3);
    Gre -= 100*(m2)/(m1+m2+m3);
    Pur -= 100*(m3)/(m1+m2+m3);
    orange.stop();
    green.stop();
    purple.stop();
  }
  println(Ora);
  println(Gre);
  println(Pur);

  textSize(128);
  if (Ora > Gre && Ora > Pur) {
    if (orange.available()) {
      orange.read();
    }
    fill(0);
    rect(288, 0, 1632, 324);
    fill(255);
    text("Orange Girl on Stage", 350, 200);
    orange.play();
    green.stop();
    purple.stop();
    image(orange, 288, 324, 1344, 756);
  }
  if (Gre > Ora && Gre > Pur) {
    if (green.available()) {
      green.read();
    }
    fill(0);
    rect(288, 0, 1632, 324);
    fill(255);
    text("Green Girl on Stage", 350, 200);
    green.play();
    orange.stop();
    purple.stop();
    image(green, 288, 324, 1344, 756);
  }
  if (Pur > Gre && Pur > Ora) {
    if (purple.available()) {
      purple.read();
    }
    fill(0);
    rect(288, 0, 1632, 324);
    fill(255);
    text("Purple Girl on Stage", 350, 200);
    purple.play();
    orange.stop();
    green.stop();
    image(purple, 288, 324, 1344, 756);
  }

  // end of game: show who wins
  if (millis() >= BGM.duration()*1000) {
    background(0);
    fill(255);
    if (Ora > Gre && Ora > Pur) {
      text("Orange Girl Wins Stage!", 240, 500);
    }
    if (Gre > Ora && Gre > Pur) {
      text("Green Girl Wins Stage!", 240, 500);
    }
    if (Pur > Ora && Pur > Gre) {
      text("Purple Girl Wins Stage!", 240, 500);
    }
  }
  fill(0);
  textSize(32);
  text(int(Ora), 20, 180);
  text(int(Gre), 20, 540);
  text(int(Pur), 20, 900);
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[3], 9600);

  myPort.clear();
  myString = myPort.readStringUntil(10);
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}

void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil(10);
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Final project individual reflection (Katie)


Forest Box: build your own forest–Katie–Inmi

CONCEPTION AND DESIGN:

In terms of the interaction experience, our idea was to create something similar to VR: users perform physical interactions in the real world that result in changes on the screen. We explored several options in the design process. Inspired by Zheng Bo's work 72 relations with the golden rod, at first we wanted to attach multiple sensors to an actual plant and let users explore the different relations they can have with plants. For example, we wanted to attach an air pressure sensor to the plant so that whenever someone blows on it, the video shown on the computer screen changes, plus a pressure sensor that someone can step on.

But we ended up not choosing these options because the equipment room did not have most of the sensors we needed. We then selected the color sensor, touch sensor, distance sensor, and motion sensor. However, we did not think carefully about the physical interactions before hooking them up. The first problem was that the motion sensor did not work the way we wanted: it only senses motion and cannot identify specific gestures. As a result it sometimes conflicted with the distance sensor, so we gave it up. That left us at a very awkward stage where we had three sensors hooked up and different videos running according to the sensor values, but had difficulty linking them together.

After asking Inmi for advice, the final solution we came up with was a forest box that users can interact with, where the visuals on the screen change according to the kind of interaction. If you place fire into the box, a forest-fire video is shown on the screen. If you throw plastics into the box, a video of plastic waste is shown. If you pull up the trees, a video of a forest being cut down is shown. Through this kind of interaction, we want to convey the message that each individual's small harm to the earth can add up to huge damage. For the first scene, we use a camera with effects to attract the user's attention.


FABRICATION AND PRODUCTION:

The most significant step was to hook up the sensors, write the Arduino and Processing code, and get them to communicate. The most challenging part for me was figuring out the logic of my Processing code. I did not know how to start at first because there were too many conditions and results, and I didn't know how to arrange the if statements to achieve the output I wanted. What helped most was drawing a flow diagram of how each video would be played.

We then defined a new variable called state, with an initial value of 1.

Then the logic became clear and the work became simpler: I just needed to write down what each state does separately and then connect them together. Although the code within one state can be lengthy and difficult, the overall structure is simple and clear to me.

For example, going from state 1 to state 2 looks roughly like the sketch below.
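
This is a minimal, self-contained version of that state logic. The file names "intro.mp4" and "fire.mp4" are hypothetical, and a key press stands in for the trigger (our real conditions came from the sensor values).

import processing.video.*;

// sketch of the state-machine pattern: state 1 shows the first scene,
// a trigger switches to state 2, which plays the next video
Movie intro, fire;
int state = 1;

void setup() {
  size(640, 360);
  intro = new Movie(this, "intro.mp4"); // hypothetical file names
  fire = new Movie(this, "fire.mp4");
  intro.loop();
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  if (state == 1) {
    image(intro, 0, 0, width, height);   // state 1: first scene
    if (keyPressed && key == ' ') {      // stand-in for the sensor condition
      state = 2;                         // the trigger moves us to state 2
      intro.stop();
      fire.loop();
    }
  } else if (state == 2) {
    image(fire, 0, 0, width, height);    // state 2: play the next video
  }
}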

Another important thing: we wanted to switch from state 1 to state 2 with a key press. However, the video of state 2 only played while the key was held down; when the key was released, the state turned back to 1. To solve this problem, I created a boolean that latches the video on and off.
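
Here is a minimal sketch of that boolean latch ("video2.mp4" is a hypothetical file name): the key press flips a flag once, so the video keeps playing after the key is released instead of reverting.

import processing.video.*;

// the latch: a flag set in keyPressed() stays true, unlike testing
// keyPressed inside draw(), which reverts when the key is released
Movie video2;
boolean triggered = false;

void setup() {
  size(640, 360);
  video2 = new Movie(this, "video2.mp4"); // hypothetical file name
}

void movieEvent(Movie m) {
  m.read();
}

void keyPressed() {
  triggered = true;   // flips once and stays on after the key is released
  video2.loop();
}

void draw() {
  background(0);
  if (triggered) {
    image(video2, 0, 0, width, height);
  }
}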

At first, we wanted different users to run from far away toward the screen wearing costumes representing plastics and plants; whichever costume reached the screen first determined which video played. However, during user testing, our users said, first, that the costumes were of poor quality, and second, that the process of running was not interactive enough. Our professor also said that there was no link between what the users were physically doing (running) and what was happening on the screen (educational videos playing).

So after thinking the problem through, we created a forest box to represent the forest, so that one can interact with the different elements of the box.

In this way, what the user does physically is connected to what is shown on the screen.


CONCLUSIONS:

The goal of our project is to raise people's awareness of climate issues and make them reflect on their daily actions. The project aligns with my definition of interaction in that the output on the screen is determined by the input (the user's physical interaction). It does not align with my definition in that there is no "thinking" process between the first and second input: we have already given users the options, so there is not much exploration. I think our audience interacted with our project the way we designed.

But there are a lot of things we could improve. First, we could better design the physical interaction with the forest box and let users arrange the box the way they want. For example, we could fill the box with soil and provide different kinds of plants and other decorations, so that different users could experience the act of planting a forest together; by placing multiple color sensors at different places on the box, the visuals would change according to the different trees being planted. Second, we could first cover the surface of the box with plastics to represent today's plastic waste, and if the user gets rid of the plastics, the visuals on the screen change too. Third, we could draw the videos shown on the screen ourselves.

The most important thing I've learned is experience design. Reflecting on my design process, I realize that, for me right now, it is better to first think of an experience rather than the theme of the project. Starting with a very big, broad theme makes it difficult for me to design the experience, but if you first think of an experience, for example a game, it is easier to adapt that experience to your theme.

The second thing I've learned is coding skill. With more and more experience in coding, my coding logic has improved. For this project we had many conditions determining which video to play, so there were a lot of if statements. I felt it was a mess at first and did not know where to start; then Tristan asked me to step out of the "coding" for a second and think about the logic by drawing a flow diagram. After doing this, I felt much clearer about what I was going to do.

I think the climate issue is certainly a very serious one that everybody should care about, because climate change affects our daily lives. "Nature does not need humans. Humans need nature."

Strike!- Zhao Yang (Joseph)- Inmi


CONCEPTION AND DESIGN

For the final project, my partner Barry and I decided to make an entertaining game. Basically, we chose to model an existing aircraft game. Since it's classic and interesting, we didn't need to spend time introducing the mechanics to the users, and could spend more time making a better game. Our project not only keeps the original mechanics of the aircraft game but also adds some changes. In the original game, the user controls the aircraft to attack enemies and tries to avoid crashing into them, so that the aircraft stays alive longer and earns a higher score; health points are only lost by crashing into enemies. We changed these mechanics: in our game, if you let an enemy flee, your health points also decrease, so the game encourages the user to try their best to attack every enemy. We made this change to ensure the game ends at some point and to increase its difficulty; it is one of the creative parts of our final project. On the other hand, the way to interact with the original aircraft game is too limited. Traditionally, the only way to interact with it is to press buttons and use a joystick, so users interact only with their fingers and hands, and their sense of engagement is not strong. Thus, to improve the aircraft game, we changed the way of interaction. In our preparatory research, we found a project that uses Arduino and Processing. That project "tried to mimic the Virtual reality stuffs that happen in the movie, like we can simply wave our hand in front of the computer and move the pointer to the desired location and perform some tasks". Here is the link to that project.

https://circuitdigest.com/microcontroller-projects/virtual-reality-using-arduino

In that project, the user wears a device on his hand so that the computer can detect the motion of his hand; he can then execute commands on the computer by moving and manipulating his hand. In my opinion, the interactive experience of this project gives people a stronger sense of engagement. Hence, I thought the way of interacting with our project could engage the user's whole body. We came up with the idea that players can open their arms, imagine they are the aircraft itself, and tilt their bodies to control the movement of the aircraft. This is how we expect people to interact with our project. Therefore, we chose an accelerometer as the input of our project. This sensor detects acceleration on three axes; we use the Y-axis acceleration in particular. When the user wears the device on their wrist and waves their arm up and down, the Y-axis acceleration changes, and we map this acceleration in Processing to control the movement of the aircraft. This gives users the sense that they are really flying, and it reinforces their engagement. Moreover, the accelerometer is quite small, so we could easily build a wearable device around it, and it is convenient for users to put on and take off, letting them get into the game quickly. These are the reasons we chose this sensor and why it suited our project best. Honestly speaking, a Kinect might have been another option, and it aligns with our idea of engaging the whole body; it was actually our first choice of input medium. However, we were not allowed to use the Kinect for the final project, so we had to drop that option.
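
As a rough illustration of this mapping (not our exact code), the sketch below assumes the Arduino prints one Y-axis acceleration reading per line, roughly in the range -10 to 10 m/s², and maps it to the aircraft's horizontal position in Processing; the names shipX and accY are placeholders.

import processing.serial.*;

// map an assumed accelerometer reading from serial to the aircraft position
Serial myPort;
float shipX;

void setup() {
  size(800, 600);
  myPort = new Serial(this, Serial.list()[0], 9600); // pick your port index
  shipX = width/2;
}

void draw() {
  background(0);
  while (myPort.available() > 0) {
    String line = myPort.readStringUntil('\n');
    if (line != null) {
      float accY = float(trim(line));          // assumed range: -10..10
      shipX = map(accY, -10, 10, 0, width);    // tilt arm -> aircraft moves
    }
  }
  fill(255);
  triangle(shipX, height-80, shipX-25, height-30, shipX+25, height-30); // the aircraft
}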

FABRICATION AND PRODUCTION

Since we modeled an arcade game, we decided to laser-cut an arcade case in our production process. Covering my computer with the case gives the user the sense of playing an arcade machine instead of a computer game, and it gives our project a better look. Another significant step in production was turning the accelerometer into a wearable device; it is more responsive when worn than when simply held in the hand. Here are the images of our design of the arcade game case.

During the user test, most of the feedback was positive. A few users suggested we add the ability to move the aircraft forward to make the game more interesting. Furthermore, at that time we had only one sensor, and we found that with just one sensor the user could only move their right arm to control the aircraft. It was also a little confusing for users to open their arms and tilt their bodies to control the aircraft: even after reading the instructions, some were still confused, and without our explanation most of them couldn't properly interact with our project. Here are several videos from the user testing process.

As a result, after the user test we added another sensor to control the aircraft's forward movement. Besides, we renamed the one-player and two-player modes to easy mode and expert mode, which still aligns with our original two-player idea. With the second sensor controlling forward movement, the game becomes more difficult: it is really hard for one person to use the right arm to steer left and right while flipping the left hand to move forward and backward at the same time. If you don't want to challenge the expert mode alone, you can ask a friend to collaborate on controlling the aircraft. From my perspective, adding the second sensor was effective and successful. During the IMA End of Semester Show, with two sensors, even users who chose the easy mode (which only moves the aircraft left and right) found it more intuitive to open their arms and tilt their bodies to control the aircraft, and they could easily follow our instructions without being confused. I think our production choices were quite successful: they align with our original thought that users should interact with their whole bodies, and the sensors make the game more interesting and more engaging.

CONCLUSIONS

In conclusion, the goal of our project is to make an entertaining, interesting game so that users can have fun and spend their spare time playing it to relax. My definition of interaction is a cyclic process that requires at least two objects which affect each other. In my opinion, our project aligns well with this definition. The motion of the user's body controls the movement of the aircraft in the game, while the score and the image of the game are shown to the user immediately: that is the part where the objects affect each other. Moreover, the user has to focus on the game and keep interacting with it to get a higher score, which is the cyclic part of the process. In this sense, our project aligns with my definition of interaction. Basically, everyone who played our game interacted the way we expected; only sometimes did users skip the instructions we provided and become confused about how to interact. If we had more time, we would come up with more innovative ideas for the mechanics of the game: since our mechanics are quite similar to the original game, users might not find much novelty in it. If it were allowed, we would like to try the Kinect as the means of interaction, because directly detecting motion would make the connection between the interaction and the game itself clearer; after all, our idea of engaging the whole body fits the Kinect better than the accelerometer. I've learned a lot from our final project. For instance, we had to test the game ourselves, again and again, to make sure users experience the best version of it, and we spent a lot of time debugging. Besides, to laser-cut the arcade case, I learned how to use Illustrator. Most importantly, when creating an interactive project, the user is always the first thing to consider.

Code

https://github.com/JooooosephY/Interaction-Leb-Final-Project

Save Me From Plastic–Lifan Yu–Inmi Lee

 Brief introduction:

This game focuses on the environmental problem of ocean plastic pollution. We designed an interactive game that encourages people to use fewer plastic products and to treat plastic waste in ways that do less harm to our environment (and ourselves): stop dumping it into the natural environment!

 CONCEPTION AND DESIGN:

In general, when I think of how users are to interact with my project, I would like them to move around, be actively engaged, and achieve the winning result with combined effort.

First, to move around. When we thought about how users could interact with images of falling plastic trash on the screen, we first considered push buttons, but buttons don't get users to move around. So we decided that half of the screen should have a real-time camera image as the background. This way the users' own image appears on the screen along with the falling plastic, and users can move around, changing their position in the image to interact with the falling pieces laid on top of the camera image. To determine whether a user's image has touched a plastic piece, a threshold is set: when a falling image passes over a spot whose color is darker than the threshold, the image disappears. The result is that when users wear dark-colored gloves and their on-screen image "touches" the falling trash with those gloves, it counts as having "successfully blocked the plastic from falling into the sea" (dark-colored clothes also work). This is effective in encouraging users to move around.
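
Here is a minimal sketch of this threshold mechanic with assumed values (a brightness threshold of 60 and a single falling piece drawn as a circle); our real project tracked many pieces and drew trash images instead.

import processing.video.*;

// a falling piece is "caught" when the camera pixel under it is darker
// than the threshold, e.g. covered by a dark glove
Capture cam;
float px, py = 0, thresholdVal = 60; // assumption: brightness scale 0..255

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  px = random(width);
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
  fill(255, 200, 0);
  ellipse(px, py, 30, 30);             // the falling "plastic piece"
  py += 3;
  color under = cam.get(int(px), int(constrain(py, 0, height-1)));
  if (brightness(under) < thresholdVal || py > height) {
    py = 0;                            // caught by a dark glove (or missed):
    px = random(width);                // respawn at the top
  }
}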

Second, to be actively engaged. This is a continuous, quick-paced game: plastic trash falls from the sky constantly, and the pieces already in the ocean keep moving around. Users have to stay focused in order not to let the fish die. At first we did not include the already-existing plastic in the ocean, but this was problematic: if one player was very good at "catching" all the falling pieces, the other player didn't need to control the fish at all. So we added extra plastic pieces in the sea to encourage both users to play actively from the beginning.

Third, to win this game with combined effort. We once thought of making a one-player game: as more plastic falls into the sea, the color of the seawater changes and dead fish appear, and when the dead fish reach a certain number, the game is over. However, we wanted to offer players another experience from the fish's viewpoint, so we added the fish part.

We wish to raise environmental awareness by providing our users with information about ocean plastic pollution. One option we rejected was a candy box that automatically opens when players win: we thought of printing calls to action and scientific facts about current environmental conditions on the candy wrappers, so that users would be more willing to take in the information. However, our motor didn't work well in the device, so we didn't add it to our project.

FABRICATION AND PRODUCTION:

In user testing, we received suggestions to add an ocean video to the background of the "fish" part. We were also advised to provide users with gloves in colors that rarely appear on clothes; in our code we could then detect the color of the gloves, so that as soon as they overlap a falling plastic piece, the piece disappears. Back then we hadn't added the already-existing plastic pieces in the ocean, so all the players just focused on "blocking" falling pieces; nobody went to control the fish's movements.

We later added an ocean video to the background. However, this didn't work very well because our video showed fish swimming: during the presentation, some users reported being confused about which fish they were controlling, the fish in the background video or the hand-drawn fish.

We bought bright pink gloves and black gloves, and ultimately provided the black ones. This turned out to be the wrong choice, because one of our users wore black clothes and could very easily "catch" the falling plastic pieces just by stretching out her hands.

CONCLUSIONS:

We tried to use a fun, interactive game to raise environmental awareness among our users. My definition of interaction is that people and a device can receive information from, and provide feedback to, each other, and this process is better if the communication can go on and on as both sides keep interacting. My project lets users receive information by seeing the falling plastic pieces and trying to block them from falling into the sea; when one piece is blocked, other pieces keep falling, so people can keep interacting until the game ends. Within the game itself, a camera detects the users' movements and decides whether their image on the screen actually "touched" a plastic piece. Depending on what the users do (successfully catching the trash pieces or accidentally letting them fall into the ocean), the game gives different feedback: when a plastic piece falls into the sea, it starts moving randomly in the "sea area" and threatens the fish, and if users don't steer the fish away from the plastic, it dies. Different user actions lead to winning or to game over. The whole process is an active, interactive one.

If we had more time, we would change our key presses into pushbuttons: several neat little buttons on laser-cut boards in front of the users, with captions like "start", "restart", etc.

We could also use a separate camera connected to the computer and hide the computer away. That way we could adjust the distance between the camera and the user, instead of placing the computer right in front of them (which made their image very large, so they could too easily "catch" the plastic pieces). The whole project would then look simpler and neater.

We could also add sound effects, so that a sound plays as soon as a user "catches" a piece of plastic trash.

Finally, the page containing scientific data and the call to action could be shown before the game starts instead of after it ends, because users are usually too busy restarting the game to stop and read those words.

I've learned that if I want to raise awareness, I should focus more on making everyone feel "I really should care." Our project simply stated the current disastrous situation of ocean animals; it didn't show how this affects us, and it wasn't touching enough to encourage people to care. Moreover, it would be better if we made the whole thing prettier, so that people are interested in interacting with it as soon as they see it.

I was glad that some people really liked the "blocking the falling plastic pieces" part; they thought it was interesting. One person even researched plastic pollution after user testing and decided to buy fewer drinks in plastic bottles and use fewer plastic products. If I have another chance in the future, I hope to refine the whole game and make it more attractive, so that we can let more people learn about this serious environmental issue in a fun way. Taking action to slow the worsening of our environment can't wait another day, but if I directly call on everyone to take action, no one will be willing. A project like ours can help interest people in an often-ignored environmental problem that actually decides whether we and the ocean animals live or die.

PART OF OUR CODE

(the part where the fish is alive, and the code that decides the winning and losing of the game)

//PART OF OUR CODE
//(the part where the fish is alive & the code that decides winning and losing)

if (ok==true) {
  // draw and move every plastic piece already in the sea
  for (int k=0; k<plasticList.size(); k++) {
    Plastic temp=plasticList.get(k);
    temp.display();
    temp.move();
  }

  // keep the fish inside the sea area (the lower half of the screen)
  if (fishY<=height/2+10) {
    fishY=height/2+10;
  }
  if (fishY>=height-40) {
    fishY=height-40;
  }
  if (fishX<=30) {
    fishX=30;
  }
  if (fishX>=width-30) {
    fishX=width-30;
  }

  // steer the fish with the four button values sent from Arduino
  if (sensorValues[0]==1) {
    fishY-=fishSpeed;
  }
  if (sensorValues[1]==1) {
    fishY+=fishSpeed;
  }
  if (sensorValues[2]==1) {
    fishX-=fishSpeed;
  }
  if (sensorValues[3]==1) {
    fishX+=fishSpeed;
  }

  image(img1, fishX, fishY, 100, 100);
}

// when the fish touches a plastic piece, it dies
for (int j=0; j<plasticList.size(); j++) {
  Plastic p=plasticList.get(j);
  float dis = sqrt((fishX-p.x)*(fishX-p.x)+(fishY-p.y)*(fishY-p.y))-p.size*0.4;
  if (dis<=0) {
    ok = false;
  }
}

if (ok==false) { //if the fish dies, game over: users lose the game
  image(img6, width/2, height/2, width, height);
  //println("show game over");
}

if (key == 'c' && ok == false) { //see the facts page
  image(img8, width/2, height/2, width, height);
  //println("show facts");
}

if (key=='s' && ok==false) { //restart after losing the game
  ok=true;
  win = 0;
  start = 0;
  int l = plasticList.size();
  for (int i = l-1; i>=0; i--) {
    plasticList.remove(i);
  }
  int r = fallingplastics.size();
  for (int s = r-1; s>=0; s--) {
    fallingplastics.remove(s);
  }
}

if (win == 1) { //once the users have won, the fish stays alive whatsoever
  //myPort.write('1');
  ok = true;
  fill(255, 34, 899);
  rect(0, 0, width, height);
  textSize(40);
  fill(376, 678, 222);
  text("You Won!!Press 's' to restart", width/3, height*2/3);
  textSize(45);
  fill(255);
  text("You won the game.", width/2, height/5);
  textSize(30);
  text("But there are countless fish that survive by chance like this", width/2, height/4);
  text("We play a crucial role in deciding their lives and deaths", width/2, height/3);
  text("Less Plastic, less disaster", width/2, height/2);
}

if (fallingplastics.size()>=fallingplasticsWinning && ok==true) { //after enough pieces have fallen, users win
  win = 1;
  fill(255, 34, 899);
  rect(0, 0, width, height);
  textSize(40);
  fill(376, 678, 222);
  text("You Won!!Press 's' to restart", width/3, height/2);
  fill(255);
  text("Save our ocean!", width/2, height/4);
  text("Less Plastic, less disaster", width/2, height/3);
}

if (keyPressed && key=='s' && win==1) { //restart the game after winning
  ok = true;
  win = 0;
  start = 0;
  int l = plasticList.size();
  for (int i = l-1; i>=0; i--) {
    plasticList.remove(i);
  }
  int r = raindrops.size();
  for (int s = r-1; s>=0; s--) {
    raindrops.remove(s);
  }
}
}
}


class Plastic {
  float x, y;
  float size;
  float speedX;
  float speedY;

  Plastic(float startingX, float startingY) {
    x=startingX;
    y=startingY;
    size=random(50, 150);
    speedX=random(-10, 10);
    speedY=random(1, 15);
  }

  void display() {
    image(img2, x, y, size, size);
  }

  void move() {
    x+=speedX;
    y+=speedY;
    // bounce off the left and right edges of the screen
    if (x<=0||x>=width) {
      speedX=-speedX;
    }
    // bounce between the sea surface (height/2) and the bottom
    if (y<=height/2||y>=height) {
      speedY=-speedY;
    }
  }
}

Final Project–The epic of us–Ketong Chen–Inmi

For this final project, I teamed up with Tya to build our project "The epic of us". I learned a lot during the process of making it, and I want to thank my professor, my best partner Tya, and all the assistants who gave me a lot of help.

CONCEPTION AND DESIGN:

Since our project is a board game in which two players role-play the leaders of two civilizations, we wanted them to be able to attack each other for the sake of their countries' development. We first thought about using an FSR sensor to build a device for the user to hit, with the force applied determining the damage dealt to the other civilization. To use the FSR, we needed to figure out how to store the maximum value the sensor reads during a certain period of time, but we later found the sensor hard to control (the same problem we met during the midterm project). Since we already had two buttons for the players to begin the game, we decided to use the buttons for all interactions: the number of times a button is pressed determines the damage dealt.

To make the game more engaging, we put 48 LEDs on our board, which nearly drove us crazy. We first tried to use a chip called the 74HC595 to control 8 LEDs with 3 pins from the Arduino (since we needed 48 LEDs and there were not enough pins), but after several days of struggling it did not work; a sketch of what we were attempting is below. Though really discouraged, we still wanted LEDs on the board to show the players' steps, so in the end we used an Arduino Mega, which has 54 pins in total, to connect the LEDs. At first the Mega did not work for some reason; after asking a fellow, I learned that I should not connect pins 0 and 1 to LEDs, since they are used by the Mega to talk to the computer. Also, the LEDs were not stable and the wires kept falling off, so we had to check and fix them frequently. As for the material of the board, we first used cardboard, but we were not satisfied because it was a little soft and not delicate enough. We used wood in the end, and to make the board larger we used two wooden boards and stuck them together.
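
For reference, here is a minimal sketch of the 74HC595 approach we were attempting (pin numbers and wiring are assumptions, and this is not our final solution). Three Arduino pins drive the chip: shiftOut() pushes one byte into the register, and latching it updates the eight LED outputs Q0–Q7 at once.

// minimal 74HC595 sketch (assumed wiring): 3 Arduino pins -> 8 LEDs
const int dataPin = 2;  // to 74HC595 DS (serial data)
const int latchPin = 3; // to 74HC595 ST_CP (latch)
const int clockPin = 4; // to 74HC595 SH_CP (clock)

void setup() {
  pinMode(dataPin, OUTPUT);
  pinMode(latchPin, OUTPUT);
  pinMode(clockPin, OUTPUT);
}

void loop() {
  // light the eight LEDs one by one, as if marking a player's steps
  for (int i = 0; i < 8; i++) {
    digitalWrite(latchPin, LOW);
    shiftOut(dataPin, clockPin, MSBFIRST, 1 << i); // one bit per LED
    digitalWrite(latchPin, HIGH); // latch: all outputs update together
    delay(300);
  }
}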

FABRICATION AND PRODUCTION:

We laser-cut many pictures onto our board to decorate it and to show the stages of development of human societies. It was a very annoying job to turn all the pictures into a format the machine can recognize; I asked a fellow for help, and the link here was very useful. Due to the different versions of Illustrator and Photoshop, I at first had difficulty deleting the white color in the pictures, but I later figured it out. During user testing, because we had failed to figure out the 74HC595 chip, we did not use LEDs to show the players' steps; we had to use two physical characters and let the users move them by themselves. The problem then emerged that the game board was separate from the computer screen, so when users moved their characters they had to turn away to look at the board, which caused them to miss some instructions on the screen. So we were determined to add the LEDs and combine the screen and the board to create a better game, and we later laser-cut a bigger board with holes for the LEDs. While watching users play, we also found that the instructions changed too quickly for someone playing for the first time, so we made adjustments to make them more readable.

CONCLUSIONS:

According to my earlier definition of interaction — a cyclic process that requires at least two objects (animate or inanimate), each with its own input-analyze-output, where the whole process should be meaningful — our project fits. It aims to make people aware that a better way to develop is to collaborate rather than fight each other: if people fight over the resources they want, they will go astray together. Since it is a board game, it has a cyclic process and a meaning behind it, which aligns with my definition of interaction. When people interacted with our project, they did not hesitate to attack each other, and they were surprised to see both civilizations destroyed in the end. But later someone said that we did not give the players a clear instruction that they could choose not to attack. We wanted people to choose between attacking and not attacking, but it turned out they never had the intention to choose. That is a point we need to think about further and improve; usually people will not do what you expect them to do, and that is why there is always room for improvement. If we had more time, we would make it clear that players have the choice not to attack other civilizations. From my perspective, I am happy that we conveyed our idea to people. I hope our project has brought fun and deep thoughts.

The picture for our project: