To control some form of media in Processing with a physical component connected to an Arduino, I decided to use a potentiometer to change the tint of an image. For this project, I thought it would be appropriate to alter the Arduino logo, which I was able to download easily via a link on their site. I wrote my Processing code to read the potentiometer values sent over serial by the Arduino and change the tint of the logo from black to blue depending on the position of the dial:
Arduino Code:
void setup() {
Serial.begin(9600);
}
void loop() {
int sensor1 = analogRead(A0);
int theMappedValue = map(sensor1, 0, 1023, 0, 255);
// keep this format
Serial.write(theMappedValue);
//Serial.println(); // add linefeed after sending the last sensor value
// too fast communication might cause some latency in Processing
// this delay resolves the issue.
delay(100);
}
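The Processing sketch itself is not reproduced here; a minimal sketch of how the incoming serial byte can drive the tint might look like this (the image filename and port index are placeholders, not necessarily the exact ones used):

import processing.serial.*;

Serial myPort;
PImage logo;
int sensorValue = 0;  // last byte received from the Arduino, 0-255

void setup() {
  size(600, 400);
  logo = loadImage("arduino-logo.png");              // placeholder filename
  myPort = new Serial(this, Serial.list()[0], 9600); // port index may differ per machine
}

void draw() {
  background(255);
  while (myPort.available() > 0) {
    sensorValue = myPort.read();  // single byte sent by Serial.write() on the Arduino
  }
  tint(0, 0, sensorValue);        // 0 = black, 255 = fully blue
  image(logo, 0, 0, width, height);
}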
From Computer Vision for Artists and Designers, I thought the "Messa di Voce" interactive software by Golan Levin and Zachary Lieberman (2003) was noteworthy because it presented an interaction involving physical movement in space and the use of one's voice to affect images on a projection. I found it particularly interesting how the software uses vision algorithms and tracks audio signals to produce a corresponding image. This installation exemplifies how computers can receive information from the physical world and produce a response to it. While changing the hue of an image with a potentiometer is nowhere near as complex, I found it interesting to experience first-hand how computer programs can react to physical input to facilitate an interactive experience.
Space Piglet off Balance—Cathy Wang—Marcela
After several discussions with our professor, we decided to make a simple game that offers a different game experience. Inspired by somatosensory games, we planned to create a game that requires the whole body to control. We thought of using a camera to capture our movements, but we felt that would be too similar to existing somatosensory games. Eventually, we chose a balance board that is usually used for working out. The player needs to use the whole body, especially the lower half, to control the board. Our project thus became a combination of game and exercise: working out while playing.
During the user testing session, our project was still at an early stage. All we had were a blue background, a moving brown rectangle, and a piglet controlled by the player. We got a lot of praise for our idea, especially for using a balance board. At the same time, our game was too simple and had few instructions. In other words, users were confused at first and easily got bored once they mastered the game. We also had no audio feedback and no score or time counting. It seemed like we had the pieces of a game but had not yet made it a true game. Another, more conceptual problem was what the game is about. We could not choose the game's elements at random; everything should appear for a reason. We also needed to build a connection between the physical tool and the virtual image. So we changed the brown rectangle into an ellipse with the same look as our balance board to echo it. We also built a scenario for the game: a piglet flying through space needs to stay on its flying machine to keep safe and avoid asteroids. This way, there is a reason why the piglet is moving and a clear relationship between the balance board and the game. We also changed the background to outer space and added music. Originally, we wanted to use a picture as the background, but after doing that our game became too slow to be playable, so we drew one ourselves.
In the final presentation, we got a different kind of inspiration for our project from a classmate's comment. He said people use balance boards to help little kids practice body coordination and help their leg and knee joints develop, but the kids may get bored and refuse to do it. Our project turns this "boring" equipment into a fun game, which could have real practical value. At the IMA Final Show, we found that our instructions were still not clear enough. Some users thought they were supposed to control the ellipse (which has the same look as the balance board) instead of the piglet. Therefore, we may need to clarify how the game works or adapt the corresponding relation.
We believe a fun game needs to create an interactive experience different from other games. By adding a workout element, we combine kinetic movement with a visual game. We believe interaction starts at a physical level: users are more likely to engage when more physical participation is involved. Although many details still need improvement, I believe we succeeded in most respects judging from users' reactions. Our project gives the game a different way to be played while turning a workout tool into a fun game.
// IMA NYU Shanghai
// Interaction Lab
// For receiving multiple values from Arduino to Processing
/*
 * Based on the readStringUntil() example by Tom Igoe
 * https://processing.org/reference/libraries/serial/Serial_readStringUntil_.html
 */
import processing.serial.*;
import processing.sound.*;
SoundFile sound;
String myString = null;
Serial myPort;
int NUM_OF_VALUES = 3; /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues; /** this array stores values from Arduino **/
PImage image;
PImage image1;
PImage image2;
float x=800;
float y=300;
float speedX;
float speedY;
int state=0;
float a=800;
float b=300;
float xspeed=2;
float yspeed=1;
int score = 0;
float e=700;
float f=700;
float xspeeed = 10;
float yspeeed = 6;
float xspeeeed = 5;
float yspeeeed = 9;
float g=200;
float h=700;
int round;
long gameoverTime = 0;
long falloffTime = 0;
boolean isPigOnDisc = true;
boolean isPigRock=true;
PImage bgImage;
boolean start=false;
void setup() {
fullScreen();
//size(1200, 900);
//size(1200, 800);
//bgImage = loadImage("Uni.jpg");
image = loadImage("piglet.png");
image1 = loadImage("rockL.png");
image2= loadImage("rockS.png");
setupSerial();
sound= new SoundFile(this, "super-mario-bros-theme-song.mp3");
sound.loop();
}
void draw() {
updateSerial();
round++;
if (start==true) {
background(0);
fill(#F5DE2C);
pushMatrix();
translate(width*0.2, height*0.5);
rotate(frameCount / 200.0);
star(0, 0, 5, 15, 3);
popMatrix();
pushMatrix();
translate(width*0.4, height*0.7);
rotate(frameCount / 200.0);
star(0, 0, 10, 15, 5);
popMatrix();
pushMatrix();
translate(width*0.7, height*0.4);
rotate(frameCount / 200.0);
star(0, 0, 10, 15, 5);
popMatrix();
pushMatrix();
translate(width*0.5, height*0.5);
rotate(frameCount / 200.0);
star(0, 0, 3, 9, 3);
popMatrix();
pushMatrix();
translate(width*0.2, height*0.2);
rotate(frameCount / 200.0);
star(0, 0, 3, 9, 3);
popMatrix();
pushMatrix();
translate(width*0.5, height*0.5);
rotate(frameCount / 200.0);
star(0, 0, 3, 9, 3);
popMatrix();
pushMatrix();
translate(width*0.15, height*0.8);
rotate(frameCount / 200.0);
star(0, 0, 3, 9, 3);
popMatrix();
pushMatrix();
translate(width*0.8, height*0.5);
rotate(frameCount / -100.0);
star(0, 0, 5, 15, 5);
popMatrix();
pushMatrix();
translate(width*0.5, height*0.1);
rotate(frameCount / -100.0);
star(0, 0, 5, 15, 5);
popMatrix();
pushMatrix();
translate(width*0.5, height*0.3);
rotate(frameCount / -100.0);
star(0, 0, 5, 15, 5);
popMatrix();
pushMatrix();
translate(width*0.1, height*0.1);
rotate(frameCount / -100.0);
star(0, 0, 5, 15, 5);
popMatrix();
pushMatrix();
translate(width*0.6, height*0.5);
rotate(frameCount / 200.0);
star(0, 0, 3, 9, 3);
popMatrix();
pushMatrix();
translate(width*0.8, height*0.2);
rotate(frameCount / 200.0);
star(0, 0, 3, 9, 3);
popMatrix();
if (dist(x, y, a, b)<300) {
isPigOnDisc = true;
}
//background(#98D2F7);
//image(bgImage, 0, 0, width, height);
if (millis() % 1000 < 10) {
// check if the file is already playing
score += 10;
}
textAlign(LEFT);
textSize(30);
//text("millis(): " + millis(), width-200, height/2-50);
//text("millis() % 1000: " + millis()%1000, width-200, height/2+50);
fill(255, 0, 0);
text("Score: " + score, 50, 50);
//println(millis());
//background(#98D2F7);
fill(#F2CE58);
noStroke();
ellipse(a, b, 500, 500);
fill(#4D4C4C);
noStroke();
ellipse(a, b, 300, 300);
fill(#F2CE58);
noStroke();
ellipse(a, b, 100, 100);
a= a+xspeed;
b= b+yspeed;
if (round==500) {
xspeed = 4;
yspeed = 2;
}
if (a > width-250 || a <250) {
xspeed = -xspeed;
}
if (b > height-250 || b <250) {
yspeed = -yspeed;
}
image(image1, e, f, 135, 100);
e= e+1.5*xspeeed;
f= f+yspeeed;
if (e > width || e <1 ) {
xspeeed = -xspeeed;
}
if (f > height || f <0) {
yspeeed = -yspeeed;
}
if ( dist(a, b, e, f)<300) {
xspeeed = -xspeeed;
yspeeed = -yspeeed;
}
image(image2, g, h, 135, 100);
g= g+1*xspeeeed;
h= h+yspeeeed;
if (g > width || g <1 ) {
xspeeeed = -xspeeeed;
}
if (h > height || h <0) {
yspeeeed = -yspeeeed;
}
if ( dist(g, h, a, b)<300) {
xspeeeed = -xspeeeed;
yspeeeed = -yspeeeed;
}
//if ( f >= b -500 && f<=b+500) {
//  yspeeed = -yspeeed;
//}
if (sensorValues[0]>-100 || sensorValues[0]<width+100) {
speedX = map(sensorValues[0], 100, 400, -75, 75);
x= x +1*speedX;
x = constrain(x, 0, width-80);
speedY = map(sensorValues[1], 100, 400, -75, 75);
y = y-1.*speedY;
y = constrain(y, 0, height-135);
}
}
image(image, x, y, 180, 200);
if (start==false) {
fill(50);
textSize(30);
String s= "Stand on the balance board and press ENTER to start! Try to keep Piglet on the board and stay away from rocks!";
fill(50);
text(s, 500, 500, 1000, 100);
}
if (dist(x, y, e, f)<5) { //pig and rock
background(0);
fill(255);
textSize(64);
text("YOU DIED", width/2-100, height/2);
score = 0;
}
//background(0);
//fill(255);
//textSize(64);
//text("YOU DIED", width/2-100, height/2);
//score = 0;
//long gameoverTime = millis();
//while (millis() - gameoverTime < 5000) {
//  println("gameover");
//  delay(10);
//}
//start = true;
//x = width/2;
//y = height/2;
//println("Start Again!");
////delay(1000);
////start = false;
//}
if (dist(x, y, a, b)>350) { //when pig falls off the disc
if (isPigOnDisc == true) {
isPigOnDisc = false;
falloffTime = millis();
}
if (millis() - falloffTime > 3000) {
if (start == true) {
gameoverTime = millis();
}
background(0);
fill(255);
textSize(64);
text("YOU DIED", width/2-100, height/2);
score = 0;
start = false;
if (millis() - gameoverTime > 5000) {
start = true;
x=500;
y=500;
a=500;
b=500;
e=700;
f=700;
g=200;
h=700;
}
//println("Start Again!");
}
}
}
//void gameOver() {
//  background(0);
//  fill(255);
//  textSize(64);
//  text("YOU DIED", width/2-100, height/2);
//  x=500; //pig
//  y=200; //pig
//  a=650; //disc
//  b=300; //disc
//  start = false;
//  score = 0;
//  e=100; //shark
//  f=100; //shark
//  delay(1000);
//}
void star(float x, float y, float radius1, float radius2, int npoints) {
float angle = TWO_PI / npoints;
float halfAngle = angle/2.0;
beginShape();
for (float a = 0; a < TWO_PI; a += angle) {
float sx = x + cos(a) * radius2;
float sy = y + sin(a) * radius2;
vertex(sx, sy);
sx = x + cos(a+halfAngle) * radius1;
sy = y + sin(a+halfAngle) * radius1;
vertex(sx, sy);
}
endShape(CLOSE);
}
void keyPressed() {
if (key == ENTER) {
start = true;
} else {
start = false;
}
}
void setupSerial() {
printArray(Serial.list());
myPort = new Serial(this, Serial.list()[0], 9600);
// WARNING!
// You will definitely get an error here.
// Change the PORT_INDEX to 0 and try running it again.
// And then, check the list of the ports,
// find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----"
// and replace PORT_INDEX above with the index number of the port.
myPort.clear();
// Throw out the first reading,
// in case we started reading in the middle of a string from the sender.
myString = myPort.readStringUntil( 10 ); // 10 = '\n' Linefeed in ASCII
myString = null;
sensorValues = new int[NUM_OF_VALUES];
}
void updateSerial() {
while (myPort.available() > 0) {
myString = myPort.readStringUntil( 10 ); // 10 = '\n' Linefeed in ASCII
if (myString != null) {
String[] serialInArray = split(trim(myString), ",");
if (serialInArray.length == NUM_OF_VALUES) {
for (int i=0; i<serialInArray.length; i++) {
sensorValues[i] = int(serialInArray[i]);
}
}
}
}
}
Research on Social Awareness, Inclusiveness, and Interaction
For this project, we researched the current issues of climate change and humans' massive impact on it. We referred to the NASA website on the facts of climate change. NASA has collected significant evidence showing that humans are causing severe climate change, with carbon dioxide as a fundamental cause. It also notes that combating desertification and preserving forests are ways to ease climate change for a while. Therefore, Vivien and I thought it was important to address this issue in an interactive way that could raise public awareness. As described in the earlier documentation, combining Vivien's and my definitions of "interaction," we believe that an interactive project with open inclusiveness and significance to society is essential. The most significant research projects we referred to were bomb and teamLab.
Core Theme
Therefore, in this project, we want our users to understand the accelerating pace of climate change and guide them through a reflection on themselves: How have I contributed to climate change? How could I make an effort to mitigate it?
Game-like Art Project – ending without winning
Our project is a game-like art project that asks the user to plant trees; in return, they can virtually compensate for their effect on climate change. However, since it is a universal consensus that humans cannot stop contributing to climate change unless we stop all our lives and activities, which would mean death, humans must keep putting effort into environmentalism. Once they stop, the climate situation will start to worsen again. To convey this message to users and make them aware of it, we intentionally designed the ending as a tragedy, an explosion only, meaning that there is no "winning" situation. Again, our project is NOT a game in the conventional sense but an interactive art project that engages the public in the conversation about climate change.
Collaborative instead of Competitive
We humans, together with all the creatures on the globe, are one entity. Facing climate change, we do not compete for resources within groups; rather, it is a process that requires all of humanity's collaborative effort. So in our project, we also made it collaborative instead of competitive. Up to four users can shovel dirt to plant the same tree at a time to accelerate its growth, which results in a more positive effect in reducing the rising carbon dioxide. As more users join in, the trees grow faster and the planet lasts longer.
Materials
We initially intended to use four masks to detect (or appear to detect) the users' breath, four shovels with sensors to detect their movement, and the computer screen to show the virtual tree graphics to the users. Below is a rough sketch of them.
Abandoning the masks
However, after presenting our project to the class, we collected feedback about concerns with the masks. Even though using masks sounds reasonable for our project, it might not make sense to users. Also, since multiple users would experience our project, it would be cumbersome and not environmentally friendly to change the mask every time, and wearing masks might also compromise users' comfort. Considering all these factors, we abandoned the masks and replaced them with a straightforward instruction on the screen: "human breath consumes O2 and produces CO2."
Authentic experience with dirt
We tried pairing the shoveling motion with the tree-growing animation alone. However, for two reasons, we decided to have some solid material to shovel instead of only asking the users to mime the movement of shoveling. First, since users' behaviors and movements are not predictable, we could not find a way to use specific sensors to count shovel strokes accurately. Second, it is simply dull to shovel nothing. By using dirt, we can detect the shoveling movement by detecting the changing weight of the soil, and users also have a more authentic experience of planting a tree.
GUI
To give users a sense of their progress in "planting trees," we designed a GUI that lets them keep track of everything. We use multiple signals to alert users during their interaction: two bars for the levels of oxygen and carbon dioxide, a changing background color, growing trees, simple texts, a notification sound, and an explosion animation. Each of these components is designed to orient the user to the project as quickly as possible.
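As a rough illustration of the two gas bars (the variable names and numbers here are placeholders rather than our exact code), the drawing logic in Processing could look like this:

float o2 = 80;   // current oxygen level, 0-100 (assumed scale)
float co2 = 30;  // current carbon dioxide level, 0-100

void setup() {
  size(800, 600);
}

void draw() {
  background(200, 230, 255);
  fill(0);
  textSize(16);
  text("O2", 50, 45);
  text("CO2", 50, 95);
  // bar lengths mapped from the current gas levels
  fill(0, 180, 0);                              // oxygen bar in green
  rect(100, 30, map(o2, 0, 100, 0, 300), 20);
  fill(120);                                    // carbon dioxide bar in gray
  rect(100, 80, map(co2, 0, 100, 0, 300), 20);
}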
FABRICATION AND PRODUCTION:
Programming and Weight sensor
By referring to our course materials and online references, we did not run into any unsolvable programming issues. We applied OOP, functions, media, etc. in our program. This is the first version of our program.
To detect the movement of "planting trees," we decided to use a weight sensor. However, when I first got it from the equipment office, I had no idea how it worked.
Then I looked up the producer's official page for this sensor, which explains everything about it in detail. The only problem I ran into was that the sensor I got was broken, but I managed to repair it myself. As instructed, it needs to work with a weight platform. The sketch is below.
However, the page that detailed the platform is no longer available. We first intended to design one ourselves and 3D print it, but without detailed information it was too hard to design within a few days. We then consulted the lab assistant and received the suggestion to hang the sensor in the air instead of putting it on the ground. Therefore, in the final presentation, we used the table to hang the sensor from.
User testing: user-friendly redesign
During the user testing session, we received a lot of feedback about problems that made our project less friendly to users. First, the program was not stable: sometimes, when the box was just swinging in the air, the program would register it as the user putting sand/dirt into the box. This misinformation could mislead users into shoveling sand out of the box instead of into it. So we changed the program to respond only to larger changes in weight, and allowed the tree to grow only a little each time regardless of how much weight is added to the box (a sketch of this logic is below). We also planned to add a fake tree in the middle of the box to signal the user to put dirt into the big box. Second, users did not get timely feedback once they shoveled sand into the box, so we added a notification sound. We also mirrored our small computer screen to a bigger monitor and changed the size of some graphics in the GUI later in the presentation so the user could track their progress easily. Also, since users were taking sand from different containers, some were misled into thinking it was a competition. However, we intended it as a cooperative task, not a competition, so we carefully relocated the containers and shovels to make the setup at least look collaborative. Last but not least, some users found it hard to build a logical connection between sand and planting trees. To avoid confusion, we switched from sand to dirt so that the correlation would be straightforward. Through this process, I also learned how important it is to have users test the project and collect their feedback to avoid a fixed mindset; that way, we can fix design flaws and make the project more accessible and friendly to users.
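A minimal sketch of that thresholding logic, with placeholder names and values rather than our exact code:

int lastWeight = 0;    // previous stable reading from the weight sensor
int threshold = 50;    // ignore changes smaller than this (placeholder value)
float treeHeight = 0;  // how tall the tree has grown
float growthStep = 5;  // fixed growth per shovel event, regardless of how much is added

void checkShovel(int weight) {
  // register a shovel event only on a large enough jump in weight,
  // so that swinging the box in the air does not count
  if (weight - lastWeight > threshold) {
    treeHeight += growthStep;  // grow a small fixed step per event
    lastWeight = weight;       // remember the new baseline weight
  }
}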
Fabrication
Our fabrication process included drilling holes in the box, 3D printing a tree model, and building the whole project into the venue in advance. Vivien devoted a lot to this process!
Final project presentation:
CONCLUSIONS:
The most important goal of our project is to arouse public awareness and reflection on the larger issue of climate change. We want it to be an interactive project that addresses a social problem, is fun to interact with, and is inclusive of many users, both those directly interacting with it and others observing it. These criteria also correspond to our understanding and definition of "interaction."
First, I think it is indeed a fun and inclusive experience for the users who interact with it: they put effort into shoveling the dirt and get constant feedback from the program. For those observing the interaction, it is also fun to watch the whole process, which leaves a strong impression on them. The only pity is that the layout of the containers, dirt, shovels, and monitor was not perfect for making sure everyone had a comfortable place to view and move. Still, users did interact with our project in the way we designed during the final presentation.
Regarding social awareness, we had some successes and some failures. According to our discussion after the demonstration, users did get the basic message about humanity's severe impact on climate change and the possible actions humans can take to address it. However, we had a debate about the ending of the project. There is only one ending: O2 drops to 0 and the planet explodes. Some argued that this sends a negative message: whatever you do, it will fail in the end unless four users keep shoveling without stopping. We fully understand this pessimistic reading. However, given the real-world situation, we still think it reflects how severe the climate change issue is and that it genuinely requires humans' continuous effort, or else it will cause ecological disasters. I believe this is an issue worth a full seminar discussion. We are open to any interpretation of the "no winning, always failure" design, since it is not a flaw but part of our deliberate design.
Beyond this debate, there are improvements we could make in the future based on other feedback from the presentation. First, as we ran out of dirt, we could build the box like an hourglass so that the soil could be recycled. Second, we could redesign the layout, use a larger projection, and put all the dirt directly on the ground if we had a bigger venue, so that all users and observers would have a better experience. Third, if necessary, we could add a winning condition that is very hard to achieve (this still needs more discussion, as stated above).
I also learned a lot from designing and building this project, both technically and theoretically. I gained skills in programming, problem-solving, fabrication, crafting, and more. I learned how to make a project fit its audience better by testing it and listening to feedback. This experience also deepened my understanding of interaction: it can be flexible enough to involve many different characteristics. By addressing the climate change issue, I also reflected on myself and my own understanding of the issue, and it pushed me to think further and explore it.
Climate change is happening. I believe our project addresses this issue meaningfully through an innovative form of interaction. It is art, meaning that the audience may come away with various understandings of their own. However, the presentation of the issue and their authentic experience interacting with our project make our core theme memorable to them. Big or small, I believe we are making an impact.
We wanted to create an interactive project where users would see reflections of themselves through a light matrix, instead of seeing their reflections in something obvious like a camera feed or a mirror. Since we wanted to make it very obvious what the users were doing, we decided to put the matrix (our output) right in front of the camera (our input). In the beginning we were going to place them side by side, but we realized that would take attention off the matrix, since people tend to look at their reflection more than at the light, no matter how obvious the light may be.
During our brainstorming period, we researched different light boards, since we knew we wanted a surface of light displays instead of single LEDs. We also thought that programming and wiring 64 or more individual LEDs would be very complicated. We ended up using the Rainbowduino and an 8×8 Super Bright LED Matrix to create the light board, with the Rainbowduino also serving as our Arduino/breadboard. Although we researched a number of different light sources, the only one available to us was the Rainbowduino and LED matrix. I'm sure there would have been better options, especially because we had hoped to have a bigger LED board.
FABRICATION AND PRODUCTION:
During user testing we got feedback and suggestions from our classmates about what would make our project better. Many classmates wished the LED board's display were bigger, so that our interaction would feel more like an experience rather than just a small board in front of their eyes. As an attempt to change that, we wanted to join multiple LED boards into a bigger screen. Soon after trying, we realized you can't easily combine multiple LED boards and make them work together: each board operates separately. As stated in our concept and design, we researched multiple types of LED boards, but many of the materials better suited for our project were not available to us in the short time after user testing.
After realizing that we still had to use a single LED matrix board, we decided to design our fabricated piece so that it would magnify the LED board. We made a polygonal shape out of clear acrylic, with a box at the end into which the LED board fits snugly. We chose clear acrylic for laser cutting because we thought the see-through look would suit our design better than any other material; in my mind, the LED light would reflect off different surfaces and appear more interesting if the piece were transparent. We really didn't think there was a better option, because the other laser-cutting materials were too dull and 3D printing wouldn't have worked for a hollow design. After fabrication, a fellow (whose name I unfortunately forget) gave us the idea to put the matrix INSIDE our polygon so that the lights would reflect within it. This truly changed our project: we were now able to use our fabricated design in a different and better way than before.
Sketching for fabrication:
Laser cutting:
Original fabrication:
Changed fabrication so the light would shine through:
Another suggestion from user testing was that users wished the LED board faced them instead of facing up, because it was hard to see the board when it was not facing the user. Therefore, we made the fabricated piece a polygon so that it would be easy to angle it to the side and face the user.
Lastly, we got a great suggestion to add sound to our project to make it more interesting: rather than just seeing light, users would also be able to trigger different sounds as they move. After getting this feedback, we coded different sounds into our project that trigger when you move in different regions of the frame. This really changed our project, because we got to use both sound and light to create art, which in my opinion made our project more well rounded.
Sketches for LED board and pixels on camera:
After presenting our final project, we got feedback saying that some of the sounds were too much, and that it would be better to use only musical instruments instead of animal noises, camera snapshots, etc. Since we both really wanted to present our work at the IMA show, we changed the sounds to all instrument sounds before the show, so it would be lighter on the ears and users would be less confused. I think this helped our project a lot: many people really loved it at the IMA show, and even Chancellor Yu got to interact with it!
Chancellor Yu interacting with our project!!!!:
CONCLUSIONS:
The goal of my project was to create something that users could interact with and have fun with at the same time. We wanted to create something with a direct input/output relationship that users can also play with for fun. As a creator, it felt really cool to make something people can both interact with and enjoy.
My final result aligns with my original definition of interaction because it has both an input and an output, and it keeps running whether or not there is an input. The camera, as the input, keeps watching for changes in motion whether or not something or someone is moving. At the same time, my definition of interaction stated that interaction is a continuous loop from input to output, so if there is an input, there will always be an output. In my project, any change in motion changes the light on the matrix and triggers a sound at the same time.
The audience's response was pretty close to my expectation. The only thing my partner and I didn't really anticipate was that once users see their own reflection, they tend to focus on that instead of on the changing lights. I often found myself having to explain my project instead of letting users figure it out, and when they did figure it out, it took them a bit of time. Other than that, the audience reacted largely as I expected.
User Reactions:
If I had more time to improve my project, I would definitely focus on the "experience" aspect I wanted to implement. During our final presentation, Eric said that if we really wanted to make it an experience, we needed to factor in a lot of different things. If I could change my project to make it more of an experience, I would place speakers around the space to amplify the sound, project the camera input onto a bigger screen, and make the LED light board bigger.
From the setbacks and failures of my project, I learned that there's always room for improvement, even when you think there isn't enough time. I learned that there will always be projects, and parts of other people's work, that are better than yours, but you should never compare other people's capabilities to your own. After taking this class and seeing all the work I have done, I am very happy with my accomplishments. I would never have thought during our brainstorming period that this project would come to life, and I'm really glad we made it work! I'm glad we were able to create a fun and interactive work of art where users can see themselves and make art with light as well as music and sound.
// Count motion-filled pixels in each screen region; the per-region counts are presumably what drive the different sounds described above.
int countC = 0;
for (int y = circleSize; y < 150; y+=circleSize) {
  for (int x = 200+circleSize; x < 400; x+=circleSize) {
    int i = x + y*w;
    int up = x + (y- circleSize) * w;
    int down = x + (y+ circleSize) * w;
    int left = (x - circleSize) + y*w;
    int right = (x + circleSize) + y*w;
    int upLt = (x - circleSize) + (y- circleSize) * w;
    int upRt = (x + circleSize) + (y- circleSize) * w;
    int downLt = (x - circleSize) + (y+ circleSize) * w;
    int downRt = (x + circleSize) + (y+ circleSize) * w;
    // increment check follows the same neighbor test as the regions below
    if (p[i] && p[up] && p[down] && p[left] && p[right] && p[upLt] && p[upRt] && p[downLt] && p[downRt]) {
      countC++;
    }
  }
}
int countP = 0;
for (int y = 450+circleSize; y < 600; y+=circleSize) {
  for (int x = circleSize; x < 200; x+=circleSize) {
    int i = x + y*w;
    int up = x + (y- circleSize) * w;
    //int down = x + (y+ circleSize) * w;
    int left = (x - circleSize) + y*w;
    int right = (x + circleSize) + y*w;
    int upLt = (x - circleSize) + (y- circleSize) * w;
    int upRt = (x + circleSize) + (y- circleSize) * w;
    if (p[i] && p[up] && p[left] && p[right] && p[upRt] && p[upLt]) {
      //fill(cam.pixels[i]);
      countP++;
    }
  }
}
int countO = 0;
for (int y = 450+circleSize; y < 600; y+=circleSize) {
  for (int x = 200+circleSize; x < 400; x+=circleSize) {
    int i = x + y*w;
    int up = x + (y- circleSize) * w;
    //int down = x + (y+ circleSize) * w;
    int left = (x - circleSize) + y*w;
    int right = (x + circleSize) + y*w;
    int upLt = (x - circleSize) + (y- circleSize) * w;
    int upRt = (x + circleSize) + (y- circleSize) * w;
    if (p[i] && p[up] && p[left] && p[right] && p[upRt] && p[upLt]) {
      //fill(cam.pixels[i]);
      countO++;
    }
  }
}
int countN = 0;
for (int y = 450+circleSize; y < 600; y+=circleSize) {
  for (int x = 400+circleSize; x < 600; x+=circleSize) {
    int i = x + y*w;
    int up = x + (y- circleSize) * w;
    //int down = x + (y+ circleSize) * w;
    int left = (x - circleSize) + y*w;
    int right = (x + circleSize) + y*w;
    int upLt = (x - circleSize) + (y- circleSize) * w;
    int upRt = (x + circleSize) + (y- circleSize) * w;
    //fill( 0 );
    if (p[i] && p[up] && p[left] && p[right] && p[upRt] && p[upLt]) {
      //fill(cam.pixels[i]);
      countN++;
    }
  }
}
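This excerpt is a fragment of the larger sketch (p[] holds the motion-detected pixels, w is the frame width, and cam is the capture object). As a hedged illustration of how the per-region counts could then trigger the sounds described above using the Processing Sound library, the helper below is a placeholder sketch, not necessarily the exact code we wrote; the SoundFile names and threshold are assumptions:

import processing.sound.*;

// Illustrative helper: play a region's sound once its motion count crosses a threshold.
void triggerRegionSound(int count, SoundFile s, int threshold) {
  if (count > threshold && !s.isPlaying()) {
    s.play();  // start the sound for this region of the frame
  }
}

// Example usage inside draw(), after computing the counts
// (soundC, soundP, soundO, soundN would be SoundFile objects loaded in setup()):
// triggerRegionSound(countC, soundC, 5);
// triggerRegionSound(countP, soundP, 5);
// triggerRegionSound(countO, soundO, 5);
// triggerRegionSound(countN, soundN, 5);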
When my partner and I were making design decisions, we first wanted to make our project a piece with a deep meaning. Influenced by my peers' ideas, which are all very "profound," we also wanted to make our project a statement piece, an experience that would educate people in the end. We proposed ideas like a game that would educate people on the unnecessary nature of social media and various other apps, but we struggled to create an engaging experience that would convey our purpose. After consultation with Professor Marcela, my partner and I realized the more important thing about our project should be the experience itself. If we can create a new kind of experience that is really interactive and engaging, we do not have to attach some grand purpose to our project. Inspired by Marcela, we decided to focus on new forms of interaction, new experiences. Cathy suggested having the user stand on a balance board, and it turned out to be a great idea for our game by going beyond the usual keyboard experience. Our idea is to engage the user's entire body, instead of merely a part of it like the fingers, and the balance board is a great tool for that. Furthermore, balance boards are originally used in the gym, which adds another layer of meaning to our game: have fun while training your balance! To suit the swinging nature of the balance board, we decided to use an accelerometer sensor to detect the user's movement. We wanted to design a game that requires the user to constantly tilt the board while keeping balance, a fun way of interaction.
FABRICATION AND PRODUCTION:
In the production process, we first wanted to do the digital fabrication. At first my partner and I were thinking about 3D printing the balance board ourselves, but after consulting Andy we realized the materials we had might not be strong enough to support a user's weight on the board. So we purchased a balance board and decided to laser-cut a small box to put on it. We hid the Arduino and sensor inside the box. The box protects them from the user's feet, which proved very important later during user testing.
After the digital fabrication, we focused on testing the sensor and the code for the game. We were completely unfamiliar with accelerometers, but thanks to Tristan's and Marcela's help, we downloaded the guide from the Internet and kept trying to get the sensor working. At first we were unable to detect changes in the x and y values no matter what we tried. We later discovered that this was because one of the sensor's pins was not working. After switching to a new sensor and following the guide, we managed to get the sensor in order.
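For reference, the Arduino side of the serial link is not shown in the Processing code above. A minimal sketch of what it might look like with an analog accelerometer wired to A0-A2 (the pin choice is an assumption), sending the three comma-separated values that updateSerial() in the Processing code expects:

void setup() {
  Serial.begin(9600);
}

void loop() {
  int accX = analogRead(A0);  // x-axis tilt of the board
  int accY = analogRead(A1);  // y-axis tilt of the board
  int accZ = analogRead(A2);  // z-axis, sent as the third value
  // send "x,y,z\n" so Processing can split(trim(myString), ",") into NUM_OF_VALUES = 3
  Serial.print(accX);
  Serial.print(",");
  Serial.print(accY);
  Serial.print(",");
  Serial.println(accZ);
  delay(100);  // avoid flooding the serial buffer
}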
Then we focused on designing the game. We wanted to use the round shape of the board in the game too, so at first we thought about a life-saving ring at sea. We also made the main figure a piglet (Piglet from Winnie the Pooh). But later Marcela reminded us that we needed to create a scenario for our game to better engage the user. So we created a space scenario in which the balance board is a future vehicle in space, and the piglet, wearing a space helmet, is exploring space. Where the piglet goes is controlled by the user standing on the board: whichever direction the user leans, the piglet moves in the same direction. The moving board in space is the safe zone the asteroids cannot hit. The piglet would love to have some fun outside the board, but it cannot stay outside for too long because it needs to breathe; after 5 seconds outside the board it dies from suffocation. If it gets hit by a moving asteroid directly, it dies instantly. To create the atmosphere of this scenario, we decided the background should be space, but inserting an image directly made Processing run too slowly. We tried to fix the problem, but Tristan told us there is no easy solution, so I made a relatively simple but nice black background with shiny yellow stars scattered across it. This keeps our game running smoothly. Thanks to my partner Cathy, we managed to get the code for the bouncing asteroids right. Although the sketch looks simple, it has the basic elements of a space game.
The user testing session was very helpful. Users liked our idea of the moving board, and we collected a lot of feedback for modification: the traveling piglet should move faster to make the game more engaging, the moving board could shrink over time, the board's movement boundary could be expanded, and the wire placement should be changed to make it easier for the user to stand on the board. About the game concept, users also suggested creating a scenario that involves more elements, like bonus points for the piglet "reaching" certain stars, and showing the time limit for how long the piglet can stay off the board. Marcela also suggested adding sound to the game for our presentation, which added a lot. We took almost all of this advice and modified our game based on it; the changes made it more complete and engaging. One user also suggested making the fabricated box smaller and moving the Arduino onto the table instead of keeping it on the board, but we did not follow that because we had limited time and thought it was more important to focus on the game itself.
CONCLUSIONS:
The goal of my project is to create a new game experience that involves the movement of the user's entire body. By standing on the balance board to control the piglet's movement, the user tries to stay balanced while keeping the piglet alive as long as possible. The project aligns with my definition of interaction in that the user needs to constantly react to the changing locations of the board and the asteroids and adjust the piglet's position. What stands out in our project is that the embodied experience is really novel: users have fun interacting with the game while wobbling on the board. Ultimately, at the final show, many users tried our game and liked our idea of using the balance board. Some offered suggestions for improving details of our game design and code, but most found standing on the board very entertaining. One area we can keep working on is that the board's movement becomes predictable once the user gets familiar with the game. We could randomize the board's movement or speed it up, dividing the game into several rounds based on the speed, so that the game becomes more exciting and challenging.
In the production process, I came to realize the importance of exploring on our own. For example, the accelerometer, which I had no idea how to use, could actually be mastered through self-learning online. The project can be complicated and what we learned in class may be insufficient, but we can always try to figure out how things work ourselves and seek help when needed. The ability to self-learn is important in creating something of our own.
From the successes and failures of the final project, I also realized the importance of user "experience." For both the project concept and the game design, we need to create a scenario that serves the experience of the project. It is the experience that matters most to the user, and creating new forms of interaction really brings new experiences and feelings to users. What we can further explore is how to inspire such new experiences, whether by involving different parts of the human body, as we did in our project, or by other means. The combination of the human body and new technology is very interesting and speaks to new human needs in the context of today's technology. Humans constantly want new forms of interaction with technology to extend their conventional experiences.