Space Piglet off Balance—Cathy Wang—Marcela

After several discussions with our professor, we decided to make a simple game that offers a different play experience. Inspired by somatosensory (motion-sensing) games, we planned to create a game controlled with the whole body. We thought about using a camera to capture our movements, but felt that would be too similar to existing motion-sensing games. Eventually we chose a balance board of the kind normally used for workouts. Controlling the board requires the whole body, especially the lower half, so the project became a combination of game and exercise: working out while playing.
During the User Testing Session, our project was still at an early stage. All we had were a blue background, a moving brown rectangle, and a piglet controlled by the player. We received a lot of praise for the idea, especially for using a balance board. At the same time, the game was too simple and had few instructions; in other words, users were confused at first and got bored quickly once they mastered it. We also had no sound feedback and no score or time counter. We had the tools of a game, but we had not yet made it a true game. Another, more conceptual problem was what the game is about: we cannot choose a game's elements at random, because everything should appear for a reason, and we also needed to build a connection between the physical tool and the virtual image. So we changed the brown rectangle into an ellipse with the same image as our balance board, to echo the physical controller. We also built a scenario for the game: a piglet flying through space needs to stay on its flying machine to keep safe and avoid asteroids. This gives a reason why the piglet is moving and establishes the relationship between the balance board and the game. We also changed the background to outer space and added music. Originally we wanted to use a picture as the background, but that made the game too slow to play, so we drew one ourselves.
In the final presentation, we got a different inspiration for our project from a classmate's comment. He said people use balance boards to help little kids practice body coordination and help their leg and knee joints develop, but the kids may get bored and refuse to do it. Our project turns this "boring" piece of equipment into a fun game, which could have real practical value. At the IMA Final Show, we found that our instructions were still not clear enough: some users thought they were supposed to control the ellipse (which shares the balance board's image) instead of the piglet. We may therefore need to clarify how the game works or adjust the mapping between board and piglet.
We believe a fun game needs to create an interactive experience different from other games. By adding a workout element, we combine physical movement with a visual game. We believe interaction starts at a physical level, and users are more likely to engage when more of their body participates. Although many details still need improvement, judging from users' reactions we succeeded in most respects: our project gives the game a different way to play while turning a workout tool into a fun game.


// IMA NYU Shanghai
// Interaction Lab
// For receiving multiple values from Arduino to Processing

/*
 * Based on the readStringUntil() example by Tom Igoe
 * https://processing.org/reference/libraries/serial/Serial_readStringUntil_.html
 */

import processing.serial.*;
import processing.sound.*;
SoundFile sound;
String myString = null;
Serial myPort;

int NUM_OF_VALUES = 3;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;      /** this array stores values from Arduino **/
PImage image;
PImage image1;
PImage image2;
float x=800;          // piglet position
float y=300;
float speedX;         // piglet speed, mapped from the accelerometer
float speedY;
int state=0;
float a=800;          // disc (flying machine) position
float b=300;
float xspeed=2;       // disc speed
float yspeed=1;
int score = 0;
float e=700;          // large rock position
float f=700;
float xspeeed = 10;   // large rock speed
float yspeeed = 6;
float xspeeeed = 5;   // small rock speed
float yspeeeed = 9;
float g=200;          // small rock position
float h=700;
int round;            // frame counter, used to speed up the disc over time

long gameoverTime = 0;
long falloffTime = 0;
boolean isPigOnDisc = true;
boolean isPigRock=true;
PImage bgImage;


boolean start=false;


void setup() {
  fullScreen();
  //size(1200, 900);
  //size(1200, 800)
  // bgImage = loadImage("Uni.jpg");
  image = loadImage("piglet.png");
  image1 = loadImage("rockL.png");
  image2=  loadImage("rockS.png");
  setupSerial();
  sound= new SoundFile(this, "super-mario-bros-theme-song.mp3");
  sound.loop();
}


void draw() {
  updateSerial();

  round++;

  if (start==true) {
    background(0);
    fill(#F5DE2C);
    pushMatrix();
    translate(width*0.2, height*0.5);
    rotate(frameCount / 200.0);
    star(0, 0, 5, 15, 3); 
    popMatrix();
    
    pushMatrix();
    translate(width*0.4, height*0.7);
    rotate(frameCount / 200.0);
    star(0, 0, 10, 15, 5); 
    popMatrix();
    pushMatrix();
    translate(width*0.7, height*0.4);
    rotate(frameCount / 200.0);
    star(0, 0, 10, 15, 5);
    popMatrix();

    pushMatrix();
    translate(width*0.5, height*0.5);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();

    pushMatrix();
    translate(width*0.2, height*0.2);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();
    
     pushMatrix();
    translate(width*0.5, height*0.5);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();

    pushMatrix();
    translate(width*0.15, height*0.8);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();

    pushMatrix();
    translate(width*0.8, height*0.5);
    rotate(frameCount / -100.0);
    star(0, 0, 5, 15, 5); 
    popMatrix();
    
    pushMatrix();
    translate(width*0.5, height*0.1);
    rotate(frameCount / -100.0);
    star(0, 0, 5, 15, 5); 
    popMatrix();

    pushMatrix();
    translate(width*0.5, height*0.3);
    rotate(frameCount / -100.0);
    star(0, 0, 5, 15, 5); 
    popMatrix();

    pushMatrix();
    translate(width*0.1, height*0.1);
    rotate(frameCount / -100.0);
    star(0, 0, 5, 15, 5); 
    popMatrix();
    
    pushMatrix();
    translate(width*0.6, height*0.5);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();
    
    pushMatrix();
    translate(width*0.8, height*0.2);
    rotate(frameCount / 200.0);
    star(0, 0, 3, 9, 3); 
    popMatrix();




  if (dist(x, y, a, b)<300) {
    isPigOnDisc = true;
  }
  // background(#98D2F7);
  // image(bgImage, 0, 0, width, height);
  if (millis() % 1000 < 10) {
    // add 10 points roughly once per second
    score += 10;
  }

  textAlign(LEFT);
  textSize(30);
  //text("millis(): " + millis(), width-200, height/2-50);
  //text("millis() % 1000: " + millis()%1000, width-200, height/2+50);
  fill(255, 0, 0);
  text("Score: " + score, 50, 50);
  //println(millis());

  //background(#98D2F7);

  fill(#F2CE58);
  noStroke();
  ellipse(a, b, 500, 500);
  fill(#4D4C4C);
  noStroke();
  ellipse(a, b, 300, 300);
  fill(#F2CE58);
  noStroke();
  ellipse(a, b, 100, 100);

  a= a+xspeed;
  b= b+yspeed;
  if (round==500) {
    xspeed = 4;
    yspeed = 2;
  }

  if (a > width-250 || a <250) {
    xspeed = -xspeed;
  }
  if (b > height-250 || b <250) {
    yspeed = -yspeed;
  }

  image(image1, e, f, 135, 100);
  e= e+1.5*xspeeed;
  f= f+yspeeed;
  if (e > width || e <1 ) {
    xspeeed = -xspeeed;
  }
  if (f > height || f <0) {
    yspeeed = -yspeeed;
  }
  if ( dist(a, b, e, f)<300) {
    xspeeed = -xspeeed;
    yspeeed = -yspeeed;
  }

  image(image2, g, h, 135, 100);
  g= g+1*xspeeeed;
  h= h+yspeeeed;
  if (g > width || g <1 ) {
    xspeeeed = -xspeeeed;
  }
  if (h > height || h <0) {
    yspeeeed = -yspeeeed;
  }
  if ( dist(g, h, a, b)<300) {
    xspeeeed = -xspeeeed;
    yspeeeed = -yspeeeed;
  }

  //if ( f >= b -500 && f<=b+500) {
  //  yspeeed = -yspeeed;
  //}
  // map the accelerometer readings to the piglet's speed; ignore clearly out-of-range readings
  if (sensorValues[0] > -100 && sensorValues[0] < width+100) {
    speedX = map(sensorValues[0], 100, 400, -75, 75);
    x = x + speedX;
    x = constrain(x, 0, width-80);
    speedY = map(sensorValues[1], 100, 400, -75, 75);
    y = y - speedY;
    y = constrain(y, 0, height-135);
  }
 }
image(image, x, y, 180, 200);



if (start==false) {
  fill(50);
  textSize(30);
  String s= "Stand on the balance board and press ENTER to start!                              Try to keep Piglet on the board and stay away from rocks!";
  
  fill(50);
  text(s, 500, 500, 1000, 100);
}
if (dist(x, y, e, f) < 5) { // piglet collides with the large rock
  background(0);
  fill(255);
  textSize(64);
  text("YOU DIED", width/2-100, height/2);
  score = 0;
}
   
  
  //background(0);
  //fill(255);
  //textSize(64);
  //text("YOU DIED", width/2-100, height/2);
  //score = 0;
  //long gameoverTime = millis();
  //while (millis() - gameoverTime < 5000) {
  //  println("gameover");
  //  delay(10);
  //}
  //start = true;
  //x = width/2;
  //y = height/2;
  //println("Start Again!");
  ////delay(1000);
  ////start = false;
//}
if (dist(x, y, a, b)>350) { //when pig falls off the disc
  if (isPigOnDisc == true) {
    isPigOnDisc = false;
    falloffTime = millis();
  }
  if (millis() - falloffTime > 3000) {
    if (start == true) {
      gameoverTime = millis();
    }
    background(0);
    fill(255);
    textSize(64);
    text("YOU DIED", width/2-100, height/2);
    score = 0;
    start = false;

    if (millis() - gameoverTime > 5000) {

      start = true;
      x=500;
      y=500;
      a=500;
      b=500;
      e=700;
      f=700;
      g=200;
      h=700;
    }


    //println("Start Again!");
  }
}
}

//void gameOver() {
//  background(0);
//  fill(255);
//  textSize(64);
//  text("YOU DIED", width/2-100, height/2);
//  x=500; //pig
//  y=200; //pig
//  a=650; //disc
//  b=300; //disc
//  start = false;
//  score = 0;
//  e=100; //shark
//  f=100; //shark
//  delay(1000);
//}

void star(float x, float y, float radius1, float radius2, int npoints) {
  float angle = TWO_PI / npoints;
  float halfAngle = angle/2.0;
  beginShape();
  for (float a = 0; a < TWO_PI; a += angle) {
    float sx = x + cos(a) * radius2;
    float sy = y + sin(a) * radius2;
    vertex(sx, sy);
    sx = x + cos(a+halfAngle) * radius1;
    sy = y + sin(a+halfAngle) * radius1;
    vertex(sx, sy);
  }
  endShape(CLOSE);
}

void keyPressed() {
  // ENTER on Windows/Linux; RETURN on macOS
  if (key == ENTER || key == RETURN) {
    start = true;
  } else {
    start = false;
  }
}
void setupSerial() {
  printArray(Serial.list());
  // Check the printed list of serial ports and replace the index below (currently 0)
  // with the index of the Arduino's port,
  // e.g. "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----".
  myPort = new Serial(this, Serial.list()[0], 9600);

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Space Piglet Off Balance – Feifan Li – Marcela 

CONCEPTION AND DESIGN:

When my partner and I were making design decisions, we first wanted our project to be a piece with a deep meaning. Influenced by peers' ideas, which all seemed very "profound," we wanted to make a statement piece, an experience that would educate people in the end. We proposed ideas such as a game that would show how unnecessary social media and various other apps are, but we struggled to create an engaging experience that conveyed that purpose. After a consultation with Professor Marcela, we realized that the more important thing about our project should be the experience itself: if we can create a new type of experience that is genuinely interactive and engaging, we do not have to attach a grand purpose to it. Inspired by Marcela, we decided to focus on new forms of interaction, that is, new experiences. Cathy suggested having the user stand on a balance board, which turned out to be a great idea because it goes beyond the usual keyboard experience. Our goal is to engage the user's entire body instead of just part of it, like the fingers, and the balance board is a great tool for that. Furthermore, the balance board is originally gym equipment, which adds another layer of meaning to our game: have fun while training your balance! To suit the swinging nature of the board, we decided to use an accelerometer to detect the user's movement. We wanted a game that requires the user to constantly tilt the board while keeping balance, a fun way of interaction.
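
As a rough illustration of how the accelerometer drives the game, here is a simplified sketch of the mapping used in the full Processing code above; the sensor range of 100-400 and the speed range are assumptions that depend on the particular sensor and its calibration. The board's tilt on two axes is read as two values, mapped to a velocity, and the piglet's position is kept on screen.

float x, y;                        // piglet position
int[] sensorValues = {250, 250};   // placeholder; the real sketch fills this in updateSerial()

void setup() {
  size(1200, 800);
  x = width/2;
  y = height/2;
}

void draw() {
  background(0);
  // a centered board reads roughly 250, which maps to zero speed
  float speedX = map(sensorValues[0], 100, 400, -75, 75);
  float speedY = map(sensorValues[1], 100, 400, -75, 75);
  x = constrain(x + speedX, 0, width - 80);
  y = constrain(y - speedY, 0, height - 135);   // invert Y so leaning forward moves the piglet up
  fill(255);
  ellipse(x, y, 40, 40);                        // stand-in for the piglet image
}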

FABRICATION AND PRODUCTION:

In the production process, we started with the digital fabrication. At first my partner and I considered 3D printing the balance board ourselves, but after consulting Andy we realized our materials might not be strong enough to support a user's weight. So we purchased a balance board and laser-cut a small box to mount on it, hiding the Arduino and sensor inside. The box protects the electronics from the user's feet, which turned out to be very important during user testing.

After the digital fabrication, we focused on testing the sensor and the game code. We were completely unfamiliar with accelerometers, but thanks to Tristan's and Marcela's help we found a guide online and kept tinkering with the sensor. At first we could not detect any change in the x and y values no matter what we tried; we later discovered that one pin of the sensor was not working. After swapping in a new sensor and following the guide, we got it working.

Then we focused on designing the game. We wanted to echo the round shape of the board, so at first we thought of a life preserver floating at sea, and we made the main figure Piglet from Winnie the Pooh. Later Marcela reminded us that we needed a scenario to better engage the user, so we created a space setting in which the balance board is a futuristic vehicle and the piglet, wearing a space helmet, is exploring space. Where the piglet goes is controlled by the user standing on the board: whichever direction they lean, the piglet moves the same way. The moving board is the safe zone the asteroids cannot hit. The piglet would love to have some fun off the board, but it cannot stay outside for too long because it needs to breathe; after five seconds off the board it dies of suffocation, and if it is hit by a moving asteroid it dies instantly. To create the atmosphere of this scenario, we wanted the background to be outer space, but loading an image directly made Processing run too slowly. We tried to fix the problem, but Tristan told us there was no easy solution, so I made a relatively simple but nice black background with shiny yellow stars scattered across it, which lets the game run smoothly. Thanks to my partner Cathy, we got the code for the bouncing asteroids right. Although the sketch looks simple, it has the basic elements of a space game.

The user-testing session was very helpful. Users liked the idea of the moving board, and we collected a lot of feedback for modifications: the piglet should travel faster to make the game more engaging, the moving board could shrink over time, the board's movement boundary could be expanded, and the wire should be repositioned so it is easier to stand on the board. On the game concept, users also suggested adding more elements to the scenario, such as bonus points for the piglet "reaching" stars and a visible countdown for the time the piglet can stay off the board. Marcela also suggested adding sound during our presentation, which added a lot to the game. We took almost all of this advice and modified the game accordingly, which made it more complete and engaging. One user also suggested making the fabricated box smaller and moving the Arduino onto the table instead of the board, but we did not follow that because time was limited and we thought it more important to focus on the game itself.

CONCLUSIONS:

The goal of my project is to create a new game experience that involves the movement of the user's entire body. Standing on the balance board to control the piglet, the user tries to stay balanced while keeping the piglet alive as long as possible. The project aligns with my definition of interaction in that the user must constantly react to the changing positions of the board and asteroids and adjust the piglet's location. What stands out in our project is that the embodied experience is genuinely novel: users have fun interacting with the game while wobbling on the board. In the final show, many users tried our game and liked the idea of using the balance board. Some offered suggestions for improving details of the game design and code, but most found standing on the board very entertaining. One area to keep working on is that the board's movement becomes predictable once the user is familiar with the game. We could randomize the board's movement or speed it up, dividing the game into several rounds by speed, so that the game stays exciting and challenging, as in the rough sketch below.
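
One way the rounds idea could be implemented, sketched here as a hypothetical extension rather than something already in our code (the 500-frame interval, the scale factors, and the minimum disc size are made-up values): every fixed number of frames the disc speeds up, shrinks, and may reverse direction.

int round = 0;
float discSize = 500;
float xspeed = 2, yspeed = 1;
float a = 600, b = 400;

void setup() {
  size(1200, 800);
}

void draw() {
  background(0);
  round++;
  if (round % 500 == 0) {                        // start a new, faster round
    xspeed *= 1.5;
    yspeed *= 1.5;
    discSize = max(250, discSize - 50);          // shrink the disc, but not below 250 px
    if (random(1) < 0.5) xspeed = -xspeed;       // randomize direction
    if (random(1) < 0.5) yspeed = -yspeed;
  }
  a += xspeed;
  b += yspeed;
  if (a > width - discSize/2 || a < discSize/2) xspeed = -xspeed;
  if (b > height - discSize/2 || b < discSize/2) yspeed = -yspeed;
  fill(#F2CE58);
  ellipse(a, b, discSize, discSize);
}

In our current sketch only one speed change happens, at frame 500; the version above simply generalizes that into repeating rounds.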

In the production process, I came to realize the importance of exploring on our own. For example, the accelerometer, which I had no idea how to use, could be mastered through self-learning online. A project can be complicated and what we learned in class may be insufficient, but we can always explore how things work ourselves and seek help. The ability to self-learn is important in creating something of our own.

From the successes and failures of the final project, I also realized the importance of user experience. For both the project conception and the game design, we need to create a scenario that serves the experience of the project. It is the experience that matters most to the user, and creating new forms of interaction really brings new experiences and feelings. What we can further explore is how to inspire such new experiences, whether by involving different parts of the human body, as we did, or by other means. The combination of the human body and new technology is very interesting and speaks to new human needs in the context of today's technology: people constantly want new forms of interaction that extend their conventional experiences.

Final Project Reflection: Snack Facts by Eleanor Wade

Snack Facts – Eleanor Wade – Marcela Godoy

CONCEPTION AND DESIGN:

When considering how users would interact with this project, I drew on my research about the consumption of animals and animal products, as well as the typical experience of a grocery store. To recreate the feeling of the vast array of options consumers face at a supermarket, I chose a color scanner and foods with colored tags so that users could experience checking out at a typical store. After selecting different products from the shelves (meats with red tags, animal products with blue tags, and plant-based foods with green tags), users scan them and see an overwhelming assortment of pictures and quick facts about the process from industrialized factory farm to table, and the differing environmental impacts of each choice. The color sensor was critical to this design because it is not only a hands-on, interesting action but also clearly linked to the feeling of checking out at a grocery store. My hope is that users will associate this feeling of blindly making decisions with the pictures that appear on screen. While the shelves were made of cardboard, I also included many collected plastic packages of the kind commonly used in grocery stores, which further raised the question of how we process and package our foods for convenience without fully understanding the consequences. Other materials, such as real foods (a carton of milk, jam, bread, sausages, cookies), made the experience slightly more realistic, and the few edible foods I provided helped complete the experience by adding taste and smell as interactive elements. These materials, particularly the real, edible foods, were central to the interactive aspect of the project: in addition to using the color sensor, being presented with both plant-based and animal-based products pushed customers to question the choices they make every day. By associating a specific taste with the exposed realities of our food systems, the project used several levels of interactivity to educate people about the environmental impacts of their food choices.

FABRICATION AND PRODUCTION:

The most significant steps in my production process began with building on my previous research about animal products and talking with Marcela about the best way to create an interactive, educational experience involving food. After deciding to use the color sensor, I drew on a previous recitation with this sensor to work through the Arduino-to-Processing communication. Marcela was exceptionally helpful with the coding on both sides and with extending the project into a collage of photos from my research. I definitely struggled with translating the specific numerical values associated with each color and connecting them to groups of photos. User testing proved very beneficial: I was able to watch users experience the project and received feedback about the clarity of the text (I later switched to pictures only, rather than facts) and the speed of the shifting pictures. Users, or "customers," also commented on the act of selecting individual products to scan and on the role the edible foods played in the project's interactivity. Because of this, I selected real foods relevant to the decisions we make at every meal; using sample-sized portions also echoed the free samples commonly found at grocery stores. While the many changes I made after user testing were effective, it would have been even better to clarify the images and fix the distortion, although even after many alterations this proved especially difficult.
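
As a sketch of the kind of color-to-photo-group mapping I struggled with (the thresholds and category names here are purely hypothetical, since the real cutoffs depend on the sensor readings and lighting), the idea is to classify the RGB reading sent from Arduino into a tag color and use that tag to pick which group of photos to display:

String classifyTag(int r, int g, int b) {
  if (r > g && r > b) return "meat";     // red tag -> meat photos
  if (b > r && b > g) return "animal";   // blue tag -> animal-product photos
  return "plant";                        // green tag -> plant-based photos
}

void setup() {
  println(classifyTag(200, 60, 50));    // "meat"
  println(classifyTag(60, 80, 210));    // "animal"
  println(classifyTag(70, 190, 90));    // "plant"
}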

Digital Fabrication:

3D printing:  https://www.thingiverse.com/thing:2304545

I decided to 3D print a mushroom because it represents the produce commonly found at a grocery store or supermarket. I chose 3D printing rather than laser cutting because I found it relatively easy and useful to make the shelves, as well as the scanner housing that contains the Arduino and breadboard for the color sensor, out of cardboard.

CONCLUSIONS:

The primary focus of my project is to educate people on the larger consequences and implications of their food choices. Through the interactive concept of using a scanner to trigger images specific to food production, I hope to demonstrate the consequences of dietary choices and the larger implications of industrialized agriculture and animal farming. The results align with my definition of interaction: users not only engage with a supermarket-checkout-style scanner, but are also presented with real, edible foods to reinforce the understanding that what we eat matters. Seeing unpleasant or informative images furthers the interaction, because users both learn something new and associate these facts with the foods they consume regularly. With more time, I would fix the distortion of the images and add sound, specifically the screams of animals living on factory farms as well as a tone after each scan, to engage audiences in the experience on a more complete level. This project has taught me many valuable lessons, for example the potential of technology and design to enhance our understanding of the world and shift ideologies around even the most basic aspects of life, such as food. When users can experience a project that appeals to more than one sense, the project as a whole is enhanced. I am pleased to have used creative technology to introduce people to the realities of food systems from which they may otherwise be very disconnected. Ultimately, this project combines visual cues with taste and smell to demonstrate compelling methods of interaction and to help bridge the gap between us and how our food is produced. Audiences and customers should care about this project because it demonstrates the detrimental consequences of eating animals and animal products and translates these very common interactions with food and grocery stores into tangible, straightforward pieces of information.

BIBLIOGRAPHY OF SOURCES:

“5 Ways Eating More Plant-Based Foods Benefits the Environment.” One Green Planet, 21 Aug. 2015, https://www.onegreenplanet.org/environment/how-eating-more-plant-based-foods-benefits-the-environment/.
https://search.credoreference.com/content/entry/abcfoodsafety/avian_flu/0. Accessed 29 Oct. 2018.
“Dairy | Industries | WWF.” World Wildlife Fund, https://www.worldwildlife.org/industries/dairy. Accessed 4 Dec. 2019.
Eating Animals Quotes by Jonathan Safran Foer. https://www.goodreads.com/work/quotes/3149322-eating-animals. Accessed 3 Dec. 2019.
Flu Season: Factory Farming Could Cause A Catastrophic Pandemic | HuffPost. https://www.huffingtonpost.com/kathy-freston/flu-season-factory-farmin_b_410941.html. Accessed 29 Oct. 2018.
“Milk’s Impact on the Environment.” World Wildlife Fund, https://www.worldwildlife.org/magazine/issues/winter-2019/articles/milk-s-impact-on-the-environment?utm_campaign=magazine&utm_medium=email&utm_source=magazine&utm_content=1911-e. Accessed 4 Dec. 2019.
Moskin, Julia, et al. “Your Questions About Food and Climate Change, Answered.” The New York Times, 30 Apr. 2019. NYTimes.com, https://www.nytimes.com/interactive/2019/04/30/dining/climate-change-food-eating-habits.html, https://www.nytimes.com/interactive/2019/04/30/dining/climate-change-food-eating-habits.html.
Nijdam, Durk, et al. “The Price of Protein: Review of Land Use and Carbon Footprints from Life Cycle Assessments of Animal Food Products and Their Substitutes.” Food Policy, vol. 37, no. 6, Dec. 2012, pp. 760–70. DOI.org (Crossref), doi:10.1016/j.foodpol.2012.08.002.
Ocean Destruction – The Commercial Fishing Industry Is Killing Our Oceans. http://bandeathnets.com/. Accessed 3 Dec. 2019.
Siegle, Lucy. “What’s the Environmental Impact of Milk?” The Guardian, 13 Aug. 2009. www.theguardian.com, https://www.theguardian.com/environment/2009/aug/07/milk-environmental-impact.
“The Case for Plant Based.” UCLA Sustainability, https://www.sustain.ucla.edu/our-initiatives/food-systems/the-case-for-plant-based/. Accessed 4 Dec. 2019.
The Ecology of Disease and Health | Wiley-Blackwell Companions to Anthropology: A Companion to Medical Anthropology – Credo Reference. https://search.credoreference.com/content/entry/wileycmean/the_ecology_of_disease_and_health/0. Accessed 29 Oct. 2018.
“WATCH: Undercover Investigations Expose Animal Abusers.” Mercy For Animals, 5 Jan. 2015, https://mercyforanimals.org/investigations.
What Is The Environmental Impact Of The Fishing Industry? – WorldAtlas.Com. https://www.worldatlas.com/articles/what-is-the-environmental-impact-of-the-fishing-industry.html. Accessed 3 Dec. 2019.
Zee, Bibi van der. “What Is the True Cost of Eating Meat?” The Guardian, 7 May 2018. www.theguardian.com, https://www.theguardian.com/news/2018/may/07/true-cost-of-eating-meat-environment-health-animal-welfare.

Who’s Ordering your Food – Xueping Wang – Marcela

Project Photo

Code: Arduino + Processing

We did not want users to interact with our project simply by clicking buttons; we wanted them to get involved and think while interacting with it. Therefore we added more sensory experiences to keep users focused and interested: images, videos, and sound responses all serve this aim. We also tried to choose or create images that are brightly colored and somewhat animated so they look more inviting. The movement of images in the third scenario, about news, both attracts attention and resembles how people actually encounter news, with masses of information appearing and changing all the time. We could have used more words or a single simple scenario, but we think that would have been merely responsive rather than interactive, and users would lose interest. Letting users go through three scenarios and make choices before teaching them anything makes the results more convincing and provokes thought along the way.

Video of the whole interaction process (shot at the final show)

The most important part of our production was creating the three scenarios in Processing and designing cues that tell users what they are supposed to do without being too direct. Since we want users to make their food choices in a social context similar to reality, to see how social media might influence those choices and to make them more aware in their daily lives, we first let them choose dishes for a meal, and then, after being primed with a video and news/advertisements, they make the choices again. The menu is not made in Processing but as a physical fabrication, because we wanted it to look more like a real menu with images of the food combinations, which we assumed would be more interesting and inviting. Nevertheless, since it ended up as a box-like object, I personally feel it does not fit our plan, and I agree with the comments that making it look more like a real menu, or putting the other buttons on the menu itself, would improve the project.

Initial Overall Plan
Detailed Plan of the menu and related news in the third scenario

Our whole research & production process documented in doc

Because we spent too much time coding the third scenario and dealing with all the images, videos, and audio, we only had the third scenario ready for user testing. We were very glad that most users got the message we wanted to convey before we explained anything to them. Their suggestions mainly fell into two categories: adding more instructions so they would not feel lost, and giving explicit feedback at the end. Although we had already planned to do similar things, some detailed comments were really helpful. One piece of advice we followed was to develop the character Lisa, originally used only for the front page, and create more conversational guidance so that all three scenarios connect fluently as Lisa's full diet story. The videos and "fake" news reports are also embedded in this story, so users do not get a weird feeling about why they have to watch them before making a choice (during the user test some users, especially male users, were impatient and skipped everything they thought unnecessary before making their choice, which ruined the experience and kept them from getting the whole idea of the project).

Example of conversational guidance connecting the scenarios and the social media

This adaptation seems to have been well received: most users now report that the whole experience makes sense and that they are willing to go through the project scenario by scenario. It also resembles how people interact with social media in reality and makes them more aware the next time they are confronted with information about food choices. We also completed our "report" section and added a reset button in response to suggestions from the user test.

Report for users who change their choices after being primed with social media (with a reset button)
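
A minimal sketch of the scenario flow described above, with the scene labels and the 'r' reset key assumed for illustration (the real project has richer scenes, and 'v' is the key we actually ask users to press to continue):

int scene = 0;   // 0 = menu choice, 1 = Lisa's video, 2 = news feed, 3 = report

void setup() {
  size(800, 600);
  textSize(32);
  fill(255);
}

void draw() {
  background(0);
  String[] labels = {"Choose your meal", "Watch Lisa's video", "Browse the news", "Your report"};
  text(labels[scene], 50, height/2);
}

void keyPressed() {
  if (key == 'v' && scene < 3) scene++;   // advance to the next scenario
  if (key == 'r') scene = 0;              // reset everything for the next user
}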

After we fixed all the bugs in the code and the project was almost complete, we invited some friends and classmates to test it again. There was still some confusion about which key to press to continue, and sometimes users lost track of where the menu was, so we changed some of the guidance again, highlighting key parts in red and putting up notes to remind them. However, as we saw in the presentation, the effect was limited.

Changed version of the instructions emphasizing the repeated use of "v"
Final project with notes emphasizing the location of the menu and the red text

The goal of this project is to raise awareness of how easily our food choices are influenced by all kinds of social media and information, whether it comes in the form of live streaming, YouTube videos, news, or scientific reports, and whether it is in essence educational, advertising, or pure recommendation. I have friends who are easily swayed by fitness trends or by promotional articles about new restaurants and popular snacks, and some of them have developed eating disorders or struggle with obesity. I hope this project helps arouse awareness, and awareness certainly affects decisions.

Our project overall aligns with my definition of interaction in that it creates a personal experience: it is meant to be used by a single person, and their choices, combined with responses tailored to those choices, create a unique experience for that user. They need to take in various kinds of information and process different sensory experiences into thoughts in order to make decisions. One aspect less aligned with our definition of interaction is that we only used buttons to interact with the scenarios, which is a very simple physical interaction. We wish we could have used more interesting ways of delivering information, such as something like a POS machine that prints out the result, which would be both interesting and reinforce the restaurant context, but that was hard to put into practice.

User interacting with the project at the final show

Nevertheless, we are glad that in most user tests and in the presentation, users went through the scenarios with interest, and afterwards they all shared their feelings about the impact of social media. Most people who changed their choices after being primed by the videos or news/advertisements reported that they had never realized social media had such influence on them. We were also happy to see that some people were not influenced by social media at all, and those users were satisfied to find there was feedback tailored to them as well. Since our project does not aim to tell users which information is correct and which is not valid, but rather to make them more aware of all the information that might change their choices or behavior, it is great to see different outcomes while everyone still gets the intended reflection on their previous choices. I was also surprised to hear stories from users who had been influenced by the exact news or advertisement we included (three users reported having used the SlimFast shake because of its claimed effect). I even received several suggestions at the final show to use this project for social research in this area.

If we had more time, the first thing we would do is make the instructions clearer and more visible. We might add more buttons to the menu so there is no need to ask people to press a particular key, which is somewhat confusing, and we would try to make something that really looks like a menu. Another thing we wanted to do is give the third scenario two versions. Some people reported in the user test that there is too much information and they lose interest in the long "lectures" of news. So we considered two different ways to read the news: a detailed version, which is the current one, and a brief "skim" version in which all the news and big headlines move across the screen while the voiceover is a combined soundtrack of varied voices telling different news items or slogans, as sketched below. The "skim" version tries to create the effect of an "explosion of information," so that even users without the patience to read the news one by one still get the idea and some big headlines. Also, with more time, I would like to get IRB approval and use this project as a device to observe and record people's food choices after being primed with social media, and do research on this topic.
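
A rough sketch of how the "skim" version could look (the headline texts are placeholders and the speeds are arbitrary): several headlines scroll across the screen at different speeds to give the feeling of an explosion of information.

String[] headlines = {
  "New shake promises fast weight loss",
  "Experts question trendy diet claims",
  "Is this snack really healthy?"
};
float[] xPos;

void setup() {
  size(1200, 400);
  textSize(28);
  xPos = new float[headlines.length];
  for (int i = 0; i < headlines.length; i++) {
    xPos[i] = width + i * 300;            // stagger the starting positions
  }
}

void draw() {
  background(0);
  fill(255);
  for (int i = 0; i < headlines.length; i++) {
    text(headlines[i], xPos[i], 80 + i * 100);
    xPos[i] -= 2 + i;                     // each headline scrolls at a different speed
    if (xPos[i] < -600) xPos[i] = width;  // wrap around once it leaves the screen
  }
}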

The most important thing we learned from these failures, apart from coding, is to think from the user's side. Because we are the ones who have researched this topic extensively and know what our project is and what it aims to do, the "common knowledge" or "things easy to figure out" in our minds are often not obvious to users at all, which confuses them and leaves them lost about what to do. It is also important to target a specific audience group and keep it in mind when deciding on interaction methods; the context in which the project will be shown also needs to be considered in order to create a coherent experience. But having a specific audience group does not mean everyone will behave or react the same way. We observed how users interacted with our project during the user test and were amazed at the different ways they did so. So if we want everyone to behave in a similar way (for example, we want everyone at least to go through the three scenarios), there should be explicit guidance, or ways to restrict users' control over the project (for example, a timer that keeps users from skipping everything). At the same time, differences should be allowed and valued, so there need to be responses tailored to users' different choices and decisions.

User interacting with the project at the final show

Another essential lesson from all the work done in 826 is to never assume any job will be easy and simple. Whenever I assumed some part of the coding was easy and believed I could finish it in minutes, bugs turned up. For example, I thought the second scenario would be the easiest to code because it is purely movie playing and the other interaction is the same as in the first scenario, which I could copy. But errors appeared, and we later found out it was because .mov files do not work properly in Processing. Then more problems popped up, such as the movie being shown as a still image, and images that were supposed to stay on screen for a while just flashing. So every part of the work needs to be dealt with carefully, and when you make your plan there should be enough time left to accomplish each step.
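
For reference, a minimal pattern for playing a clip with Processing's Video library, assuming the .mov files are re-exported as .mp4 (the file name here is a placeholder):

import processing.video.*;

Movie clip;

void setup() {
  size(1200, 800);
  clip = new Movie(this, "scene2.mp4");   // placeholder file name
  clip.play();
}

void movieEvent(Movie m) {
  m.read();                               // grab each new frame as it arrives
}

void draw() {
  background(0);
  image(clip, 0, 0, width, height);
}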

Our project aims to raise awareness of the impact of social media on food choices. Studies have shown that food choice is essential for both physical health and psychological wellbeing. Blindly using certain foods to lose weight quickly can lead to bulimia or other eating disorders as well as depression or anxiety, while the excessive consumption of certain foods causes health issues such as obesity or heart disease. In recent years, with the development of social media such as live streaming, photos, and short online videos, which are often used to popularize a product or a trend, the ideology of regarding "thin" as pretty prevails, and massive amounts of information circulate about certain food products and their pros and cons. These things ultimately alter people's choices, sometimes in the intended direction and sometimes with the opposite result. With our project, users experience for themselves how some social media works to influence their choices unconsciously, and the awareness they gain from this experience will help them follow their own judgment instead of being easily swayed by the information around them.