COLORFUL BOARD

Project Title: Colorful Land

Designer: Yuxing Li

Conception and Design

     The project is based on the original game Candy Land, a board game for 2-4 players. I planned to add more interactive parts to make it more playable and interesting, so I designed specific game mechanisms such as LED lights, music, a popping ice cream, and candy falling from the sky.

     The primary objective of this project is to give students a chance to unwind and find a moment of respite from their hectic lives through gaming. A carefully designed game can offer a brief escape from the pressures and demands of academic and personal commitments, and this project recognizes that balance and leisure contribute to a student's overall happiness and productivity.

       Through engaging gameplay and playful features, the project aims to create an immersive and enjoyable experience for students. By offering a brief break from daily responsibilities, the game serves as a means of relaxation, helping players detach from the stresses of their studies so they can recharge and return to their tasks with renewed focus and energy.

        In summary, the project prioritizes students' well-being by giving them a way to momentarily step away from their busy lives through a thoughtfully designed game. By offering relaxation, mental rejuvenation, and a bit of skill development, it seeks to contribute positively to the students' academic and personal journeys.

Fabrication and Production

Before the user testing:

         In the beginning, I sketched the board by hand. This hands-on approach allowed me to visualize the initial framework of the game and lay the foundation for its development. My initial idea was to use students doing assignments as the backstory for the game, linking the interactive part of the game to the grading of assignments and learning progress in the real world.

       For the user testing, I printed out the hand-drawn board, soldered the circuit, connected the pieces to the board circuit with metal tape, got the code working, and played a video to show the player what to do next. I also 3D-printed the chess pieces and decorations.

After the user testing:

       I received suggestions for adding interactive fixtures and improving the look of the project, so I decided to redesign the board and laser cut it to make it look more like a board game, as well as re-solder the wires and metal tape.

       I also decided to add a strip of LED lights. When a piece closes the circuit, the pixels change from the original bright rainbow pattern to a scrolling color animation.

Processing:

       After the appearance was completed, I began to write the code. First, I needed to write Processing code that links each chess piece to a specific grid square; once a piece makes the connection, the corresponding video appears on the computer screen, telling the players to move forward, move back, or stay for a round. So I hand-painted 11 different videos and matched them with different pins.

 

At the same time, I chose Mario as the game’s victory interface.

For the Arduino side, I coded the pixel strip so that it scrolls through its colors after each piece connects.
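The Arduino sketch itself is not included in this post, so below is a minimal sketch of what that side could look like, assuming the twelve board squares are wired to digital pins 2-13 (closed to ground through the metal tape), the strip is an Adafruit NeoPixel strip on pin A0, and the piece states are sent to Processing as one comma-separated line of 12 values, which is the format the Processing code expects. The pin numbers, strip length, and library choice are my assumptions, not the original wiring.

#include <Adafruit_NeoPixel.h>

const int NUM_SQUARES = 12;     // one value per grid square, matching the Processing sketch
const int FIRST_INPUT_PIN = 2;  // hypothetical: squares wired to digital pins 2-13
const int PIXEL_PIN = A0;       // hypothetical data pin for the LED strip
const int PIXEL_COUNT = 30;     // hypothetical strip length

Adafruit_NeoPixel strip(PIXEL_COUNT, PIXEL_PIN, NEO_GRB + NEO_KHZ800);
long hueOffset = 0;  // advances while a piece is connected, making the colors scroll

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_SQUARES; i++) {
    pinMode(FIRST_INPUT_PIN + i, INPUT_PULLUP);  // the metal tape closes the pin to ground
  }
  strip.begin();
  strip.show();
}

void loop() {
  bool anyConnected = false;

  // Send one comma-separated line of 12 values over serial.
  for (int i = 0; i < NUM_SQUARES; i++) {
    int connected = (digitalRead(FIRST_INPUT_PIN + i) == LOW) ? 1 : 0;
    if (connected == 1) anyConnected = true;
    Serial.print(connected);
    if (i < NUM_SQUARES - 1) Serial.print(",");
  }
  Serial.println();

  // Static rainbow by default; shift the hue each loop while a piece is connected.
  if (anyConnected) {
    hueOffset = (hueOffset + 256) % 65536;
  }
  for (int i = 0; i < PIXEL_COUNT; i++) {
    long hue = (hueOffset + (long)i * 65536 / PIXEL_COUNT) % 65536;
    strip.setPixelColor(i, strip.gamma32(strip.ColorHSV(hue)));
  }
  strip.show();

  delay(50);
}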

Final look:

 
 

Conclusions:

       During the final presentation, the LED strip of my project would not light up properly. After checking, I found it was a connection problem, but because of the large number of wires I could not find which one was disconnected. The videos also kept freezing during the presentation because there were so many of them. These are the two points I need to fix and improve.

       At the same time, I should enrich the narrative of my board game, for example by tying the homework-themed backstory more closely to my final project: on some squares the player fails an exam, and on others they get an A on the final.

The cleaned project:
 

Code: 

Processing:

import processing.serial.*;
import processing.video.*;
Movie movie1;
Movie movie2;
Movie movie3;
Movie movie4;
Movie movie5;
Movie movie6;
Movie movie7;
Movie movie8;
Movie movie9;
Movie movie10;
Movie movie11;
Serial serialPort;

int NUM_OF_VALUES_FROM_ARDUINO = 12;  /* CHANGE THIS ACCORDING TO YOUR PROJECT */

/* This array stores values from Arduino */
int arduino_values[] = new int[NUM_OF_VALUES_FROM_ARDUINO];

PImage img;
color[] colors = new color[5];
int option;
boolean isBackgroundVisible = true;

void setup() {
  fullScreen();  // run full screen; size() is not needed when fullScreen() is used
  printArray(Serial.list());

  img = loadImage("win.png");

  colors[0] = color(#4FE561);
  colors[1] = color(#3AA2F0);
  colors[2] = color(#EFF03A);
  colors[3] = color(#F04333);
  colors[4] = color(#A022F0);

  frameRate(30);
  movie1 = new Movie(this, "movie1.mov");
  movie1.loop();
  movie2 = new Movie(this, "movie2.mov");
  movie2.loop();
  movie3 = new Movie(this, "movie3.mov");
  movie3.loop();
  movie4 = new Movie(this, "movie4.mov");
  movie4.loop();
  movie5 = new Movie(this, "movie5.mov");
  movie5.loop();
  movie6 = new Movie(this, "movie6.mov");
  movie6.loop();
  movie7 = new Movie(this, "movie7.mov");
  movie7.loop();
  movie8 = new Movie(this, "movie8.mov");
  movie8.loop();
  movie9 = new Movie(this, "movie9.mov");
  movie9.loop();
  movie10 = new Movie(this, "movie10.mov");
  movie10.loop();
  movie11 = new Movie(this, "movie11.mov");
  movie11.loop();


  serialPort = new Serial(this, "/dev/cu.usbmodem101", 9600);
}

void draw() {

  noStroke();
  if (keyPressed) { //button pressed, then pick random color
    if (key == 'c') {
      fill(colors[int(random(0, 5))]);
      rect(0, 0, width, height);
    }
  }

  stroke(255);
  fill(255);
  getSerialData();

  if (arduino_values[2] == 1) {
    if (movie1.available()) {
      movie1.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie1, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[3] == 1) {
    if (movie2.available()) {
      movie2.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie2, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[4] == 1) {
    if (movie3.available()) {
      movie3.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie3, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[5] == 1) {
    if (movie4.available()) {
      movie4.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie4, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[6] == 1) {
    if (movie5.available()) {
      movie5.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie5, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[7] == 1) {
    if (movie6.available()) {
      movie6.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie6, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[8] == 1) {
    if (movie7.available()) {
      movie7.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie7, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[9] == 1) {
    if (movie8.available()) {
      movie8.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie8, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[10] == 1) {
    if (movie9.available()) {
      movie9.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie9, -width, -height, width, height);
    popMatrix();
  }
  if (arduino_values[11] == 1) {
    if (movie10.available()) {
      movie10.read();
    }
    pushMatrix();
    scale(-1, -1);
    image(movie10, -width, -height, width, height);
    popMatrix();
  }


  if (arduino_values[10] == 1) {
    option = 10;
  }
  if (option == 10) {
    win();
  }
}

void getSerialData() {
  while (serialPort.available() > 0) {
    String in = serialPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
    if (in != null) {
      print("From Arduino: " + in);
      String[] serialInArray = split(trim(in), ",");
      if (serialInArray.length == NUM_OF_VALUES_FROM_ARDUINO) {
        for (int i=0; i<serialInArray.length; i++) {
          arduino_values[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

void win() {
  // img ("win.png") is already loaded in setup(), so it does not need to be reloaded here
  if (isBackgroundVisible) {
    image(img, 0, 0, width, height);  // draw the win screen
    fill(#FAF312);                    // yellow text
  } else {
    image(img, 0, 0, width, height);  // draw the win screen
    fill(#12FAEC);                    // cyan text
  }

  // draw the text in the center of the screen
  text("You Win!!", width/2, height/2);

  // flip the text color every 30 frames so it flashes
  if (frameCount % 30 == 0) {
    isBackgroundVisible = !isBackgroundVisible;
  }
}

void mousePressed() {
  // close the sketch when the mouse is clicked
  exit();
}

Midterm Project: Cloud – Yuxing Li – Andy

 MY PREVIOUS PROJECT

         My previous project was a machine that could scan and change people's emotions, using pictures to lift them out of a low mood. In this project, we again designed a tool that can change mood. We designed it to look like a musical instrument: people can control it themselves and play soothing music to ease anger or anxiety.

         From those projects I took two ideas: first, the innovation of using different tools to realize imagination, and second, using devices that interact with humans while also having some effect on them.

        What’s unique about our project is that we don’t use a buzzer to make the instrument sound, but a DC motor that strikes wind chimes to make the music.

         Our project focuses on people who are anxious or angry. When they pinch the project's sensor, it starts the DC motor that plays the wind chimes, which produces relaxing music that soothes the mood.

CONCEPTION AND DESIGN

         

         We designed the interaction so that the user vents their anxiety by squeezing a sensor and receives soothing music as feedback. We wrapped an analog pressure sensor in cotton; the user starts the DC motor by squeezing the cotton, and the squeezing action itself is a kind of pressure release.

 

          We designed the device as a musical instrument and used an analog sensor to sense the pressure of the user's grip. For the appearance of the project, we wrapped the exterior in cotton and used blue and white fabrics to create a relaxed, soft, and soothing environment.

          We chose these materials because we wanted the project to create a relaxing environment.

          An analog sensor was the best choice because it directly translates the user's anxiety and anger into the movement of the DC motor, which makes the wind chimes sound.

          Our other option was a microphone sensor, but we rejected it because some users may not want to vent their anxiety and anger through shouting, and may prefer a quieter, more internal form of catharsis.

FABRICATION AND PRODUCTION

          The most significant step in our production process was choosing the materials. At the beginning we prepared only one wind chime, but it was difficult for it to make a pleasant sound and it felt a little monotonous, so we increased the number of wind chimes to four and added bells to make the instrument sound more varied.

         

         We also had a problem with how to make the wind chimes sound. At first we planned to use a fan to blow them, but it was not strong enough to produce any sound. After that, we tried striking the middle of the wind chimes with a DC motor, but the sound was too rapid to be soothing.

         

         Finally, we mounted the DC motor at the top of the wind chime's cord so that it rattles the chime to make it sound, and we used an analog pressure sensor instead of the earlier knob so that the user's interaction itself feels like releasing pressure.

       

         In the process, I was responsible for designing and building the overall shape and decoration of the project, and for selecting an appropriate sensor to fit the theme. As a team we discussed the selection and debugging of the sensors together, and assembled the circuit into the built external structure together in the final stage.

          At the time of the user test, our project had not been completed yet. Our initial design was for users to hold an electric fan and strike the wind chimes to make them sound. After the user test, however, we found this method was not safe, it was hard for users to find the right striking position, and the sound was not soothing enough. So we attached the motor to the top of the project, and the user only needs to squeeze the sensor.

          For the appearance of the project, we initially only planned to hang the wind chimes on the cardboard, but another group of users suggested adding a window or screen to hide the wind chimes, which would make it look more like a musical instrument.

CONCLUSIONS

          The goal of our project was to design an instrument that plays soothing music to ease the user's anxiety and anger, and that stays interactive while the user squeezes the sensor.

          Users can play soothing music on their own, and when they squeeze the sensor they receive wonderful music back, which improves their mood.

          If we have more time, we plan to add more types of instruments to make the music even more beautiful. And we want to add more controllers to give users more freedom to play their own music.

 

OUR CODE

#include <Servo.h>

//**** servo 1 settings
Servo servo1;
const int servo1SensorPin = A0;  // analog pressure sensor for servo 1
const int servo1Pin = 9;
int servo1Value;
int servo1map;
//**** servo 1 settings END

//**** servo 2 settings
Servo servo2;
const int servo2SensorPin = A1;  // analog pressure sensor for servo 2
const int servo2Pin = 10;
int servo2Value;
int servo2map;
//**** servo 2 settings END

void setup() {
  Serial.begin(9600);
  servo1.attach(servo1Pin);
  servo2.attach(servo2Pin);
}

void loop() {
  // Read the first pressure sensor twice with a short pause so the reading settles
  servo1Value = analogRead(servo1SensorPin);
  delay(10);
  servo1Value = analogRead(servo1SensorPin);
  delay(10);
  // Map the 0-1023 analog range onto the 0-180 degree servo range
  servo1map = map(servo1Value, 0, 1023, 0, 180);
  servo1.write(servo1map);
  Serial.println(servo1Value);

  // Read and map the second pressure sensor the same way
  servo2Value = analogRead(servo2SensorPin);
  delay(10);
  servo2Value = analogRead(servo2SensorPin);
  delay(10);
  servo2map = map(servo2Value, 0, 1023, 0, 180);
  servo2.write(servo2map);

  delay(15);
}

CJ’s Project 1- Drooz

Our Artifacts

     Our artifact is called ‘Drooz’, a device that can scan people’s emotions and cure them, though it is a technique that only works on people who believe in it.

      From my perspective, the success of our project depended on our performance prop designs: all of the props are closely related to our background story and connect well with each other, which gave us a complete performance. I also think our background story is interesting; based on the second given story, we presented our group’s solution, a device that cures bad emotions, and the two sides of the technique we showed also represented the positive and negative effects of today’s technology. The live music made our presentation more interactive as well. So it did relate to the established form of interaction I identified in my research.

     As for the failures, I think the main issue was our performance. Due to the time limit, we cut some of the dialogue, so the story may not have felt as rich as it could. That is the change we need to make.

Other’s Artifact

     The artifact that impressed me most was Group D’s ‘The Suit’, which is based on the third story: they created armor that can protect people from being hurt and make them stronger. Their performance was really interesting, fun, and super creative, but I reckon their artifact might lack interactive connections, so maybe they can improve that next time.

     The teamwork dynamics, the different roles, and the task allocations

     The idea of our project came from Luna, and she also painted the healing picture. Jean and Sylvia created the box (machine) and colored it. Ellen wrote the first version of our script. Fred practiced the healing music. And I created the scanner.

  • Below are our script roles:

(Ellen) Patient 1- sad 

(Jean) Patient 2-angry/anxious

(Sylvia) Doctor 1-monitor controller

(CJ) Doctor 2-scanner

(Luna) Doctor 3-panel controller

(Fred) Saxophone Robot

Script

*The doctors at their positions and the saxophone robot heading down*

*Patient 1 walks into the room* 

Doctor 1: “Good morning, how can we assist you today? How are you feeling?”

P1: “I’ve been having the worst week, I got an F on my assignment and I don’t know what to do. I can’t focus, and I can’t study through these tears!” *cries*

Doctor 1: “I see. No worries, we can fix that. Just step over here and we can see how to treat you” 

*directs person 1 to scanning area*

Doctor 2: “I’ll just scan you real quick, so we can better assess what you need.”

P1: “okay. . . “

*doctor 2 scans patient 1*

*everyone turns their heads to the tv monitor*

*doctor 1 flips the cover of the tv to reveal the poster*

Doctor 1: “Um. . .”

Doctor 3: “oops! Sorry, I was watching something. You can just switch the channel”

*Doctor 1 flips the card on the monitor to loading and then to first vitals card*

Doctor 2: “Hm, it does seem that your vitals are a bit unstable, the machine has determined that you are suffering from sadness.”

*Doctor 1 put the SAD brick on the TV monitor*

Doctor 1: “No worries, though, we can treat this easily! Come over here and have a seat.”

*Patient 1 sits down. Doctor 1 and 3 puts glasses and wristbands on Patient 1*

Doctor 3: “Now just listen, watch and relax.”

Patient 1: “Listen?”

*Doctor 3 shows the blue panel*

*The Saxophone Robot stands up straight and starts playing soothing music*

*Music stops and the doctors take the glasses and wristband off of patient 1*

Doctor 2: “How are you feeling now?”

Patient 1: (Acting healed and curious) “This is amazing! I don’t feel like crying anymore! Thank you!” 

*Doctor 2 scans P1 again and Doctor 1 flipped the brick into peaceful*

Doctor 3: “It seems you are all fixed. Have a nice day!”

*P1 exits stage*

*P2 enters stage feeling stressed*

Doctor 1: “Good morning, how can we assist you today? How are you feeling?”

Patient 2: “I just came to visit a friend in this town, and she said you were able to fix emotions. I have 5 assignments due today, and I’ve been so stressed lately. I was hoping that you could help me out.”

Doctor 2: “Of course! Come stand over here, so we can scan you to better understand how your body is feeling.”

*Doctor 2 scans patient 2*

*everyone turns their heads to the tv monitor*

*Doctor 1 flips the card on the monitor to loading and then to next vitals card*

Doctor 3: “Hmmm… you seem very stressed. Come sit here and listen so you can relax.” 

Patient 2: “Listen? What is all this?? What are you putting on my wrists? These glasses are weird. I don’t want to wear them.”

Doctor 3: “Just calm down, so we can help you.”

*Robot stands up straight and starts playing as panel shows gray*

*Doctor 3 switches the gray panel into the red one in the middle of the process*

Patient 2: *takes glasses off* “I don’t think this is working. I feel really strange. It’s like the images are making me feel worse”

Doctor 1: Maybe there was a mistake in your reading. It’s also weird that the panels switched automatically. Why don’t we scan you again?

*Doctor 2 scans patient again P2 is now angry – P2 looks at the monitor that says angry*

Doctor 2: See, that was the problem. It might have misread how you feel at the beginning. It seems you’re actually angry. Let’s try one more time. 

*They sit the patient down again and put the glasses and wristband on them*

*music starts playing normally but then starts sounding strange, notes are off*

Doctor 1: “hmmm…that’s never happened before”

*Patient 2 angrily throws glasses and wristband*

Patient 2: I knew this wasn’t going to work! I told my friend this was stupid. 

Doctor 3: don’t blame us. This machine fixes people every day, but you are the first that has caused this kind of reaction. 

*Patient 2 runs out and meets P1*

P1: What’s wrong? Why do you look so angry? Didn’t you go to the healing machine?

P2: I did. But I don’t know what’s wrong! I even feel worse after that. That machine totally makes no sense. 

P1: What do you mean?

P2: It just messes your emotions up. I don’t think it’s good to forcefully change people’s minds like that.”

P1: I don’t know, I like it. I already have an appointment with them for next week!

The End.

Some pictures and video

The video:

https://drive.google.com/file/d/1YQvceFG5Q1KHpETmeMzNksjV19SmBIB7/view?usp=sharing
