Final Post by Kaycee

Project Title: You Are the Superhero!

Partner: Eugene

Instructor: Young

At first, we set all three heroes to be controlled by buttons. However, after the user test we found the game was too unfair for the villain to win: the user controlling the superheroes could simply press all three buttons at once, leaving Thanos little chance to escape. Therefore, we replaced the middle button with a sound sensor; when the level it detects exceeds 95 dB, the middle hero shoots. Because the sound sensor was quite small and users had to bend over to reach it, I put it inside a prop microphone. I made the prop microphone out of plastic, because this material is easy to bend and trim. A metal microphone and a 3D-printed microphone were also options, but after weighing them we still thought the plastic microphone was the best choice: the sensor had to be stuck to the inside of the microphone's head to capture the user's voice well, and both the metal and the 3D-printed microphones were too hard to cut open with scissors to fit the sensor inside. Another adjustment we made based on the user test was the size of the buttons. At first we used the small buttons from the kit, which were hard to reach on the breadboard because of the dense jumper cables, so we borrowed two white buttons with a radius of roughly 2 centimeters to better fit the users' hands.
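
A minimal sketch of the voice trigger on the Processing side, assuming the Arduino forwards the sound sensor's reading over serial the way the class templates do (the variable and function names here are placeholders, not our actual code):

int soundLevel = 0;            // updated from myPort.read() in the real sketch
boolean middleHeroShooting = false;

void checkVoiceTrigger() {
  if (soundLevel > 95) {       // our threshold, roughly 95 dB on the sensor's scale
    middleHeroShooting = true; // start the middle hero's attack animation
  }
}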

The most significant step in our production process was coding. Our main problems were in communication between me and my partner, because coding is a relatively subjective process that requires continually referring back to earlier parts. Each of us had our own coding habits and ways of naming variables, and getting my partner to understand the part of the code I wrote was difficult and time-consuming. We also met difficulties while producing the device. We planned to make two boxes, one for the user controlling the villain and the other for the user controlling the three superheroes. Because the joystick was all the villain's player needed, I laser-cut a small cube with a side length of 10 centimeters, leaving a circle on the top of the box for the joystick to pass through and a square on one side for the jumper cables. However, in the user test we found that the joystick was not stable when attached to the inside of the top with only double-sided tape: if the user pressed hard, the joystick would fall. So I cut some small squares of cardboard and put them inside the box to support the joystick.

The main goal of our project is to give fans of this series a way to relax in their leisure time. Our intended audience is the young generation who enjoy the Marvel movies, as well as game lovers, and this kind of shooting game exercises their responsiveness. In my opinion, the project aligns with my definition of interaction, which is that two or more objects listen, understand, and respond to each other. The project has all three steps: when the user pushes a button, the computer interprets the circuit and the code to run the correct animation and produce the effect. And it is not only a one-way interaction; the user also has to observe the position of the villain in order to hit it precisely and defeat it as quickly as possible. As for what doesn't align with my definition of interaction, I think the ways to interact in this project are not varied enough, with only two: pressing and shouting. It also doesn't embody my idea of a strong interactive project, which requires real-time interaction. If I had more time, I would add a face-detection function at the end of the game. If the superheroes win, the camera would capture the user's face and place it on top of the costume of one of the three superheroes; if Thanos wins, the user would "wear" Thanos instead. This function would make the interaction in my project more real-time. From the process of producing my final project I gained a lot, not only academically but also in how to cooperate with a partner. I became more aware of the importance of a sensible division of labor, which will benefit me in my future academic work.
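
As a rough sketch of how this face-detection idea might work, assuming the OpenCV for Processing library and a placeholder costume image called costume.png (neither is part of the current project):

import processing.video.*;
import gab.opencv.*;
import java.awt.Rectangle;

Capture cam;
OpenCV opencv;
PImage costume;  // hypothetical costume image, not part of the current project

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
  opencv = new OpenCV(this, 640, 480);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  costume = loadImage("costume.png");  // assumed asset name
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  background(255);
  image(costume, 0, 0, width, height);   // draw the hero (or Thanos) costume
  opencv.loadImage(cam);                 // look for a face in the camera frame
  Rectangle[] faces = opencv.detect();
  if (faces.length > 0) {
    // copy the first detected face and place it where the costume's head would be
    PImage face = cam.get(faces[0].x, faces[0].y, faces[0].width, faces[0].height);
    image(face, width/2 - 50, 40, 100, 100);  // head position is a rough guess
  }
}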

Recitation 10: Media Controller by Kaycee

For this recitation, I intended to use the camera and pixel functions I learned in class together with an infrared distance sensor to make a device that changes the apparent resolution of the camera image according to the distance the infrared sensor reads.

I met some challenges coding this program. The range of the infrared distance sensor I used was 20 cm to 150 cm. In my first trial, I set the lower end of the mapped range to 0, which caused an error: when I pressed the run button, only a black canvas showed on my screen. After I changed the lower end of the range from 0 to 1, the camera image appeared on the canvas. The image looked unclear at first, but after reinforcing each connection the device finally produced the effect I wanted. When the user came close to the infrared sensor, the ellipses on the screen became smaller, making the image look sharper; when the distance grew, the radius of the ellipses increased, which made the image blurry again.

After finishing "Computer Vision for Artists and Designers", I thought more about "algorithm communication". This is exactly where those brilliant works made by computers come from, and I used this technique in my project as well. In fact, all the code I have written so far can be described as algorithm communication, since it relies on simple code to produce complicated effects.

Code:

Processing

// IMA NYU Shanghai
// Interaction Lab
// This code receives one value from Arduino to Processing

import processing.serial.*;
import processing.video.*;
Capture cam;
int Size = 10;

Serial myPort;
int valueFromArduino;
int PORT_INDEX=5;

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();

  printArray(Serial.list());
  // this prints out the list of all available serial ports on your computer.

  myPort = new Serial(this, Serial.list()[PORT_INDEX], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----"
  // and replace PORT_INDEX above with the index number of the port.
}

void draw() {
  if (cam.available()) {
    cam.read();
    cam.loadPixels();
  }
  while (myPort.available() > 0) {
    valueFromArduino = myPort.read();
  }
  // map the sensor reading to the size of the ellipses;
  // constrain keeps the step at least 1 so the nested loops always advance
  int a = constrain(int(map(valueFromArduino, 200, 1500, 1, 10)), 1, 10);
  for (int y = 0; y < cam.height; y += a) {
    for (int x = 0; x < cam.width; x += a) {
      int i = y*cam.width + x;
      fill(cam.pixels[i]);
      noStroke();
      ellipse(x, y, a, a);
    }
  }
  //image(cam,0,0,width,height);
  cam.updatePixels();

  println(valueFromArduino); // this prints out the value received from Arduino
}

Arduino

// IMA NYU Shanghai
// Interaction Lab
// This code sends one value from Arduino to Processing

const int pin = A0;  // analog pin for the infrared distance sensor (pin number assumed)

void setup() {
  Serial.begin(9600);
  pinMode(pin, INPUT);
}

void loop() {
  uint16_t sensorValue = analogRead(pin);
  double distance = get_IR(sensorValue);  // approximate distance in cm
  Serial.write(sensorValue);              // send one byte to Processing
  // Serial.print(distance);              // Serial Monitor debugging only; extra prints
  // Serial.println(" cm");               // would corrupt the byte stream Processing reads
  delay(10);
}

// converts the raw analog reading to an approximate distance in centimeters
double get_IR(uint16_t sensorValue) {
  if (sensorValue < 16) sensorValue = 16;
  return 2076.0 / (sensorValue - 11.0);
}

Recitation 11: Workshops on Object Oriented Programming by Kaycee

For this recitation, I chose the Object-Oriented Programming workshop because I will make a game for my final project, and this workshop could benefit me the most.

Based on what I learned in class on Thursday, I made some improvements to the code and added interaction functions.

I changed the mouse interaction from producing more balls to making balls disappear. Since the two kinds of interaction couldn't run at the same time, I deleted the original code and set the number of balls created in setup() to 20. In my first trial, I directly added an "if" statement saying that the ball the mouse clicked on would disappear. However, my change didn't work, because the program couldn't recognize the radius of each ball. I realized that to achieve my purpose, the Ball class needed another attribute representing its size (a sketch of the class is included after the code below); when the position where the mouse clicks falls within the radius of a circle, that circle disappears.

I added keyboard interaction to make the sketch more interactive. Since I kept the bounce code, the balls still move in the opposite direction when they touch the edge of the canvas. In addition, users can control the movement of the balls with the arrow keys. However, when I pressed one of the arrow keys, although all the balls generally moved in the intended direction, some balls ignored the bounce code and disappeared at the edge of the canvas, and I wondered why this happened.

Code:

ArrayList<Ball> balls = new ArrayList<Ball>();

void setup() {
  size(500, 600);
  noStroke();
  balls = new ArrayList<Ball>();
  for (int i = 0; i < 20; i++) {
    balls.add(new Ball(width/2, height/2, random(30, 75)));
  }
}

void draw() {
  background(255);
  for (int i = 0; i < balls.size(); i++) {
    Ball b = balls.get(i);
    b.move();
    b.bounce();
    b.display();
    // nudge every ball with the arrow keys while one is held down
    if (keyPressed && key == CODED) {
      if (keyCode == UP) {
        b.y--;
      }
      if (keyCode == DOWN) {
        b.y++;
      }
      if (keyCode == LEFT) {
        b.x--;
      }
      if (keyCode == RIGHT) {
        b.x++;
      }
    }
  }
  if (mousePressed) {
    // iterate backwards so removing a ball doesn't skip the next one
    for (int i = balls.size()-1; i >= 0; i--) {
      Ball b = balls.get(i);
      float diameter = b.w;
      float dx = abs(mouseX - b.x);
      float dy = abs(mouseY - b.y);
      float distance = sqrt(sq(dx) + sq(dy));
      if (distance <= diameter) {
        balls.remove(i);
      }
    }
  }
}
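
The Ball class itself is not included in the code above. A minimal sketch of what it might look like, assuming fields x, y, a diameter w, and simple speed variables (the names and speeds here are my guesses, not the original class):

class Ball {
  float x, y;            // position
  float w;               // diameter
  float xspeed, yspeed;  // movement per frame

  Ball(float tempX, float tempY, float tempW) {
    x = tempX;
    y = tempY;
    w = tempW;
    xspeed = random(-3, 3);
    yspeed = random(-3, 3);
  }

  void move() {
    x += xspeed;
    y += yspeed;
  }

  void bounce() {
    // reverse direction when a ball reaches the edge of the canvas
    if (x < w/2 || x > width - w/2) {
      xspeed = -xspeed;
    }
    if (y < w/2 || y > height - w/2) {
      yspeed = -yspeed;
    }
  }

  void display() {
    fill(0, 150, 255);
    ellipse(x, y, w, w);
  }
}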

Reflection 9: Final Project Process by Kaycee

In this recitation, our group of four looked into each other's final project essays and helped one another improve on the initial state of our plans.

Ellie intended to make a game called "Navigating in Space". It is basically a first-person game in which the user controls the direction of a spaceship with a joystick to avoid the meteorites and space junk floating in the universe. Ellie already had a background and a model image of the meteorites. She would mainly create the feeling of navigating by moving the background image: the joystick would actually control the various stones rather than the spaceship, by coding their direction of movement in reverse. During our discussion, we agreed that the initial idea of this project was intriguing, but the playability of the game still had much room to improve. One piece of advice we gave was to add a shooting function to the spaceship to make the whole game more involving. Actually, it could be more than a game. Since Ellie already designed space junk into the project, the underlying message could be conveyed easily. She could add more educational meaning, for instance by placing some planets along the way and, each time a planet appears, pausing the game to show a brief introduction of that planet. By increasing the educational meaning and playability of the project, Ellie could more easily reach her initial goal of making more people interested in space exploration and aware of the hazard of space junk.

Mimi's project had the most practical meaning among ours. She wanted to create a maze in which the user can only figure out which way to go by hearing the sound of the wind. The user plays with earphones and, by distinguishing whether the wind blows from the left or the right channel, chooses the way out. This game interested me a lot, because our projects usually focus on vision rather than the other senses. Mimi's project reduces our reliance on sight and transfers our attention to hearing. With it she wanted to remind people of the importance of senses other than vision and to give us a taste of the experience of the blind. She could improve the project by adding a punishment element: for example, if the user chooses the wrong way there could be a warning sound, and if the user doesn't correct the choice, a monster could be waiting at the end of the road.

Skyler also intended to create a game, and she already had a demo for group members to play with. There is a young snake at the beginning whose goal is to eat as many apples as it can. The user presses the arrow keys to control the movement of the snake, and each time an apple is eaten the snake grows longer, increasing the difficulty of the game. If the snake touches the edge of the canvas or itself, the game ends and the score is shown on the screen. It is basically a classic Snake game. What makes her project different from the traditional version is that the arrow keys move the snake in the reverse direction, which further increases the playability of the game. We all agreed that the game was already in good shape and only gave advice on beautifying it.

Our definitions of interaction were quite similar. What I would still like to mention is that Mimi's project completed my definition to some degree. Before this, I had focused on interaction between objects through vision, and I hadn't expected that other senses such as touch and hearing can also form a complete interaction.

I also received a lot of advice on my final project. My group members' suggestions all focused on increasing its playability. For example, the game could be made into a multi-player game, in which another user controls the villain and fights back against the heroes. Also, adding combinations between different heroes, instead of simply matching each button to one character, would make the game more interesting. I will selectively integrate this advice into my project to improve it.

In conclusion, this reflection was quite helpful for each of us to get new ideas on how we can improve our projects. Gathering each member’s perspective efficiently reveals the weaknesses of our projects.

Final Project Essay by Kaycee

Project name: You are the SUPERHERO

Project partner: Eugene

In the stage of finding inspiration, we searched online for creative projects, considering serial communication between Processing and Arduino. Our initial idea was to use sensors to increase the interaction in the project. However, the game "Flappy Bird" gave us a new idea. The basic way to play Flappy Bird is to tap on the phone screen, and each tap makes the bird rise a little. The game generates an endless series of bars with different widths and different gaps for the bird to pass through. We decided that we were more interested in building a similarly simple simulation game in Processing, in which users control the movement of elements of the animation on the screen to produce a sense of interaction. When thinking about the project further, we chose "Avengers" as the topic, since as a simulation game it should be playable enough for the young generation and cater to them. What's more, "The Avengers" is also a hot topic recently, so we decided to use the format of a battle between the Avengers and Thanos. We expect the main challenge we will confront is writing the code. Besides this, we may also encounter tasks like building the circuit.

Our present plan for the final project involves four characters from the movies: three heroes and a villain. We will first load images of these four characters and their weapons so that users feel familiar with them. The villain will move steadily on the right side of the canvas with full HP at the start. The three heroes will be placed at the top, middle, and bottom of the left side of the canvas, and each of them is connected to a button on the breadboard. When the user pushes a particular button, that hero uses his matching weapon to try to hit the villain; if he succeeds, the villain's HP drops by a certain amount. If the HP drops below 0, the user wins the game and the costume of one of the heroes appears in the center of the screen; at the same time, the camera detects the user's face and projects it onto the canvas to fit the costume. To increase the playability of the game, we intend to make the villain's HP regenerate slowly, so that he is not too easily defeated. All of the coding should be roughly finished three days before the presentation day so that we can adjust if any problems appear, and the rest of the work, including building the circuit, should be finished within those three days.
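
As a rough sketch of how the villain's HP could be handled in Processing (the variable names, regeneration rate, and helper functions below are placeholders for illustration, not our final code):

float villainHP = 100;       // the villain starts with full HP
float maxHP = 100;

void updateVillain() {
  // the HP slowly grows back so the villain is not too easily defeated
  villainHP = min(villainHP + 0.05, maxHP);
}

void heroHits(float damage) {
  // called when a hero's weapon reaches the villain
  villainHP -= damage;
  if (villainHP <= 0) {
    showWinningScreen();     // placeholder: draw the costume and the detected face
  }
}

void showWinningScreen() {
  // in the final version this would show the costume with the user's face on top
}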

Games like "Flappy Bird" and the recent Marvel movies gave us the inspiration to build a simple simulation game. In my opinion, the project aligns with my definition of interaction, which is that two or more objects listen, understand, and respond to each other. The project has all three steps: when the user pushes a button, the computer interprets the circuit and the code to run the correct animation and produce the effect. Detecting the user's face and matching it with the costume is also a form of interaction engaged in the project. And it is not only a one-way interaction; the user also has to observe the position of the villain in order to hit it precisely and defeat him as soon as possible. What is unique about our project is that, although it is an ordinary form of simulation game, the topic is creative, and the final animation of the costume with the detected face increases the playability of the game. Our intended audience is the young generation who enjoy the Marvel movies, as well as game lovers, and we anticipate its impact as a way for fans of this series to relax in their leisure time.