Final Project Essay (Alex Wang)

Alex Wang

Eric Parren

Interaction Lab

25 April 2019

Final Essay

My definition of interaction has been constantly developing and evolving ever since I started taking this course. As mentioned in my Final Preparatory Research and Analysis, my understanding of interaction has gone through three major changes. At first I understood interaction as the cycle of giving and receiving information; then, through the course readings and other content, I realized that it is more accurately described as the process of giving a certain input and getting certain feedback according to that input. The second great change happened during the production of my midterm project, when I decided that a good interaction should be natural to the human body: the mechanics we use to communicate between humans should be the same mechanics used to communicate with machines, so that the interaction feels natural. I built my first midterm prototype with that in mind and came up with a treasure lock that unlocks with hand gestures, which I believe is more natural and comfortable for the user than a traditional combination or password lock. But during user testing I made a big realization when the users were confused by my project, not knowing what to do with it without guidance from the designer. This led to the third major change in my understanding of interaction. I learned that not only should the medium for achieving the interaction be natural to the user, but every aspect of the design should be friendly and act as guidance, indicating the functionalities and expected feedback of the machine, and the machine should give the feedback that the user expects, just as Don Norman discussed in his research on door design.

With this refined definition of interaction in mind, I want to create some sort of game for my final project. I am currently thinking of an archery game with a bow and arrows. By combining Arduino hardware with Processing software, I would like the user to feel like they are really doing archery even though everything is digitally programmed and only virtual. A similar kind of interaction exists in virtual reality games, where the user is made to believe that they are actually shooting, as they physically aim and pull on the bow while receiving graphical feedback that matches their movement. I really appreciate this kind of interaction, as it lines up with my personal values on user experience and the natural feel of interaction, and I would like to try to replicate this experience with the limited resources available to me, since the hardware and software used in a VR console are pretty cutting-edge.

Recitation 8: Serial Communication (Alex Wang)

For this week's recitation, we merged the Arduino and Processing languages to create a physical project that runs code when connected to a computer. To connect the two, we communicate through the serial port, writing and reading information on both ends.
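As a rough illustration of the receiving side of this communication, the sketch below shows the kind of parsing involved when a line like "341,512" arrives over serial. It is plain Java rather than Processing, and the helper name is hypothetical; the actual recitation code used the communication code provided in class.

```java
// Minimal sketch (plain Java, not Processing) of how a serial line such as
// "341,512" can be split into separate sensor values on the receiving end.
public class SerialParse {

    // Hypothetical helper: splits a comma-separated line into integers.
    static int[] parseLine(String line) {
        String[] parts = line.trim().split(",");
        int[] values = new int[parts.length];
        for (int i = 0; i < parts.length; i++) {
            values[i] = Integer.parseInt(parts[i].trim());
        }
        return values;
    }

    public static void main(String[] args) {
        int[] v = parseLine("341,512\n");
        System.out.println(v[0] + " " + v[1]);
    }
}
```

The same idea works in the other direction: the sender writes values joined by a delimiter and terminated by a newline, and each side agrees on that format.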

Exercise 1:

Our first exercise was to create an Etch A Sketch, using potentiometers as the controls and feeding that data to Processing, where the image is drawn. At first I used the ellipse function to simulate drawing, but the ellipse keeps jumping if the potentiometer value changes too quickly; I tried adjusting the delay value, but it never seemed smooth enough.

I then decided to use the line function instead to make sure the drawing stays connected. The line function requires two sets of coordinates as parameters, so I took care of this by storing the previous value in a list before it refreshes and using it as the first pair of parameters for the line function, which looks much smoother.
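The bookkeeping behind this trick can be sketched in plain Java (the real sketch calls Processing's line(); here a list of segments stands in for the drawing, and the class name is my own):

```java
// Sketch of the "remember the previous point" trick used for smooth drawing.
// Instead of plotting isolated points, each new reading is connected to the
// previous one by a line segment, so fast jumps stay visually continuous.
import java.util.ArrayList;
import java.util.List;

public class EtchASketch {
    private int prevX, prevY;
    private boolean started = false;
    private final List<int[]> segments = new ArrayList<>();

    // Called once per new potentiometer reading (already mapped to pixels).
    public void addPoint(int x, int y) {
        if (started) {
            // in Processing this would be: line(prevX, prevY, x, y)
            segments.add(new int[]{prevX, prevY, x, y});
        }
        prevX = x;
        prevY = y;
        started = true;
    }

    public List<int[]> segments() { return segments; }
}
```

Even if consecutive readings jump far apart, the segments always share endpoints, which is why the drawing looks connected.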

Exercise 2:

Our second task was to create an instrument with Processing as input and an Arduino buzzer as output.

To keep the code minimal, I send the key value to Arduino when a key is pressed. I modify the key value by subtracting 96 (the ASCII value of 'a' is 97) and then multiplying by a base frequency, so each character on the keyboard produces a different tone. I set the default duration of the tone to 10 so that it gets a plucky sound.
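The mapping can be written out as follows (plain Java; the base frequency of 100 Hz is an assumption for illustration, not necessarily the value used in the recitation):

```java
// Key-to-frequency mapping: 'a' -> 1 * base, 'b' -> 2 * base, and so on.
// BASE_HZ is an illustrative value, not the one from the actual sketch.
public class KeyTone {
    static final int BASE_HZ = 100;

    static int frequencyFor(char key) {
        // 'a' has ASCII value 97, so subtracting 96 gives 1 for 'a'
        return (key - 96) * BASE_HZ;
    }

    public static void main(String[] args) {
        System.out.println(frequencyFor('a')); // lowest tone
        System.out.println(frequencyFor('z')); // highest tone
    }
}
```

On the Arduino side, the received frequency would be passed to tone() along with the short duration that produces the plucky sound.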

Schematics for both exercises are below:

Conclusion:

I think the combination of Arduino and Processing gives me many more possibilities in what I can achieve. The transmission of data between the two programs is made much friendlier by the provided code, and it enables me to create projects structured like the two exercises we did in class: either Arduino sensors as input and Processing visuals as output, or the Processing mouse and keyboard as input and Arduino LEDs and buzzers as output. This gives me a lot more options for what I can achieve with these tools, and also a lot of inspiration for my final project.

Final Preparatory Research and Analysis (Alex Wang)

Throughout the semester, I worked on many projects and was constantly thinking about the meaning of interaction. At first I understood interaction simply as the process of performing an action and getting some form of feedback in response. But after the prototyping and user testing of my midterm project, I realized that even though interaction is the cycle of performing an action and getting feedback, that definition was not enough. As a designer you have to distinguish between good and bad interaction; this is when I decided that a good interaction requires the design to take a form natural to human instinct and to guide the user onto the track of the interaction you intended. This evolution of understanding was crucial in shaping my approach to these kinds of projects, and I will continue to perfect my effort of creating user-friendly interaction.

Currently I am planning to create an interactive game for my final project, so I looked into previous game consoles as inspiration. One that does not align with my definition of interaction is the Sony PlayStation. Even though it is a very popular and successful game console, it fails to meet my standards for an interactive gaming experience, because its controller uses the joystick-plus-button setup. I believe that commonly used controllers like the joystick, buttons, and even the mouse and keyboard are very good at accurately transmitting the user's desired controls to the machine, achieving stable communication between the human and the code. But as accurate as they are, they are not a natural way for humans to interact with objects; no one was born with the ability to use a mouse and keyboard. This in turn affects the overall interaction experience: the user will always be in the mindset of playing a game, rather than reacting to the visuals on the screen naturally, out of instinct.

The HTC Vive, on the other hand, provides a perfectly real and natural experience. The controls are natural to the user, as the system uses sensors to pick up the user's hand and body movement. The feedback of the machine also contributes to the feeling: because the graphics accurately match the user's movement, both in frame rate and in perspective, the outcome is an experience so real that the user will subconsciously believe what is happening in the game.

After examining specific examples of interactions that both line up and do not line up with my understanding of interaction, I came to the conclusion that my definition of a good interaction is a cycle of giving and receiving information that is natural to human instinct and accurately matches the expectations of the user. If the feedback is even slightly off from the user's expectations, the user will feel uncomfortable; for example, a lagging frame rate causes discomfort. Likewise, if the controls are not precise enough and the code does not give the feedback I expect from my input, it will give the user a headache. This is why the difference between expected feedback and actual feedback should be minimal, in order to secure a smooth and comfortable interaction.

Recitation 7: Processing Animation (Alex Wang)

This week's recitation expands on what we worked on last week: animating graphics in Processing using loops. I decided to work off of the example my instructor showed in lecture, a simple function that draws a stick man.

I started animating the stick man by making it follow the mouse; at first I only made it follow the x position, but later I also included the y position.

I tried to give it more interactivity and make something happen when the mouse is pressed. I tried adjusting the angle parameter of the function, as it was one of the original parameters my instructor chose, but I didn't really like this interaction, so I started trying other things.

I then gave the stick man flashing colors and had it leave a trail behind as the mouse is clicked. This led me to my next idea: leaving a frame of its previous position, like the afterimage or blur used as a typical animation technique.

I changed the code and declared a counter variable so that the shadow is only cleared and re-captured after a certain frame count.

I also tried an alternative technique of storing the positions in variables, making it possible to keep just one shadow.
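The frame-count bookkeeping behind the afterimage can be isolated like this (plain Java stand-in; the drawing calls are left out and only the counter logic is shown, with the 30-frame interval taken from my sketch):

```java
// Afterimage logic: every SHADOW_INTERVAL frames, snapshot the current
// position as the "shadow"; between snapshots the shadow stays frozen,
// trailing behind the live figure.
public class Afterimage {
    static final int SHADOW_INTERVAL = 30;
    private int counter = 0;
    private int shadowX, shadowY;

    // Called once per frame with the current position.
    // Returns where the shadow copy should be drawn this frame.
    public int[] update(int x, int y) {
        if (counter >= SHADOW_INTERVAL) {
            counter = 0;
            shadowX = x;  // freeze a copy of the current position
            shadowY = y;
        } else {
            counter++;
        }
        return new int[]{shadowX, shadowY};
    }
}
```

Because the shadow only updates every 30 frames, it lags behind the stick man by up to half a second, which reads as a motion blur.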

Recitation Homework:

I started off drawing only one ellipse. The ellipse function is very useful, as it takes care of all the math for me; I don't have to write a lot of math formulas to draw a circle.

I later realized that the circle in the example GIF seemed a lot thicker than mine, so I drew two circles, one filled and one white, to create the inner hole. I also made the color of the circle change depending on the same variable that controls its size.

I made the circle movable with keyboard input from the user.

I also set boundaries so that the circle does not leave the page.

This recitation was pretty useful, as it got me more familiar with Processing syntax and prepares me to tackle more ambitious projects in Processing. It also gave me a better understanding of how frames can be modified by code to produce animations with simple programming logic.

Code below:

circle:

int size;
int velocity = 2;
int xpos = width/2;  // reassigned in setup() once the canvas size is known
int ypos = height/2;
int xv = 5;          // horizontal speed
int yv = 5;          // vertical speed

void setup() {
  size(600, 600);
  size = 60;
  xpos = width/2;
  ypos = height/2;
}

void draw() {
  background(width/2);

  // bounce the size between 50 and width/2
  if (size >= width/2) {
    velocity *= -1;
  }
  if (size <= 50) {
    velocity *= -1;
  }
  size += velocity;

  colorMode(HSB, width/2);
  fill(size, size + 100, 250);  // hue follows the size
  ellipse(xpos, ypos, size, size);
  fill(width/2);                // inner circle punches out the hole
  ellipse(xpos, ypos, size - 40, size - 40);

  // arrow keys move the circle, bounded to the canvas
  if (keyPressed) {
    if (key == CODED) {
      if (keyCode == UP && ypos >= 0) {
        ypos += -yv;
      } else if (keyCode == DOWN && ypos <= height) {
        ypos += yv;
      } else if (keyCode == RIGHT && xpos <= width) {
        xpos += xv;
      } else if (keyCode == LEFT && xpos >= 0) {
        xpos += -xv;
      }
    }
  }
}

Stick man:

int xpos;
int ypos;
int curx;  // stored "shadow" position
int cury;
float angle;
boolean colors = false;
int counter = 0;

void setup() {
  size(600, 600);
  background(255);
  xpos = 0;
  ypos = 0;
}

void draw() {
  // ease the stick man toward the mouse, one pixel per frame
  if (mouseX >= xpos) {
    xpos += 1;
  } else if (mouseX <= xpos) {
    xpos -= 1;
  }
  if (mouseY >= ypos) {
    ypos += 1;
  } else if (mouseY <= ypos) {
    ypos -= 1;
  }

  if (mousePressed) {
    // every 30 frames, snapshot the current position as the afterimage
    if (counter >= 30) {
      counter = 0;
      curx = xpos;
      cury = ypos;
      stickman(curx, cury, 0, false);
    } else {
      counter += 1;
      background(255);
      stickman(curx, cury, 0, false);  // redraw the stored shadow
    }
  } else {
    background(255);
    colors = false;
  }
  stickman(xpos, ypos, angle, colors);
}

void stickman(int x, int y, float angle, boolean colors) {
  pushMatrix();
  translate(x - 40, y - 40);
  rotate(angle);
  ellipse(40, 40, 30, 30);  // head
  line(40, 55, 40, 150);    // body
  line(40, 55, 20, 100);    // arms
  line(40, 55, 60, 100);
  line(40, 150, 20, 200);   // legs
  line(40, 150, 60, 200);
  if (colors) {
    float r = random(255);
    float g = random(255);
    float b = random(255);
    fill(r, g, b);
  } else {
    fill(255);
  }
  popMatrix();
}

Midterm Project: Treasure Chest (Alex Wang & Henry Shaffer, Instructor: Eric Parren)

Midterm Reflection

For my Interaction Lab midterm project, my partner Henry and I decided to create a "Treasure Chest" that can only be unlocked through certain interactions by the user. We came up with the idea ourselves, but during the presentation we were told that users have seen similar styles of projects in real-life escape rooms, where they are used for entertainment. I can also see it being mass-produced as a toy targeted at children, or becoming part of a bigger project such as an escape room, as it has a lot of interactions and lights to amuse the user. What makes this project different from others is that it takes the form of a puzzle, so it requires the user to explore its functionality rather than have someone explain it. That makes it challenging to design, as it is hard to have both a challenging puzzle and a self-explanatory user interaction at the same time.

The final product was heavily influenced by what I learned during the group research project, where I looked into the Pom Pom Mirror. I really liked how it gives the user instant visual feedback, which hints at its function and leads them onto the right track of further interaction with the project. This shaped my understanding of interaction as the process of performing an action and getting some form of feedback in response. I also believe that a good interaction is something that feels natural to the human body and mind.

With this definition of good interaction in mind, we started to design our first prototype. The first idea came from Henry, who wanted a box that plays rhythms with a buzzer and has the user clap the same rhythm back to open it. I wanted to include more interactions and LED lights to make the project more visually appealing, so I took the fundamental idea of unlocking a box with special interactions instead of a traditional combination lock, and changed the interaction to hand positions. We prototyped the first version on a breadboard and then rewired it into a cardboard box to see what size would suit our project.

We decided to use two infrared distance sensors to trigger the lock when the correct combination of hand heights was achieved, and used LEDs to indicate that the user is making progress; when all four LEDs light up, the chest is unlocked by a simple servo-driven lock.

Once everything was working, we started fabricating the final shell. We downloaded open-source STL files and made some models of our own using 3D modeling software such as Blender and Tinkercad, planning to 3D print everything. We then learned that 3D printing big parts is very inefficient and could take days, along with the risk of the print failing, so we changed plans and only printed the small parts, while everything big, such as the chest and the small towers behind it, was laser cut from wood.

During user testing we learned a lot of valuable things and started implementing more features that make the project easier to understand while keeping it a somewhat challenging puzzle. Originally we used a number system to identify the height of the hand, since it resembles the number system of a combination lock, but after user testing we decided to make it more user-friendly by color-coding everything in the project, visualizing the colors with NeoPixels, and matching them to the movement of the hand. We also laser cut hand-shaped holders for the infrared sensors so that the user knows the interaction has to do with hands, without explicitly writing instructions.

Another improvement after user testing was adding sound. Some testers missed the LED progress indication because they were too focused on the position of their hands, so we moved the LEDs higher and also made a buzzer beep at the same time to make progress more obvious. I also coded an intro for the project: the first time it powers up, it lights the LEDs in sequence to hint to the user that they can light up, as they are hidden behind transparent acrylic.

I also made a lot of improvements in the code, such as lengthening the process of making progress, so that the user has to hold their hands in place for a couple of seconds instead of unlocking the chest by randomly flapping their hands. I also wrote the code so that progress toward unlocking does not completely reset if you accidentally move your hands slightly below or above the designated color; instead, the counter variable toward unlocking decreases. This makes the project easy to unlock once you have figured out the password, yet practically impossible for someone who does not know it.
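The hold-in-place logic can be sketched on its own (plain Java stand-in for the Arduino loop; the gain, decay, and threshold values here are illustrative, though they mirror the shape of the values in the sketch below): the counter climbs while the hands are in the right zone and decays, rather than resetting, when they drift out.

```java
// Hold-to-unlock counter: increments while the input is in the target zone,
// decays (instead of resetting) when it drifts out.
// GAIN, DECAY, and THRESHOLD are illustrative values.
public class HoldCounter {
    static final int GAIN = 2;       // added per loop while in the zone
    static final int DECAY = 3;      // removed per loop while out of the zone
    static final int THRESHOLD = 500;
    private int counter = 0;

    // Called once per loop iteration; returns true when the stage unlocks.
    public boolean update(boolean inZone) {
        if (inZone) {
            counter += GAIN;
        } else if (counter > 0) {
            counter -= DECAY;
        }
        if (counter >= THRESHOLD) {
            counter = 0;
            return true;
        }
        return false;
    }
}
```

With these numbers, the hands must stay in the zone for 250 consecutive-ish loop iterations (a couple of seconds), while random flapping never accumulates enough, and a brief slip only costs a little progress rather than all of it.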

Creating the project was pretty difficult: the infrared sensors behave strangely at certain distances, and those values messed up the code, but I was able to debug by printing everything to serial. One major mistake we made was using the Arduino's 5 V pin to power everything, which was not enough for power-hungry components such as the NeoPixels and the servo, so we added a second Arduino. At first that did not work either; I spent a large chunk of my time trying to debug my code before finding out that the circuits have to share a common ground, a basic concept in engineering but new to me as a programmer.

Overall I think the project was fairly successful. I created something that aligns with my definition of interaction, something that gives clear feedback to the user through natural interactions, in this case hand gestures and NeoPixel feedback that follows the hand. It is still slightly challenging for the user to figure out how to unlock the box, but most people were able to solve the puzzle on their own, and the challenge just adds to the joy of finally unlocking the chest, which was my initial goal: to entertain the user. If I had more time, I would probably paint the project to fit the Egyptian theme we gave it, and hide the servo so that it opens the chest without being visible to the user. I think this project demonstrates my understanding of interaction, and it shows that through repeated testing and iteration, the design of an interactive object can be polished into something friendly to the user even if its purpose is to lock things up and prevent the user from opening it.

Code and citation:

#include <Adafruit_NeoPixel.h>
#include <Servo.h>

Servo servo2;
int progress;
int counter;
int right;
int left;
int yellow = 7;
int red = 8;
int green = 9;
int blue = 10;
int reset = 6;
int buzzer = 3;
#define PIN 13
#define PIN2 2
Adafruit_NeoPixel strip2 = Adafruit_NeoPixel(20, PIN2);
Adafruit_NeoPixel strip = Adafruit_NeoPixel(20, PIN);

void setup() {
  strip.begin();
  strip.show();
  strip2.begin();
  strip2.show();

  pinMode(reset, INPUT);
  Serial.begin(9600);
  progress = 0;
  counter = 0;
  pinMode(buzzer, OUTPUT);
  pinMode(yellow, OUTPUT);
  pinMode(green, OUTPUT);
  pinMode(red, OUTPUT);
  pinMode(blue, OUTPUT);

  // intro sequence: light the LEDs one by one with rising tones
  digitalWrite(yellow, HIGH);
  tone(buzzer, 200, 200);
  delay(1000);
  digitalWrite(red, HIGH);
  tone(buzzer, 350, 200);
  delay(1000);
  digitalWrite(green, HIGH);
  tone(buzzer, 600, 200);
  delay(1000);
  digitalWrite(blue, HIGH);
  tone(buzzer, 800, 200);
  delay(1000);

  // error sound
  tone(buzzer, 200, 100);
  delay(100);
  tone(buzzer, 100, 100);
}

void loop() {
  // map raw IR readings to a 0-20 NeoPixel height
  int ln = map(left, 150, 700, 20, -2);
  int rn = map(right, 150, 700, 20, -2);

  // clear colors
  for (int i = 0; i < 30; i++) {
    strip.setPixelColor(i, 0, 0, 0);
    strip2.setPixelColor(i, 0, 0, 0);
  }
  // color left neopixels by height band
  for (int i = 0; i < ln; i++) {
    if (ln > 19) {
      strip2.setPixelColor(i, 0, 0, 0);
    } else if (ln > 15) {
      strip2.setPixelColor(i, 255, 0, 0);
    } else if (ln > 10) {
      strip2.setPixelColor(i, 0, 255, 0);
    } else if (ln > 5) {
      strip2.setPixelColor(i, 0, 0, 255);
    } else {
      strip2.setPixelColor(i, 255, 255, 0);
    }
  }
  // color right neopixels by height band
  for (int i = 0; i < rn; i++) {
    if (rn > 19) {
      strip.setPixelColor(i, 0, 0, 0);
    } else if (rn > 15) {
      strip.setPixelColor(i, 255, 0, 0);
    } else if (rn > 10) {
      strip.setPixelColor(i, 0, 255, 0);
    } else if (rn > 5) {
      strip.setPixelColor(i, 0, 0, 255);
    } else {
      strip.setPixelColor(i, 255, 255, 0);
    }
  }

  strip.show();
  strip2.show();

  // reset button flashes the LEDs and clears progress
  if (digitalRead(reset) == HIGH) {
    digitalWrite(yellow, HIGH);
    digitalWrite(red, HIGH);
    digitalWrite(green, HIGH);
    digitalWrite(blue, HIGH);
    delay(50);
    digitalWrite(yellow, LOW);
    digitalWrite(red, LOW);
    digitalWrite(green, LOW);
    digitalWrite(blue, LOW);
    progress = 0;
  }

  // read the IR sensors
  right = analogRead(A0);
  left = analogRead(A1);

  if (progress == 4) {
    // all four stages passed: flash the LEDs, play the unlock jingle, open
    digitalWrite(blue, HIGH);
    delay(100);
    digitalWrite(yellow, LOW);
    digitalWrite(red, LOW);
    digitalWrite(green, LOW);
    digitalWrite(blue, LOW);
    delay(50);
    digitalWrite(yellow, HIGH);
    digitalWrite(red, HIGH);
    digitalWrite(green, HIGH);
    digitalWrite(blue, HIGH);
    delay(50);
    digitalWrite(yellow, LOW);
    digitalWrite(red, LOW);
    digitalWrite(green, LOW);
    digitalWrite(blue, LOW);
    progress = 0;
    delay(1000);
    tone(buzzer, 800, 200);
    delay(200);
    tone(buzzer, 800, 200);
    delay(200);
    tone(buzzer, 800, 200);
    delay(200);
    tone(buzzer, 1200, 600);
    delay(500);

    // open box (signal on pin 11)
    digitalWrite(11, HIGH);
    delay(500);
    digitalWrite(11, LOW);
  } else if (progress == 0) {
    digitalWrite(yellow, LOW);
    digitalWrite(red, LOW);
    digitalWrite(green, LOW);
    digitalWrite(blue, LOW);
    if (rn > 15 && rn < 19 && ln > 10 && ln < 15) {
      // red and green
      counter += 2;
    } else if (counter > 0) {
      counter -= 2;
    }
    if (counter >= 400) {
      tone(buzzer, 200, 200);
      progress = 1;
      counter = 0;
    }
  } else if (progress == 1) {
    // light LED and buzz once
    digitalWrite(yellow, HIGH);
    if (rn > 10 && rn < 15 && ln > 5 && ln < 10) {
      // green and blue
      counter += 2;
    } else if (counter > 0) {
      counter -= 3;
    }
    if (counter >= 500) {
      tone(buzzer, 350, 200);
      progress = 2;
      counter = 0;
    }
  } else if (progress == 2) {
    digitalWrite(red, HIGH);
    if (rn > 10 && rn < 15 && ln > 15 && ln < 19) {
      counter += 2;
    } else if (counter > 0) {
      counter -= 3;
    }
    if (counter >= 500) {
      tone(buzzer, 600, 200);
      progress = 3;
      counter = 0;
    }
  } else if (progress == 3) {
    digitalWrite(green, HIGH);
    if (rn < 5 && ln < 5) {
      counter += 2;
    } else if (counter > 0) {
      counter -= 4;
    }
    if (counter >= 500) {
      tone(buzzer, 800, 200);
      delay(50);
      progress = 4;
      counter = 0;
    }
  }
}

3D models and laser cut files used:

Egyptian Artifact

Guard dogs

Treasure Chest