Preparatory Research and Analysis

Prior to taking this class, my definition of interaction was “any object that is able to engage the five senses with a human being or able to react to another object.” Looking back, this was a very shallow interpretation, and my definition of interaction has definitely changed throughout this class. Bret Victor, the author of “What Exactly is Interactivity,” argued that “the term interactivity is overused and under understood.” After reading his article and creating several recitation projects, I now define interaction as a series of movements by at least two actors, each actively responding to the other. I have built several projects in which a single interaction set off almost a domino effect of actions: for example, a sensor had to pick up a reading in order to activate the servo motors, which in turn activated the LEDs. This project, among others, has changed the way I perceive what it means to interact.

As for the final project, I have several ideas that I want to try out, though I am unsure whether they will work. I am currently taking a contemporary dance class, and it is difficult to quickly understand what the professor wants us to look like as a group. One idea for my final project is a device that animates what the group dance would look like: a box would serve as a platform for small dancer figures with movable limbs. The user would attach sensors all over their body, so that when the user jumps, the figures on the platform would mimic the same movement. This way, the professor could demonstrate what the overall group performance should look like, including advanced movements across the stage. However, I am afraid this might be too ambitious given the time constraints and my lack of experience with the more difficult Arduino components. My other idea is a dinosaur from Jurassic Park that would eat anything that came within its path. I want to continue using one sensor to trigger multiple functions: if something were placed in front of the ultrasonic sensor on the dinosaur, it would trigger the speaker to produce a “roar” sound, and it would also trigger the servos in the dinosaur’s arms to pick up the victim while red LEDs flash. I also want to implement some kind of screen that displays messages to the victim such as “RUN!” or “Oh no!” I believe this project would be more challenging because I would try to 3D print the dinosaur and use skills from Processing to create the display. Either way, I would like to stick with the idea of creating something that is entertaining, or only slightly useful. An entertaining experience makes the user want to keep interacting with the device. I also feel a lot of pressure when I try to create something useful for those who are disadvantaged, which makes the work more difficult. Interactivity from a device meant to entertain motivates me to find other ways to make each object more responsive to the other.

Recitation 7: Processing Animation

At the beginning of this recitation, I was nervous because I have a hard time with coding and coordinates; I am more comfortable with Arduino because it is more physical. In this recitation, I learned more about design technique than I did about Processing animation. Continuing the design from the last recitation, I wanted to finish at least three of the four boxes. The top two boxes were created with the rect() function. For my third, red box, I learned to use the triangle() function, which was harder than I expected because I was not used to the coordinates. My original intention was for the user to change the colors of the lines by tapping a key. At this point I still had one unfinished box, and I had the idea to create a tool for the user to design it themselves by checking mousePressed. I added this logic inside draw().

Due to time constraints, I was not able to complete the additional recitation work.

Original Photo

Processing Animation

CODE

void setup()

{

 size (600, 600);

 background(255);

 strokeWeight(10);  // Thicker

 line(20, 40, 580, 40);

 strokeWeight(10);  // Thicker

 line(20, 40, 20, 580);

 strokeWeight(10);  // Thicker

 line(20, 580, 580, 580);

 strokeWeight(10);  // Thicker

 line(580, 40, 580, 580);

 //inner lines

 strokeWeight(10);  // horizontal

 line(20, 300, 580, 300);

 strokeWeight(10);  // vertical

 line(300, 40, 300, 580 );

 //box 1

 strokeWeight(10);  // vertical lines

 line(75, 40, 75, 300);

 line(130, 40, 130, 300);

 line(185, 40, 185, 300);

 line(240, 40, 240, 300);

 //box 2

 strokeWeight(10);  // horizontal

 fill(255, 255, 0);

 rect (300, 40, 280, 40);

 //rect (300, 80, 280, 30);

 //line(300, 80, 580, 80);

 fill(255, 255, 0);

 rect(300, 110, 280, 30); // rect (x,y, width,height);

 //line(300, 110, 580, 110);

 //line(300, 140, 580, 140);

 fill (255, 255, 0);

 rect(300, 170, 280, 30);

 //line(300, 170, 580, 170);

 //line(300, 200, 580, 200);

 fill (255, 255, 0);

 rect(300, 230, 280, 30);

 line(300, 230, 580, 230);

 line(300, 260, 580, 260);

 //Box3

 //triangle(x1, y1, x2, y2, x3, y3)

 fill(255, 0, 0);

 triangle (20, 300, 20, 575, 305, 300);

 fill(250);

 triangle (20, 300, 20, 520, 240, 300);

 fill(255, 0, 0);

 triangle (20, 300, 20, 465, 185, 300);

 fill(250);

 triangle (20, 300, 20, 410, 130, 300);

 fill(255, 0, 0);

 triangle (20, 300, 20, 355, 75, 300);

 //second set of triangles, from the opposite corner

 fill(255,0,0);

 triangle (75, 575, 300, 355, 300, 575);

 

 fill(250);

 triangle (140, 575, 300, 400, 300, 575);

 

 fill(255, 0, 0);

 triangle (205, 575, 300, 470, 300, 575);

}

void draw()

{

 wizardCoding();

 if (mousePressed)

 {

   stroke(0, 0, 255);

 } else

 {

   stroke(0);

 }

 line(mouseX-1, mouseY, mouseX+1, mouseY);

 //line(mouseX, mouseY-10, mouseX, mouseY+10);

}

void wizardCoding(){

println(mouseX, mouseY);

}

Key Bot - Key Reminder - Kathy Song - Rudi

Partner: Tiana Lui

Context and Significance

Our previous research project was a tablet that would display Braille characters so that young children could learn Braille through different textures. The purpose of that project was to increase interactivity between children and the Braille language. I believe that the best representation of interactivity is two objects responding to each other. However, a project like that would be quite difficult to implement in real life, and I did not want to create something limited by time constraints and a lack of proper materials. My partner and I wanted to create something fun and simple, with the goal that it would respond to us and we would respond to it as well. Among the projects I researched, such as the expressive tactile controls, the buttons have personalities such as “timid,” “lazy,” and “stubborn,” and each sensor responds to the person in a different way: “timid” avoids the person’s touch, “lazy” operates much more slowly than an average button, and “stubborn” offers resistance when pressed. I took an interest in how the buttons had personalities and, more importantly, how they reacted to the users. This has been one of the projects that truly shaped my understanding of interaction as a series of movements by at least two actors, both responding to one another. Our project is unique because it has a personality and is entertaining as well as useful. The robot design with a face lets you think of it as a friend instead of a machine, and its arms are quite realistic. Our project is intended for those who forget their keys frequently, but it is also meant to be fun and to intrigue those who are looking for something different.

Conception and Design

Our robot was created with the intention of reminding the user to bring their keys before leaving their home. Originally, we had placed the ultrasonic sensor at the side of the robot so that it would sense the person only when they were leaving the house. However, we received constructive feedback that this placement would not make sense because, by the time the user was notified to get their keys, they would already be out the door. I redesigned the robot so that it senses the user while they are walking towards the door; this way there is ample time for the user to remember to grab their keys. Furthermore, we wanted to keep the idea of an interactive robot that was more of a friend than a machine, so we gave it an angry face and hands that rapidly wave if the keys are not taken. After the constructive feedback, we also decided it was best to add a light sensor, because it detects whether the key was actually taken, instead of depending only on the ultrasonic sensor, which can only detect whether a person walked by. We used laser-cut acrylic instead of laser-cut wood because we wanted the red LED to glow throughout the entire robot to show that it was angry. We selected our materials to help humanize the robot and give it extra detailing, and they suited our project because they were efficient and each part was able to play its role properly. Another option that would have contributed even more would be 3D printing instead of laser-cut acrylic: if we had 3D printed the box, the LEDs would have illuminated the whole face, whereas the laser-cut acrylic did the bare minimum.

Prototype 1 (video size is too big, this was the only size that would convert)

Prototype 2/ User Testing 

Fabrication and Production

The most significant steps in our production process occurred during the User Testing Session. During user testing, we realized that we had wired the speaker incorrectly, which ended up frying the Arduino board. We also realized that we had to add a light sensor, or else it would not be possible to detect whether the key had been taken. After user testing, we started rewiring and coding the new speaker and sensors. The light sensor works by measuring how much light it receives: if there is less light, the key is resting on the sensor; if there is more light, the key has been taken. We changed the speaker to a Grove speaker, which emits tones using sample code. If the ultrasonic sensor detects that the distance is less than 40 cm and the key is still present, the speaker rings, the servo motor arms wave, and the LED blinks; otherwise nothing happens. This was a change from our previous design, which only had the ultrasonic sensor detecting movement. At the beginning it was not effective because of minor coding issues: sometimes the sensors would not detect properly, and other times the speaker or the LEDs would not go off. Ultimately, after a few minor changes, we decided to use two Arduino boards connected together, because the motors were not functioning properly otherwise. The production choices my partner and I made were based on how we would sell this as a product; each decision was based on how realistic we thought it would be to implement the new feature.
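The code posted under “Code” below only covers the board with the ultrasonic sensor, light sensor, speaker, and LED, so here is a rough sketch of how the servo arms on the second board could be driven. The pin numbers, the trigger wire coming from the first board, and the sweep angles are assumptions for illustration rather than our exact wiring.

#include <Servo.h>

// Hypothetical sketch for the second Arduino that waves the arms.
// Pins and the trigger signal from the first board are assumed.
Servo leftArm;
Servo rightArm;
const int triggerPin = 8;  // the first board pulls this HIGH when the user should be warned

void setup() {
  leftArm.attach(9);
  rightArm.attach(10);
  pinMode(triggerPin, INPUT);
}

void loop() {
  if (digitalRead(triggerPin) == HIGH) {
    // sweep both arms back and forth so the robot "waves angrily"
    for (int angle = 0; angle <= 90; angle += 5) {
      leftArm.write(angle);
      rightArm.write(90 - angle);
      delay(15);
    }
    for (int angle = 90; angle >= 0; angle -= 5) {
      leftArm.write(angle);
      rightArm.write(90 - angle);
      delay(15);
    }
  } else {
    // rest position when no warning is active
    leftArm.write(0);
    rightArm.write(90);
  }
}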

Prototype 1

Prototype 2

Conclusions

The goals of our project were to create a product that would be user-friendly, entertaining, interactive, and useful. We wanted a robot that reminds the user to take their keys before leaving the house; if the user does not remember to take them, the robot becomes angry and starts making angry noises. Our project results align with my definition of interaction because there is a series of movements between at least two actors, each responding to the other. However, it may not fully align with interaction because it uses technology to remind the user to complete a simple task, which seems to remove some of the user’s interaction with his or her environment. Our audience was intrigued that our project had a personality, because it is different from the traditional robot. During the user testing session, people wanted to engage with the robot but were unclear about what they were supposed to do; after we explained it, they were much more entertained. If we had more time, we would 3D print the entire structure rather than use laser-cut acrylic, so the LED could glow throughout the whole robot. We would also like to implement a speaker that makes the robot yell when it gets angry instead of using sample alarm code. We learned to be patient and to double-check every wire and any necessary resistors so that the Arduino boards would not get fried again, and to avoid careless mistakes in the code. We felt great after completing this project because it showed us the intimate workings of the code and how every step is vital to the final product.

Code

//ultrasonic distance sensor

int trigger_pin = 3;

int echo_pin = 2;

long duration;

int distance;

//ldr

int ldrPin = A0;

//key not present value

int calibrationValue = 670;

//key on holder

int keyPresentValue = 270;

//current ldr value

int ldrValue = 0;

//grove speaker

#define SPEAKER 6

//int BassTab[] = {1911, 1702, 1516, 1431, 1275, 1136, 1012}; //bass 1~7

int BassTab[] = {1012, 1012, 1012, 1012, 1012, 1012, 1012}; //bass 1~7

//led

const int ledPin=7;

int ledState=LOW;

//store last time LED was blinking

unsigned long previousMillis=0;

//interval at which to blink

const long interval=10;

////////////////////////////////////////////////////////////////////////////////////////////

//FUNCTIONS

//distance

void measure() {

 //measure current position from sensor

 digitalWrite(trigger_pin, LOW);

 delayMicroseconds(2);

 digitalWrite (trigger_pin, HIGH);

 delayMicroseconds (10);

 digitalWrite (trigger_pin, LOW);

 duration = pulseIn (echo_pin, HIGH);

 //distance in cm

 distance = (duration * 0.034) / 2;

}

//grove speaker

void pinInit() {

 pinMode(SPEAKER, OUTPUT);

 digitalWrite(SPEAKER, LOW);

}

void sound(uint8_t note_index) {

 for (int i = 0; i < 100; i++) {

   digitalWrite(SPEAKER, HIGH);

   delayMicroseconds(BassTab[note_index]);

   digitalWrite(SPEAKER, LOW);

   delayMicroseconds(BassTab[note_index]);

 }

}

//////////////////////////////////////////////////////////////////////////////////////////

void setup() {

 // put your setup code here, to run once:

 Serial.begin(9600);

 //distance sensor

 pinMode (trigger_pin, OUTPUT);

 pinMode (echo_pin, INPUT);

 //ldr

 Serial.print("Calibrating. Do not place your keys on sensor.");

 delay(10000);

 calibrationValue = analogRead(ldrPin);

 Serial.println("Calibration value: " + String(calibrationValue));

 Serial.println("Place your keys on the sensor.");

 delay(5000);

 keyPresentValue = analogRead(ldrPin);

 Serial.println("Key present value: " + String(keyPresentValue));

 //grove speaker

 pinInit();

 //led

 pinMode(ledPin, OUTPUT);

}

void loop() {

 // put your main code here, to run repeatedly:

 //distance

 measure();

 //ldr

 ldrValue = analogRead(ldrPin);

 if (distance <= 40) {

    Serial.println("Distance: " + String(distance)); // wrap in String() so the number prints correctly

   //if key is on the holder, shout

    if (keyPresentValue - 70 < ldrValue && keyPresentValue + 70 > ldrValue) {

      Serial.println("ldrValue: " + String(ldrValue));

      Serial.println("TAKE YOUR KEYS");

     //grove speaker

     for (int note_index = 0; note_index < 7; note_index++) {

       sound(note_index);

       delay(100);

     }

     //led

     unsigned long currentMillis = millis();

      if (currentMillis - previousMillis >= interval) {

       //save last time you blinked LED

       previousMillis = currentMillis;

       //if LED is off, turn it on and vice versa

       if (ledState == LOW) {

         ledState = HIGH;

       }

       else {

         ledState = LOW;

       }

       digitalWrite(ledPin, ledState);

     }

     //for debugging, and console readability

     //delay(1000);

   }

   else {

      Serial.println("GOOD JOB FOOL. KEYS TAKEN.");

     //for debugging, and console readability

     //delay(1000);

   }

 }

}

Documentation 6: Processing Basics

For my first drawing in Processing, I wanted to create something simple and learn the basic functions first. I chose this piece by Sol LeWitt because I liked how simple it was and how well the colors fit together. While drawing in Processing, my plan was to increase the thickness of each line I drew and then add the colors where necessary. I came to realize that this was a grave mistake: the only way I could color the image was with the fill() function, which colors shapes rather than lines. After drawing the lines for the second box, I had to remove those coordinates and replace them with rectangles. I had an extremely difficult time processing the numbers and calculations in my head, and after adding the rectangles over the individual lines, it was difficult to figure out how to adjust the stroke thickness to make the picture look more uniform.
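As a small illustration of that lesson, here is a minimal Processing sketch with made-up coordinates and colors (not the ones from my drawing): a line only takes its color from stroke(), while fill() has to be set before a shape such as rect() is drawn.

void setup() {
  size(300, 300);
  background(255);
  strokeWeight(10);
  stroke(0);                // lines and outlines use the stroke color
  line(20, 50, 280, 50);    // a line cannot be filled, only stroked
  fill(255, 255, 0);        // fill() is set before the shape
  rect(20, 100, 260, 60);   // so this rectangle comes out yellow
}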

The Motif:

My drawing:

My final creation still needs a lot of work before it looks like the model photo. Both images have the same outline, and my second box has some of the coloring; however, I was not able to add the lines for the other two boxes because of time constraints. Processing was an okay means of realizing my design. Personally, I find it really difficult to recalculate the numbers on the x and y axes, so if there is a better method for that, it would make creating these drawings much more efficient.
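One method that might help, sketched below with assumed spacing values: store the frame edges in variables and let a for loop compute the evenly spaced lines instead of recalculating every coordinate by hand.

void setup() {
  size(600, 600);
  background(255);
  strokeWeight(10);

  int left = 20;      // left edge of the outer frame
  int top = 40;       // top edge of the outer frame
  int middle = 300;   // position of the center cross lines
  int stripes = 5;    // how many stripes the top-left box is divided into
  float spacing = (middle - left) / float(stripes);

  // evenly spaced vertical lines, so no x coordinate has to be worked out by hand
  for (int i = 1; i < stripes; i++) {
    line(left + i * spacing, top, left + i * spacing, middle);
  }
}

With this approach, changing the frame size or the number of stripes only means changing one variable instead of retyping every line() call.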

The Code

void setup()

{

 size (600, 600);

 background(255);

 strokeWeight(10);  // Thicker

 line(20, 40, 580, 40);

 strokeWeight(10);  // Thicker

 line(20, 40, 20, 580);

 strokeWeight(10);  // Thicker

 line(20, 580, 580, 580);

 strokeWeight(10);  // Thicker

 line(580, 40, 580, 580);

 

 //inner lines

 strokeWeight(10);  // horizontal

 line(20, 300, 580, 300);

 strokeWeight(10);  // vertical

 line(300, 40, 300, 580 );

 

 //box 1

 strokeWeight(10);  // vertical lines

 line(75, 40, 75, 300);

 line(130, 40, 130, 300);

 line(185, 40, 185, 300);

 line(240, 40, 240, 300);

 

 //box 2

 strokeWeight(10);  // horizontal

 

 fill(255, 255,0);

 rect (300, 40, 280, 40);

 

 //rect (300, 80, 280, 30);

 //line(300, 80, 580, 80);

 fill(255, 255,0);

 rect(300,110, 280, 30); // rect (x,y, width,height);

//  line(300, 110, 580, 110);

//  line(300, 140, 580, 140);

 

 line(300, 170, 580, 170);

 line(300, 200, 580, 200);

 line(300, 230, 580, 230);

 line(300, 260, 580, 260);

 

}

void draw()

{

}

Recitation 4: Drawing Machines

In today’s recitation, we built a drawing machine in three steps. The first step was to get the motor simply rotating, which was the simplest of the three because we only needed to connect the wires, breadboard, motor, and Arduino together. Attached below are the photos and videos of this step.

The second step was a bit more complicated because we wanted to control the rotation with a potentiometer. Adding the potentiometer to the breadboard was an extra step, but I had a tougher time figuring out the code for this part, especially the map() function. Attached below is the photo of step 2; the video file was too big to attach.
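Since map() was the confusing part, here is a minimal sketch of how it rescales the potentiometer reading into motor steps. I am assuming a 200-step stepper driven through the Arduino Stepper library on pins 8 to 11, with the potentiometer on A0, so the pins and numbers are illustrative rather than our exact circuit.

#include <Stepper.h>

const int stepsPerRevolution = 200;                   // assumed motor resolution
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);  // assumed driver pins

int previous = 0;  // last position we moved to, in steps

void setup() {
  myStepper.setSpeed(30);  // rotation speed in RPM
}

void loop() {
  int sensorValue = analogRead(A0);  // potentiometer reading, 0 to 1023
  // map() rescales the 0-1023 range onto 0-200 steps
  int target = map(sensorValue, 0, 1023, 0, stepsPerRevolution);
  myStepper.step(target - previous);  // move only the difference from last time
  previous = target;
}

So map() just converts one range of numbers into another; turning the knob from one end to the other sweeps the motor through one full revolution.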

The last step was the most exciting: the mechanical arm functioned well, and we were able to draw something abstract. It is really interesting to see how every part of the machine works together. After this project, I started seeing machines as a collection of parts instead of as a whole. The video size is too large to be displayed.

Question 1: What kind of machines would you be interested in building? Add a reflection about the use of actuators, the digital manipulation of art, and the creative process to your blog post.

I would be interested in building a machine that can throw paintballs at people. Through the use of actuators and motors, we could launch paintballs at a wall, which would serve as a massive canvas. This would be an interesting way for artists to paint on larger surfaces. In my opinion, art can change politics, and through this technology we can have bigger forms of art, a bigger audience, and a bigger impact. The digital manipulation of art marks a new age of digital art, combining photography and animation. Being able to play with different techniques and mediums allows anybody to become an artist; all they need is their imagination. Personally, I have a hard time being creative and finding something to create, but it is really inspirational to see the art that some people choose to make.

Question 2: Choose an art installation mentioned in the reading ART + Science NOW, Stephen Wilson (Kinetics chapter). Post your thoughts about it and make a comparison with the work you did during this recitation. How do you think that the artist selected those specific actuators for his project?

I chose “The Drumming and Drawing Subhuman” from the installations mentioned in the reading ART + Science NOW. On the surface it seems extremely different from the drawing machine I built, but the Subhuman robot drums through actuators and motors, just as the drawing machine draws with them. The design of the robot itself is quite creative: something that looks horrifying but can evoke elegance through its drumming. My project has a long way to go compared to the robot, but the techniques are similar, involving rotation and motors working together. I think the artist selected actuators that would let the motors rotate the robot’s hands and elbows.