
Celia's Documentation Blog

NYU Shanghai IMA

Interaction Lab

Final Project: Preparatory Research and Analysis

April 8, 2021 by Celia Forster

In Ernest Edmonds’ chapter titled “Art, Interaction, and Engagement”, different types of interaction are described, along with common characteristics of interactive art. My greatest takeaway from this chapter was the statement, “Interactive art systems involve artefacts and audiences equally” (Edmonds 16). The idea that a truly interactive exhibit must incorporate both actors equally is essential to creating a successful interactive project, and it got me thinking about how different types of interaction can be displayed in works of art.

While researching recent interactive art installations, I was particularly looking for unconventional pieces that incorporated different bodily senses. The first artifact that caught my attention was Michal Kohut’s 2010 piece titled ‘0,1’, which features a pair of glasses that interact with the user’s blinking patterns. When the user blinks, the lights in the room momentarily shut off for the duration of the blink. The user does not notice the blink of the lights, but the audience members experience the blinking patterns of the user and essentially get to see the world through someone else’s eyes. I thought that this installation was an interesting example of interaction because it is very simple in concept, but still satisfies the definition of interaction I tend to use, described by Chris Crawford in The Art of Interactive Design: to be considered interactive, the relationship must involve two actors that both actively accept input and then return an output accordingly. In this case, the glasses are interactive because they take the actions of the user (blinking) and convert them into a visual result for audience members (light behavior).

from https://vimeo.com/45921590

Another interactive art installation that piqued my interest was ‘While Nothing Happens’ by Ernesto Neto, which was installed in the MACRO museum in Rome from 2008 to 2009. This piece features lycra netting hanging from the ceiling, filled with various spices. As visitors walk through the installation, their bodies brush against the many sacks of spices, releasing an aroma. I was drawn to the fact that this exhibit relies on the sense of smell and can produce a different result each time depending on the movement of the users. Again, this piece fits the criteria of interactivity because the actions of the users produce a corresponding display from the art itself. What I found unique about this installation was that, unlike many interactive art installations, it does not rely on technology to create the desired result.

‘While Nothing Happens’ from https://www.designboom.com/art/while-nothing-happens-by-ernesto-neto/

Through building my own interactive project for the midterm and viewing my peers’ interactive pieces, my perspective on interactivity has developed. While I initially considered anything that responds to user input to be interactive, I now believe that a work must go beyond this base level of interaction. An interactive device cannot simply produce the same single display every time; rather, it must respond to the user and initiate an exchange that continues beyond the initial contact.


Recitation 6: Processing Animation by Celia Forster

April 6, 2021 by Celia Forster

In this recitation, I used Processing to create an interactive animation, shown in the video below. It is an animated cartoon face that moves its eyes according to the position of the mouse on the screen and smiles when the mouse hovers over its face. After trying to recreate many different animations shown on the recitation page, I decided on this one because it was interactive compared to some of the others, which only featured a looping animation. To create this, I did some research online to learn how to make the eyes move within a certain area. This led me to the map() function, which allows you to scale values proportionally to fit your project. Besides this essential function, the sketch only required some simple code to set up the shapes and an if statement for the mouth to turn upside-down depending on the mouse position.

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/color_brush-2021-04-06-15-06-14.mp4
void setup() {
  size(500, 500);
}

void draw() {
  background(255, 100, 128);
  fill(255);
  strokeWeight(14);
  circle(150, 200, 125);  // left eye
  circle(350, 200, 125);  // right eye
  noFill();

  // Smile when the mouse hovers over the face, frown otherwise
  if (mouseX > 200 && mouseX < 300 && mouseY > 300 && mouseY < 400) {
    arc(250, 350, 100, 50, -PI/30, PI);
  } else {
    arc(250, 350, 100, 50, -PI, 0);
  }

  // Map the mouse position so the pupils stay inside the eyes
  float pupilX1 = map(mouseX, 0, width, 100, 190);
  float pupilY  = map(mouseY, 0, height, 165, 240);
  float pupilX2 = map(mouseX, 0, width, 300, 390);
  fill(0);
  circle(pupilX1, pupilY, 25);
  circle(pupilX2, pupilY, 25);
}

Homework

This circle grows and shrinks while continuously changing color. I also added a function for it to move according to the arrow keys on the keyboard, making it stop when it reaches the border of the canvas.

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/sketch_210402a-2021-04-06-15-53-36.mp4
float d = 200;    // circle diameter
float speed = 4;  // rate of growth/shrink
float x = 300;    // circle position
float y = 300;
int h = 0;        // hue
int i = 50;       // saturation
int j = 200;      // brightness

void setup() {
  size(600, 600);
  background(255);
  colorMode(HSB, 255);
  noFill();
  strokeWeight(20);
}

void keyPressed() {
  // Move the circle with the arrow keys, stopping at the canvas border
  if (keyCode == UP && y >= 50) {
    y -= 5;
  } else if (keyCode == DOWN && y <= height - 50) {
    y += 5;
  } else if (keyCode == LEFT && x >= 50) {
    x -= 5;
  } else if (keyCode == RIGHT && x <= width - 50) {
    x += 5;
  }
}

void draw() {
  background(255);
  stroke(h, i, j);
  circle(x, y, d);

  // Grow and shrink the circle between two size limits
  d = d + speed;
  if (d > 400 || d < 75) {
    speed = -speed;
  }

  // Continuously shift the stroke color
  h++;
  i++;
  j++;
  if (h == 255) {
    h = 0;
  }
  if (i == 255) {
    i = 50;
  }
  if (j == 255) {
    j = 200;
  }
}


Midterm Project: Individual Reflection

March 28, 2021 by Celia Forster

Meow Box – Celia Forster – Cossovich

CONTEXT AND SIGNIFICANCE:

While my midterm project was rather different from the group research project, both fell under the common theme of robots that benefit humans in some way. Of course, the research project’s product was far more futuristic and unrealistic to fabricate in the real world, but in making the Meow Box, I tried to implement some of the same ideas about the human interaction experience. When conducting research for the midterm project, I was drawn to OpenCat, an Arduino project by Rongzhong Li featured on the Arduino Project Hub. This robot was quite advanced: the result was a robot that could walk and jump in the manner of a cat. Unfortunately, material and time constraints prevented me from fabricating such a sophisticated robot, so I decided to stick to a simpler model that would still respond to human interaction. Unlike many robot cat products, I think my fabrication choices were successful in creating a very realistic-looking cat. By making the cat as realistic as possible, with lifelike fur and cat noises, I offer an alternative to traditional robot pets that look very mechanical. This project could be of use to people of all ages, but I think children may enjoy it most, because it acts as more of a toy, and older audiences may become bored of its features more quickly.

 

CONCEPTION AND DESIGN:

When designing the Meow Box, I first had to answer the fundamental question: how do humans interact with cats? The first answer that came to mind was petting, right on top of the cat’s head. I initially felt that this location of interaction would be fairly self-explanatory, but I eventually added a sign directing users to it following feedback from user testing. My choice to use faux fur turned out to be successful in creating a realistic experience, making users feel like they were interacting with a live animal. This was also why I decided to use the 3D printer for the cat head: because the head was such a central component of the overall design, I wanted the most structurally stable and visually accurate form. Looking back, I feel that the greatest flaw in the design was the size of the cat itself. I wanted the entire cat to be self-contained in the box with no visible wires. This became a struggle, as the external battery pack, Arduino board, MP3 player shield, and breadboard had difficulty fitting within the body of the cat. As for the materials used, I initially planned to use felt to cover the cat and give it a soft texture. However, because the project relied so heavily on touch, I thought a more realistic texture would enhance the user experience, which is what prompted the long white fur used to cover the head, body, and tail.

FABRICATION AND PRODUCTION:

During the user testing session, I encountered a critical issue with the Meow Box prototype. Nearly every user (before prompts from my partner and me) believed that the cat did not actually do anything besides shaking its tail. This was ultimately due to a few issues, namely the low volume of the speaker and an FSR threshold set too high to trigger any reaction. Users suggested we fix these errors and make a few changes, including having the cat stay stationary until the user interacts with it, since this would signal to the user that their actions were, in fact, prompting action from the cat. They also encouraged me to continue with the idea of having the cat display different reactions depending on the type of action from the user. A common theme during user testing was positive feedback about the appearance and fabrication of the cat itself, which reaffirmed my decision to use faux fur to craft the cat. By the time the final Meow Box was completed, various improvements had been made. The cat now remains stationary and plays a purring sound until the user touches its head; the touch then prompts different speeds of tail movement and a meow or hiss sound effect depending on the pressure applied to the sensor.

CONCLUSIONS:

The Meow Box was designed with the intent of providing users with a feline companion when they cannot physically be with a cat of their own. For me personally, being away from my cat has been difficult, so building the Meow Box has been a comforting experience. In line with Chris Crawford’s definition of interaction, which I expanded on in my own previous blog posts, the Meow Box acts as an actor working in conjunction with the other actor, the user, and displays different behaviors when interacted with in different ways.

As Crawford mentions, interactivity is subjective: some groups might find an artifact interactive while others do not, such as a child opening a refrigerator door versus an adult opening one. By this logic, some sophisticated users may feel the Meow Box is not interactive, because it displays the same reaction essentially upon the push of a button. Because they understand that the cat is not actually “reacting”, they may feel it is not interactive enough. But returning to the original classification of the Meow Box as an interactive artifact, the key fact is that the exchange between the user’s touch and the cat’s response in return is an interaction. If given more time, I would enhance the interactive experience by adding multiple sensors and more reactions. This would give more variety to the user experience, because the current two reactions can become rather repetitive and subtract from the interactive feeling.

Building this project truly put my circuit and coding abilities to the test. I had little difficulty with the actual fabrication of the Meow Box because my talents more closely align with the crafting aspect of the project. By contrast, using the MP3 player shield and FSR sensor was a completely new experience, and through the many failures that came along the way, I learned a great deal that I can use in future projects. I hope that this project can bring users who are missing a cat of their own a pleasant experience, and that they can ultimately see the care and attention that went into building the Meow Box.

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/03/WeChat_20210328123546-1.mp4

The 3D printer beginning the process of printing the cat head

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/03/WeChat_20210328123625-1.mp4

Testing the tail mechanism

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/03/WeChat_20210328141640.mp4

The final Meow Box in action!

The finished Meow Box
The final Meow Box with presentation poster
The body of the cat, made from cardboard and faux fur
The 3D printer preview
The 3D printed cat head
The fabricated box to hold the cat
The cat head attached to a stand repurposed from recycled materials

 



#include <Servo.h>
#include <Arduino.h>
#include "DYPlayerArduino.h"
#include <SoftwareSerial.h>

// Define FSR pin:
#define fsrpin A0

// Variable to store FSR readings:
int fsrreading;

// Initialise the MP3 player on a software serial port (RX, TX).
SoftwareSerial SoftSerial(10, 11);
DY::Player player(&SoftSerial);

Servo myservo;  // create servo object to control the tail servo
int pos = 0;    // variable to store the servo position

void setup() {
  // Begin serial communication at a baud rate of 9600 for debug printing:
  Serial.begin(9600);
  myservo.attach(9);  // attaches the servo on pin 9 to the servo object

  Serial.println("Starting the Player...");
  player.begin();
  delay(500);

  player.setPlayingDevice(DY::Device::Sd);
  player.setVolume(30);
}

void loop() {
  // Purr by default until the head is touched.
  player.playSpecified(1);
  int track = 1;

  // Read the FSR pin and print the reading in the serial monitor:
  fsrreading = analogRead(fsrpin);
  Serial.print("Analog reading = ");
  Serial.println(fsrreading);

  // Thresholds estimate how much pressure is applied and choose the reaction.
  if (fsrreading < 20) {
    // No touch: keep the tail still and keep purring.
    track = 0;
    player.playSpecified(track);
    myservo.detach();
    delay(1700);
  } else if (fsrreading > 20 && fsrreading < 80) {
    // Gentle touch: slow tail wag and a meow.
    track = 2;
    myservo.attach(9);
    player.playSpecified(track);
    for (pos = 30; pos <= 75; pos += 1) {
      myservo.write(pos);
      delay(20);
    }
    for (pos = 95; pos >= 50; pos -= 1) {
      myservo.write(pos);
      delay(20);
    }
    delay(300);
  } else if (fsrreading > 80) {
    // Hard press: fast tail whip and a hiss.
    track = 3;
    myservo.attach(9);
    player.playSpecified(track);
    for (pos = 20; pos <= 105; pos += 1) {
      myservo.write(pos);
      delay(1);
      player.playSpecified(track);
    }
    for (pos = 105; pos >= 20; pos -= 1) {
      myservo.write(pos);
      delay(1);
    }
    delay(600);
  }
}


Recitation 5: Processing Basics by Celia Forster

March 27, 2021 by Celia Forster

In this recitation, we drew inspiration from various pieces of computer art to draw our own piece with Processing. I was drawn to a 1985 piece created by Vera Molnár, a French artist known as a pioneer of computer art. I enjoyed how a single shape and color were applied in a way that still creates a coherent and visually interesting composition.

Molnár, Vera. (1985). Unknown. Retrieved from https://spalterdigital.com/artworks/1980/

Using this piece as a motif for my own artwork, I began in Processing by adding some simple rectangles to the canvas. Because I wanted to use many rectangles, just like in the inspiration, I immediately added a for loop with a variable for the x-value. This would allow me to quickly make identical rectangles appear across the canvas using the least amount of code. By accident, I added the variable to the color fill value, and the rectangles took on a colorful gradient. Intrigued by this, I began to stray further from my original plan and started experimenting with different color combinations and increment values in the for loops until I enjoyed the result. It is definitely a loose take on the motif, as mine employs many more colors and a different pattern altogether. However, the shared use of rectangles remains.

My final drawing on Processing

In my opinion, using Processing was very straightforward and convenient for making such a project. If I were to use Processing to make a piece of artwork again, I think using a program such as Adobe Illustrator first could allow me to realize my design in Processing more quickly, as working directly in Processing required a lot of guessing and checking to get the ideal sizing and positioning. If I used Illustrator to lay out my design, I would be able to see the position and size values before inputting them into my code.

My code:

void setup() {
  size(600, 600);
  background(255, 255, 255);
}

void draw() {
  // Top row: short rectangles that grow taller and shift from blue toward magenta
  int y = 0;
  for (int x = 0; x < 590; x += 20) {
    noStroke();
    fill(0 + y, 0, 255);
    rect(x, 10, 15, 10 + y);
    y += 12;
  }

  // Middle band: tall rectangles stepping down the canvas and darkening from blue to black
  y = 250;
  int yco = 10;
  for (int x2 = 0; x2 < 590; x2 += 20) {
    noStroke();
    fill(0, 0, 0 + y);
    rect(x2, yco, 15, y + 150);
    if (y > 0) {
      y -= 12;
    }
    yco += 30;
  }

  // Bottom strip: rectangles sliding diagonally and lightening from blue toward white
  y = 0;
  int ycor = 0;
  for (int x3 = 40; x3 < 590; x3 += 20) {
    noStroke();
    fill(0 + (y - 15), 0 + (y - 15), 255);
    rect(x3, 48 + ycor, 15, 40 + y);
    y += 14;
    ycor += 12;
  }
}


Recitation 4: Drawing Machines by Celia Forster

March 15, 2021 by Celia Forster

In this recitation, we used an L293D H-bridge IC connected to a stepper motor to create a drawing machine. The first step in this process was to build a circuit that could successfully control the stepper motor. This process could potentially harm my computer, so I was very cautious and first used an external USB power source to test the circuit. When I first ran the stepper_oneRevolution code, my motor was motionless, which prompted me to look closer at the breadboard. I very carefully compared my board to the provided schematic and noticed I was missing a few cables connecting the H-bridge to the power source. This was the first time I independently troubleshot an error in my circuit, so I felt very excited when the stepper motor suddenly began moving!
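For reference, here is a minimal sketch along the lines of the built-in stepper_oneRevolution example; the step count (200) and driver pins (8–11) are assumptions and depend on the specific motor and wiring.

#include <Stepper.h>

// Assumed setup: a 200-step motor driven through the L293D on pins 8-11.
const int stepsPerRevolution = 200;
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);

void setup() {
  myStepper.setSpeed(60);  // rotation speed in RPM
  Serial.begin(9600);
}

void loop() {
  Serial.println("clockwise");
  myStepper.step(stepsPerRevolution);   // one full revolution forward
  delay(500);

  Serial.println("counterclockwise");
  myStepper.step(-stepsPerRevolution);  // one full revolution back
  delay(500);
}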

Breadboard for the successful circuit

The next step was to add a potentiometer to the circuit and run the MotorKnob program to control the direction and duration of motor movement.
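A sketch in the spirit of the MotorKnob example might look like the following; the potentiometer pin (A0), step count, and driver pins are assumptions. The motor simply steps by however much the knob reading has changed since the last loop.

#include <Stepper.h>

// Assumed setup: 200-step motor on pins 8-11, potentiometer wiper on A0.
const int STEPS = 200;
Stepper stepper(STEPS, 8, 9, 10, 11);

int previous = 0;  // the previous reading from the potentiometer

void setup() {
  stepper.setSpeed(30);  // motor speed in RPM
}

void loop() {
  int val = analogRead(A0);      // read the potentiometer position
  stepper.step(val - previous);  // move by the change since the last reading
  previous = val;                // remember this position for next time
}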

Video of stepper motor moving according to potentiometer position

The final component of this recitation was to combine my stepper motor with a partner’s to create a drawing machine. Using 3D-printed and laser-cut pieces, we attached the components and a pen and watched our stepper motors work together to draw on a piece of paper. The artwork may not have been intentional, but it was still interesting to see the machine work.

Video of drawing machine at work

The finished drawing
Question 1: What kind of machines would you be interested in building? Add a reflection about the use of actuators, the digital manipulation of art, and the creative process to your blog post.

Quite fittingly for this recitation, I am most interested in machines that create art. I found it fascinating to watch the arms of the machine we made in this recitation move and draw according to my controls. On a related note, I have long been interested in A.I. art, where machine algorithms generate a piece of artwork. The first time I heard the term “actuator” was actually during this recitation. I think these components are incredibly useful, as they are quite versatile and can actually perform actions that are visible. Sometimes, being unable to see the process of a component makes me lose interest, but it is nice to see actuators at work. The digital manipulation of art is something that I personally enjoy and believe can really enhance artwork. Because it is a relatively new development in the world of art, trends are always changing, which is what makes it so interesting to me. As for the creative process, it is something that I have subconsciously followed with every project, but have only recently understood in detail. It is important to acknowledge each step of this process when creating something in order to get the most thoughtful and successful result. This process is of particular importance as I work on my midterm project.

Question 2: Choose an art installation mentioned in the reading ART + Science NOW, Stephen Wilson (Kinetics chapter). Post your thoughts about it and make a comparison with the work you did during this recitation. How do you think that the artist selected those specific actuators for his project?

I was particularly drawn to Daniel Palacios Jiménez’s 2006 piece titled Waves. I think it has a unique appearance, because the motion of the components makes it appear like a hologram or digitally fabricated piece of art. Compared to the drawing machine made in recitation, there is one major difference: in this piece, the motion of the machine is the artwork itself, whereas with the drawing machine, the motion of the motors is only part of the process, and the final drawing is the actual piece of art. I think the artist had to carefully choose the actuators used in this project, because it looks like it requires very fast motors to create the desired effect, so speed was likely an important factor in deciding which actuator to select.


Recitation 3: Sensors by Celia Forster

March 7, 2021 by Celia Forster

In our third recitation, we chose different sensors and used them with our Arduino to build a circuit that integrated our chosen sensor. I chose the vibration sensor, which was created with a piezo disc and a 1 megaohm resistor.

Vibration sensor connected to Arduino
Piezo Disc

I first connected the sensor to my Arduino and tested it by viewing the output in the serial monitor. Once I saw that the sensor was working properly, I used the built-in LED code to connect the Arduino to a breadboard with an LED. When the sensor detected vibration from my fist on the desk, the LED would light up.
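A minimal sketch of that LED behavior, assuming the piezo (with its 1 MΩ resistor) is on A0 and using the board’s built-in LED; the threshold is a guess and needs tuning to the sensor.

// Assumed wiring: piezo disc with 1 MΩ resistor on A0, built-in LED on pin 13.
const int knockSensor = A0;
const int threshold = 150;  // assumed threshold; tune to the sensor

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int sensorReading = analogRead(knockSensor);
  if (sensorReading >= threshold) {
    digitalWrite(LED_BUILTIN, HIGH);  // light the LED when a knock is detected
    Serial.println("Knock!");
    delay(100);                       // keep the LED visible briefly
    digitalWrite(LED_BUILTIN, LOW);
  }
}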

Video of LED

After successfully completing the previous circuit, I decided to use the mini servo included in the Arduino kit. This required me to add my own code to the original Knock code. I took a look at some of the example sketches that use a servo to understand how to integrate one into my code. This process required a lot of trial and error to finally get the code working. It had been years since I last studied coding, so this exercise was a nice review.

Video of Servo

Code:

/*
  Knock Sensor
Original code:
  created 25 Mar 2007
  by David Cuartielles <http://www.0j0.org>
  modified 30 Aug 2011
  by Tom Igoe

  This example code is in the public domain.

  http://www.arduino.cc/en/Tutorial/Knock
  (With self-written code to connect to Servo)
*/

#include <Servo.h>

Servo myservo;  // create servo object to control a servo
// twelve servo objects can be created on most boards

// these constants won't change:
const int knockSensor = A0;  // the piezo is connected to analog pin 0
const int threshold = 150;   // threshold value to decide when the detected sound is a knock or not

// these variables will change:
int sensorReading = 0;  // variable to store the value read from the sensor pin
int pos = 0;            // current servo position

void setup() {
  myservo.attach(9);   // attaches the servo on pin 9 to the servo object
  Serial.begin(9600);  // use the serial port
}

void loop() {
  // read the sensor and store it in the variable sensorReading:
  sensorReading = analogRead(knockSensor);

  // if the sensor reading is greater than the threshold:
  if (sensorReading >= threshold) {
    Serial.println("Knock!");
    delay(100);                // delay to avoid overloading the serial port buffer
    myservo.write(pos += 10);  // turn the servo a further 10 degrees with each knock
  }
}

Diagram of servo connected to sensor via Arduino

Question 1: What did you intend to assemble in the recitation exercise? If your sensor/actuator combination were to be used for pragmatic purposes, who would use it, why would they use it, and how could it be used?

In this recitation exercise, I intended for the servo motor to turn a certain amount with each vibration detected by the sensor. This was not meant to serve any specific function, but I can see many places where this movement could be utilized. Because a touch on a surface creates vibration, a pragmatic use of this combination could be in security. For example, if someone touches an item which is off-limits, a safety measure could be set off and block said person from touching the item. Although this example could be seen as more fictional (like a booby trap in a movie), I think there could still be practical uses for it in settings such as museums.

Question 2: Code is often compared to following a recipe or tutorial.  Why do you think that is?

I can understand why some would compare code to a recipe or tutorial, because it is a very methodical process. To understand how a piece of code functions, you must look at it from top to bottom, step by step. In a recipe or tutorial, the steps are given to the reader to carry out, but with code, the steps are fed to a computer which, if they are written correctly, will output the intended result. I think this metaphor makes code (something that seems complicated) feel simpler to the average person who is familiar with tutorials or recipes.

Question 3: In The Language of New Media, Manovich describes the influence of computers on new media. In what ways do you believe the computer influences our human behaviors?

As Manovich mentions in The Language of New Media, “computerization affects deeper and deeper layers of culture” (27), which directly influences human behaviors. The main way in which I believe the computer influences our human behavior is through communication. Manovich discusses how human language is considered discrete, but cultural communication through media is not universally so discrete (29). Because of this, human behaviors have evolved and human language is no longer the same as it was before the rise of computers and modern media.


2D Fabrication with Gravit Designer

March 1, 2021 by Celia Forster

We learned how to use Gravit Designer in class to design objects that could be fabricated with the laser cutter. Given an example of wooden rings that stack to form houses, I followed a video tutorial to make the rings and familiarize myself with the software. Since I am very familiar with Adobe Illustrator, Gravit Designer was fairly easy to adjust to, and I quickly got the hang of its functions.

Provided image of the rings
My design created on Gravit Designer

 


Group Research Project – REFLECT

March 1, 2021 by Celia Forster

Group photo!

For the group project performance, my group decided to use the fictional world of “Folding Beijing” to create an interactive artifact. The artifact gives Third Space citizens who have lost their jobs to automation or to disability the ability to take a job as a robot serving citizens of the First Space. Since the waking hours of each space differ, Third Space citizens who hold this job go to sleep as normal in their sleeping pod, but an additional helmet-like attachment allows them to control the robot in their subconscious. Just like the interactive Pocky display artifact I discovered during the Research phase of the project, the artifact we designed fits the definition of interactivity: it takes input from the user (the Third Space citizen) and outputs work (service in the First Space), so it meets the criteria of an interactive object. In terms of successes, I feel that my group did a decent job creating a unique and new artifact that fits well within the “Folding Beijing” storyline and adequately solves a real problem the characters in this story may face under the restrictions of their world. When fabricating our artifact, we put in effort to create a helmet-like dome, painted it black, and lowered it onto the head of the main character using the cords in the classroom. On the other side of things, we also used a cardboard box to create the robot head which would exist in the First Space while the Third Space user wears the helmet.

Sketch of our props
Fabricating the helmet
Plastic dome used as artifact
Robot head

I think we did a nice job utilizing our environment to create an artifact which, of course, cannot truly exist in our world due to technological constraints. As for the failures of our artifact, I think the logic behind the device may have some inconsistencies. For example, after the performance, an audience member questioned how much autonomy the robot gets and how much control the citizen truly has over it, as this could have repercussions such as a potential overthrow or absorption of First Space knowledge. Fabrication-wise, perhaps our artifact was not the most complicated in design, as it did not involve many materials. To make up for this, we made sure to include a variety of other props that we also fabricated:

 

 

Our performance:

Rehearsal!

I will now analyze the group performance of the “Weather Pod”. This project was a device with a multitude of functions, most notably in their performance the ability to control the weather and a shower. The fabricated artifact was no more than two feet tall, which the group mentioned would not be the actual size had this device been real; it was a model of the intended artifact built under time and material constraints. I felt that although this artifact fit the “Folding Beijing” storyline, its function was not so essential that it would have made much of a difference in the actual story. Although in their performance the shower function was what saved the main character from being discovered, the weather function for which the device gets the name “Weather Pod” seemed a bit irrelevant to the story. I do believe that the artifact meets the criteria of the assignment, and I could tell that this group put in effort to fabricate the device by using a variety of materials and a unique design. Of course, the device would have looked more realistic if it had been human-sized, but I understand that this would have been nearly impossible in the time given. The performance itself was very well done and appeared well rehearsed, with clear effort from the performers; I was engaged throughout the entire scene. If I had some constructive feedback for the performance itself, it is that the ending felt rather abrupt and I was left wondering what would have happened next.


Recitation 2: Arduino Basics by Celia Forster

February 9, 2021 by Celia Forster

In our second recitation, we received our Arduino kit. With these new materials, we built three different circuits which were more sophisticated than those in the previous recitation. Our first circuit was the Fade program built into Arduino IDE.

Fade circuit
https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/02/fade-video.mp4

This circuit was fairly straightforward following the diagram provided on the Arduino website. When the program is run, the brightness of the LED fades in and out. The next circuit I built was the toneMelody program in Arduino IDE. This circuit did not require a breadboard. Instead, I used M-F jumper cables to directly connect the speaker to the Arduino board as shown in the photo below.
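A pared-down version of the idea behind toneMelody is shown here, assuming the speaker sits on pin 8; the frequencies are written directly in hertz rather than pulled from pitches.h as the original example does.

// Assumed wiring: speaker connected between pin 8 and ground via jumper cables.
int melody[] = {262, 196, 196, 220, 196, 0, 247, 262};  // note frequencies in Hz (0 = rest)
int noteDurations[] = {4, 8, 8, 4, 4, 4, 4, 4};         // 4 = quarter note, 8 = eighth note

void setup() {
  for (int thisNote = 0; thisNote < 8; thisNote++) {
    int noteDuration = 1000 / noteDurations[thisNote];
    if (melody[thisNote] > 0) {
      tone(8, melody[thisNote], noteDuration);  // play the note for its duration
    }
    delay(noteDuration * 1.3);  // pause between notes: the duration plus about 30%
    noTone(8);
  }
}

void loop() {
  // the melody plays once in setup(); nothing to do here
}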

toneMelody circuit
https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/02/tone-video.mp4

The final circuit I built in this recitation was substantially more complicated. I used the Tinkercad Speed Game program design to construct my circuit. On my first attempt, the program did not run as expected. This was due to the misplacement of the push buttons and the lack of a jumper cable connecting the speaker and one of the LED lights to ground. After I made these changes, the game ran smoothly, shown in the video below. 

Schematic diagram of speed game

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/02/speed-video.mp4

Question 1: Reflect on how you use technology in your daily life and on the circuits you just built. Use the Introduction Chapter (page xvii-xxix) of Physical Computing and your own observations to define interaction.

As described in Physical Computing, circuits act “as the glue between the transducers you use to sense and control the world and the computers you use to interpret what’s going on” (Igoe XXV). Other technological devices work the same way in my daily life: scanning an ID card to enter a building or walking through a temperature scanner both involve sensing between two actors. Thus, I still stand by my previous definition of interaction, which closely resembles the definition provided by Chris Crawford: interaction is the process of exchanging inputs and outputs between at least two actors.

Question 2: Why did we use the 10K resistor with the push button?

When using the push button, the 10K resistor acts as a pull-down resistor. It connects the Arduino’s input pin to ground so that the pin reads a steady LOW when the button is not pressed; without it, the pin would be left floating and would pick up random values. When the button is pressed, the pin is connected to 5V and reads HIGH, and the large 10K value keeps the current flowing through the resistor very small. This is a different job from the 220-ohm resistors, which limit the current through the LEDs.
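A small sketch of reading a push button wired this way, assuming the button (with its 10K pull-down to ground) feeds pin 2 and an LED sits on pin 13.

// Assumed wiring: push button from 5V to pin 2, 10K resistor from pin 2 to GND, LED on pin 13.
const int buttonPin = 2;
const int ledPin = 13;

void setup() {
  pinMode(buttonPin, INPUT);  // the external 10K pull-down holds this pin LOW when unpressed
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (digitalRead(buttonPin) == HIGH) {  // button pressed: the pin is pulled up to 5V
    digitalWrite(ledPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);           // button released: the pull-down keeps the pin LOW
  }
}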

Question 3: If you have 100,000 LEDs of any brightness and color at your disposal, what would you make and where would you put it?

If I had 100,000 LEDs of any brightness and color at my disposal, I would create an interactive art exhibit in a public space. I was inspired by the Happy Interactive Screen introduced in a previous blog post, which featured 1,344 Pocky boxes that moved according to the movements of the user. Like this display, I would have the LEDs turn on and off in response to users standing in front of the display.


Group Research Project – Short Stories

February 8, 2021 by Celia Forster

1. Folding Beijing

This short story features protagonist Lao Dao in his daily life, struggling to make ends meet to support his adopted daughter, Tangtang, in her pursuit of a musical education. He is hardly at home, as he works his waste-processing job and takes on odd jobs for spare money. In order to relieve Lao Dao’s burden, I would create a sophisticated robot caregiver that could watch over Tangtang and other children of the Third Space who are left alone due to their parents’ demanding work schedules. While some robot caregivers already exist, this interactive robot could include cameras that allow Lao Dao to view and control his home environment live. This robot could educate the child and provide music and dance lessons, which would allow Lao Dao to save his money for other purposes and go on dangerous journeys less frequently.

2. Newton’s Sleep

In the utopian society described in this story, problems begin to arise as religious and power tensions undermine the pristine community of academics who fled the destroyed Earth environment. At one point in the story, the father, Ike, has to mentally translate Sonny Wigtree’s question “into his own Connecticut dialect” (Le Guin 6). Because there seem to be some language differences within this community, I think a highly advanced translation system would be beneficial in further uniting this group of people. This interactive artifact would immediately detect dialectical differences in speech and allow the receiving party to instead hear the words in their own native dialect. This would be time-efficient and would allow for smoother communication. The only potential issue with this technology is who it would be distributed to initially. Because this society already has tensions over the distribution of power, providing this service only to citizens in higher positions of power would just further the turmoil.

3. The Lifecycle of Software Objects

The level of technology in the world described by Ted Chiang in this story is highly advanced, able to create digients that develop like human children. If this technology could be used to develop interactive pet devices, they could be very beneficial to this society. Since house pets have a relatively short lifespan, these animal digients could be lifelong companions for their owners, just like real pets but with greater longevity and durability. The only problems that could arise are technological malfunctions, but as with the digients in the story, there are qualified people to fix potential glitches.

