Tina's Blog

Creating My Own Story with Technological Media

Archives for November 2020

Recitation 9: Media Controller by Tina(Tianyu Zhang)

November 29, 2020 by Tianyu Zhang

Reflection after Reading 

In this recitation, I used a photoresistor on the Arduino to control the play and pause of a video displayed in Processing. My main goal in trying this interaction was to become more familiar with the photoresistor as well as with some basic media functions in Processing.

After reading the article Computer Vision for Artists and Designers: Pedagogic Tools and Techniques for Novice Programmers, I reflected on the interaction I created. The article lists several vivid examples of what computer vision interaction can be: interacting with or creating one's own video through body movements, detecting people's motion with computer vision techniques, and even embedding these techniques in robots. Compared with those examples, the interaction I built in this recitation seems far too simple; it is only a small test of how physical input can control a few basic properties of a short, pre-chosen video. Therefore, if I have the chance, I want to learn how to give users a higher level of interaction with the computer. For example, in my final project I am trying to figure out how to let users draw a painting on the screen with body movement.
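To give myself a concrete starting point, here is a minimal frame-differencing sketch in Processing, one of the basic techniques described in the article. It is only a sketch of the idea: it assumes the processing.video library and a connected webcam, the variable names are my own, and the red circle at the end is just a placeholder for whatever response to motion a real project would have.

import processing.video.*;

Capture cam;
PImage prevFrame;        // copy of the previous frame, kept for comparison
float motionLevel = 0;   // average brightness change per pixel

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  prevFrame = createImage(width, height, RGB);
}

void draw() {
  if (cam.available()) {
    // keep a copy of the previous frame before reading the new one
    prevFrame.copy(cam, 0, 0, cam.width, cam.height, 0, 0, width, height);
    cam.read();

    cam.loadPixels();
    prevFrame.loadPixels();

    // add up the brightness difference between this frame and the previous one
    float totalMotion = 0;
    int n = min(cam.pixels.length, prevFrame.pixels.length);
    for (int i = 0; i < n; i++) {
      totalMotion += abs(brightness(cam.pixels[i]) - brightness(prevFrame.pixels[i]));
    }
    motionLevel = (n > 0) ? totalMotion / n : 0;
  }

  image(cam, 0, 0);
  // the more the viewer moves, the larger the circle drawn on top of the video
  noStroke();
  fill(255, 0, 0, 150);
  circle(width/2, height/2, map(motionLevel, 0, 30, 10, width));
}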

Video Documentation:

https://wp.nyu.edu/tina_zhang/wp-content/uploads/sites/17387/2020/11/Reci9.mp4

Code:

Arduino:

// LAB8 - read the photoresistor (v1)

int photocellPin = A2; // photoresistor (photocell) connected to analog pin A2
int photocellVal = 0;  // photocell variable

void setup() {
  Serial.begin(9600);
}

void loop() {
  // read the photoresistor and send it to the serial port;
  // the 0-1023 reading is scaled down to a single byte (0-255)
  // so that Processing can pick it up with myPort.read()
  photocellVal = map(analogRead(photocellPin), 0, 1023, 0, 255);
  Serial.write(photocellVal);
  delay(100);
}

Processing: 

import processing.serial.*;
import processing.video.*;

Serial myPort;
int sensorValue;
Movie myMovie;

void setup() {
  size(300, 500);
  background(0);
  myMovie = new Movie(this, "myMovie1.mp4");
  myMovie.play();
  printArray(Serial.list());
  // this prints out the list of all available serial ports on your computer
  myPort = new Serial(this, Serial.list()[13], 9600);
}

void draw() {
  // read the latest value sent from the Arduino
  while (myPort.available() > 0) {
    sensorValue = myPort.read();
  }
  println(sensorValue); // prints out the values from Arduino

  if (myMovie.available()) {
    myMovie.read();
  }
  image(myMovie, 0, 0, width, height);

  // pause the movie while the photoresistor is covered, otherwise keep playing
  if (sensorValue > 5 && sensorValue < 50) {
    myMovie.pause();
  } else {
    myMovie.play();
  }
}

Filed Under: Interaction Lab F20 Tagged With: Weekly Mini-Project

Project Proposal Essay

November 27, 2020 by Tianyu Zhang

Project Title: Integrated Art

Project Statement or Purpose:

The basic idea of our project is to explore whether there are ways to integrate different kinds of art. It was inspired by the old saying that "all arts are bound together." In reality, people have different talents in specific areas, and it is hard for everyone to be a da Vinci, an expert in almost all of them; for a professional dancer, for instance, drawing may be hard. Therefore, we want to use this opportunity to explore ways for an artist in one field to create a piece in another field at the same time.

 

Project Plan:

As the name suggests, we will create an art space where players become part of the performance. As they perform, they can make different sounds and move around. An LED screen will be placed behind where they stand; as they move, the pen's tip on the screen will move in sync with them, and the tone of their voice will determine the pen's color. Ultimately, we want this art space to be a stage where performers draw paintings while dancing and singing.
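To make this concrete for ourselves, below is a very rough Processing sketch of the mapping we have in mind, not our final code: the mouse stands in for the tracked body position, and pitchValue is just a placeholder that sweeps up and down instead of coming from a real voice or an Arduino sensor.

// Rough sketch of the mapping described above (not our final code):
// body position -> pen position (mouse used as a stand-in here),
// tone of the voice -> pen color (pitchValue is a placeholder for the real input).

float pitchValue = 0;   // would come from Arduino or a sound-analysis library
float prevX, prevY;

void setup() {
  size(800, 600);
  background(0);
  colorMode(HSB, 1023, 100, 100);   // so the pitch value maps straight onto the hue
  prevX = mouseX;
  prevY = mouseY;
}

void draw() {
  // stand-in: sweep the "pitch" up and down instead of reading a real sensor
  pitchValue = map(sin(frameCount * 0.02), -1, 1, 0, 1023);

  stroke(pitchValue, 80, 100);          // a different tone gives a different hue
  strokeWeight(4);
  line(prevX, prevY, mouseX, mouseY);   // the performer's movement draws the line

  prevX = mouseX;
  prevY = mouseY;
}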

Here is the schedule of our project:

Dec. 1st – Finish the Processing and Arduino code for detecting a sound's tone and mapping the sensor value to a specific color. Decide what the final art space will look like.

Dec. 3rd – Decide whether any extra devices are needed; if they are available, borrow them, and if not, ask the professor for help or buy them.

Dec. 5th – Work out how to detect a person's movement through an external camera and map it to a position on the screen (see the sketch after this schedule).

Dec. 6th – Connect the devices to the finished code and test whether everything works.

Dec. 7th–9th – Debug and develop further.
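For the Dec. 5th step, here is a minimal sketch of one possible tracking approach, assuming the processing.video library and that the performer holds or wears something bright: find the brightest pixel in the webcam image and map it onto the canvas. This is only a starting point, not a settled design.

import processing.video.*;

Capture cam;
float posX, posY;   // tracked position mapped onto the canvas

void setup() {
  size(800, 600);
  cam = new Capture(this, 640, 480);
  cam.start();
  background(0);
}

void draw() {
  if (cam.available()) {
    cam.read();
    cam.loadPixels();

    // find the brightest pixel in the camera image
    float brightest = -1;
    int brightestX = 0;
    int brightestY = 0;
    for (int y = 0; y < cam.height; y++) {
      for (int x = 0; x < cam.width; x++) {
        float b = brightness(cam.pixels[y * cam.width + x]);
        if (b > brightest) {
          brightest = b;
          brightestX = x;
          brightestY = y;
        }
      }
    }

    // map the camera coordinates onto the canvas
    posX = map(brightestX, 0, cam.width, 0, width);
    posY = map(brightestY, 0, cam.height, 0, height);
  }

  background(0);
  noStroke();
  fill(255, 255, 0);
  circle(posX, posY, 20);   // mark where the "pen" would be
}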

Context and Significance: 

Although the main idea and inspiration for this project, which Wendy and I will work on together, mainly stem from my own thoughts, we did combine many interesting ideas and pieces of research that we developed together. For example, if time allows, we may add modes beyond drawing while dancing and singing, such as making music with the computer by giving instructions through dancing or painting. This comes from merging in our other proposal, making pieces of music by putting letters into specific boxes, which was itself inspired by many projects we found online.

In terms of interaction, I think this final project aligns well with my definition. In the definition I posted earlier this semester, I put great emphasis on the computer responding to the players' actions and on that exchange being continuous. This project meets those requirements: it is a performance stage that gives the audience an immersive performing and creating experience, and the feedback the device gives is continuous.

Filed Under: Interaction Lab F20 Tagged With: Project Documentation

#11: Journal of User-Testing

November 25, 2020 by Tianyu Zhang

Journal

If you didn’t get the chance to test your prototype with your community partner, record a video where they test it.
Post on the blog a summary of the feedback you received with the questions you made and answers you got. (Include some photos or any media you collected)
Finally post a short paragraph with your reflection on what you learned in this session and explaining the next steps you are taking to finish with the prototype in the next weeks.

We asked the manager, Mary Kate, to do user testing; however, she said they are only available next week, so we scheduled an appointment with them for next week.

Filed Under: Remade in China F20 Tagged With: Project Documentation

Recitation 8: Serial Communication by Tina(Tianyu Zhang)

November 20, 2020 by Tianyu Zhang

[In-class Exercise]

Interaction included: 

In this project, the users interact with the computer through Arduino and Processing. By turning the two potentiometers, they control where the end of the line moves, and stroke by stroke they can draw a picture. As the positions change, a painting gradually builds up.

Video documentation:

https://wp.nyu.edu/tina_zhang/wp-content/uploads/sites/17387/2020/11/RECI8_Video_cpsd-1.mp4

Reflection on the Interaction:

This interaction is really interesting because the users can draw a painting without even using a pen. From the perspective of innovation it works well; however, adding more sensors or other inputs could enrich the user experience. For instance, we could read an extra input, such as the mouse position or a third sensor, to change the color of the painting.
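As a quick illustration of that idea, here is a small self-contained sketch (not the recitation code itself): the mouse stands in for the two potentiometer values, and frameCount stands in for the extra input that would pick the stroke color.

// Quick, self-contained illustration of the color idea above.
// mouseX / mouseY stand in for the two potentiometer values, and frameCount
// stands in for the extra "index" (a third sensor, the mouse, etc.)
// that would choose the stroke color.
float preX, preY;

void setup() {
  size(800, 400);
  background(0);
  colorMode(HSB, 255);
  preX = mouseX;
  preY = mouseY;
}

void draw() {
  float hue = frameCount % 255;        // extra input -> color of the line
  stroke(hue, 200, 255);
  strokeWeight(3);
  line(preX, preY, mouseX, mouseY);    // same line-drawing logic as the sketch below
  preX = mouseX;
  preY = mouseY;
}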

Code attached: 

Arduino:

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensor1 = analogRead(A0);
  int sensor2 = analogRead(A5);

  // keep this format
  Serial.print(sensor1);
  Serial.print(",");  // put comma between sensor values
  Serial.print(sensor2);
  Serial.println(); // add linefeed after sending the last sensor value

  // too fast communication might cause some latency in Processing
  // this delay resolves the issue.
  delay(100);
}

Processing: 

import processing.serial.*;

String myString = null;
Serial myPort;
float preX;
float preY;

int NUM_OF_VALUES = 2;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;      /** this array stores values from Arduino **/


void setup() {
  size(800, 400);
  background(0);
  setupSerial();
}


void draw() {
  updateSerial();
  printArray(sensorValues);

  // map each potentiometer reading (0-1023) to a position on the canvas
  float posX = map(sensorValues[0], 0, 1023, 0, width);
  float posY = map(sensorValues[1], 0, 1023, 0, height);

  // draw a line segment from the previous position to the new one
  stroke(255);
  line(preX, preY, posX, posY);

  preX = posX;
  preY = posY;
}



void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[5], 9600);
  // Change the PORT_INDEX to 0 to find where your port is.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

 

[Additional Homework]

Interaction included: 

Video documentation:

https://wp.nyu.edu/tina_zhang/wp-content/uploads/sites/17387/2020/11/Reci_8_AH_cpsd.mp4

 

Reflection on the Interaction:

Code attached: 

Arduino:

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensor1 = digitalRead(9);
  int sensor2 = digitalRead(10);
  //int sensor3 = analogRead(A2);

  // keep this format
  Serial.print(sensor1);
  Serial.print(",");  // put comma between sensor values
  Serial.print(sensor2);
//  Serial.print(",");
//  Serial.print(sensor3);
  Serial.println(); // add linefeed after sending the last sensor value

  // too fast communication might cause some latency in Processing
  // this delay resolves the issue.
  delay(100);
}

 

Processing: 

import processing.serial.*;

String myString = null;
Serial myPort;


int NUM_OF_VALUES = 2;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
//int[] sensorValues;      /** this array stores values from Arduino **/
int preButtonValue0 = 0; //initial value of the first button is 0
int preButtonValue1 = 0;
int[] buttonValues;

boolean star1Display = false;
boolean star2Display = false; // initial state of the star display is false


void setup() {
  size(800, 500);
  background(0);
  setupSerial();
}


void draw() {
  updateSerial();
  printArray(buttonValues);
  background(0);
  // toggle each star's visibility on the rising edge of its button press
  if (buttonValues[0] == 1 && buttonValues[0] != preButtonValue0) {
    star1Display = !star1Display;
  }
  if (buttonValues[1] == 1 && buttonValues[1] != preButtonValue1) {
    star2Display = !star2Display;
  }

  if (star1Display) {
    pushMatrix();
    translate(width*0.3, height*0.5);
    rotate(frameCount / 400.0);
    star(0, 0, 80, 100, 40);
    popMatrix();
  }

  if (star2Display) {
    pushMatrix();
    translate(width*0.7, height*0.5);
    rotate(frameCount / -100.0);
    star(0, 0, 30, 70, 5);
    popMatrix();
  }

  // remember the previous button states so each press only toggles once
  preButtonValue0 = buttonValues[0];
  preButtonValue1 = buttonValues[1];
}

void star(float x, float y, float radius1, float radius2, int npoints) {
  float angle = TWO_PI / npoints;
  float halfAngle = angle/2.0;
  beginShape();
  for (float a = 0; a < TWO_PI; a += angle) {
    float sx = x + cos(a) * radius2;
    float sy = y + sin(a) * radius2;
    vertex(sx, sy);
    sx = x + cos(a+halfAngle) * radius1;
    sy = y + sin(a+halfAngle) * radius1;
    vertex(sx, sy);
  }
  endShape(CLOSE);
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[13], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----" 
  // and replace PORT_INDEX above with the index number of the port.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  buttonValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          buttonValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

 

Filed Under: Interaction Lab F20 Tagged With: Weekly Mini-Project

Final Project Step 2

November 19, 2020 by Tianyu Zhang

Idea 1: Integrated Art (Gesture and Voice control Painting)

Target group: art-lovers, creative musicians, painters, and dancing artists 

This project stems from my wondering whether different arts can be related and communicate with each other: is it possible for a single person to use different art techniques simultaneously to create a brand new piece of art? I thought of a project one of my friends once made, the rescuing-arm game by Lydia Yan, and it occurred to me that a dancer might be able to draw a painting by wearing sensors that detect her body movements. Also, since I am a good singer but very poor at drawing, I have always wondered whether I could draw something beautiful with my singing; therefore, I want the tone of the voice to determine the color of the painting. I used to think this would be unrealistic for me to achieve, but the videos shown in class gave me insight and encouragement: they show people playing with Processing and Arduino through the shadow of a body projected on the canvas, and even drawing pictures with their eye movements. We want to create a project that combines dancing, music, and painting. It allows participants to draw shapes and lines with their gestures and to color the picture with the frequency of their voice. The target groups of this project are creative musicians, painters, and dancing artists.

Idea 2: Custom Alphabet Sheet Music

Target group: People who want to play music but lack music knowledge, NGOs

This project is inspired by the German artist Pablo Paredes's "Stay Home" project, in which he used the shapes of eight letters to create a piece of music. The music is amazing and warm, because it reminds people to stay home during the COVID-19 epidemic. We found this project significant and meaningful to society, so we would like to build on it, develop it further, and integrate our own views about COVID-19. In our project, users would be able to express what they want to say and turn those words into music.

For the letter input, we want to use laser-cut wooden letters with pressure sensors connected to the Arduino. For the music score, we want to use Processing to animate the letter score and to control the music and sound effects through the connection with the Arduino. We want this project to let people who want to play music but lack musical knowledge play their own customized music, and it could also be used by health organizations or other NGOs that want to call on the public to take action through music. When you hear music made from the shapes of letters, it stays deeply in your mind, so this project has a great advocacy function.
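To test whether the letter-to-music idea could work, here is a small Processing sketch in which the keyboard keys stand in for the laser-cut letters and their pressure sensors. It assumes the processing.sound library, and the note frequencies are arbitrary placeholders rather than a real score.

// Small test of the letter-to-music idea: keyboard keys stand in for the
// laser-cut letters and their pressure sensors. Assumes the processing.sound
// library; the frequencies are arbitrary placeholders.
import processing.sound.*;

SinOsc tone;

void setup() {
  size(400, 200);
  tone = new SinOsc(this);
}

void draw() {
  background(0);
  fill(255);
  textAlign(CENTER, CENTER);
  text("press a letter key to hear its note", width / 2, height / 2);
}

void keyPressed() {
  if (key >= 'a' && key <= 'z') {
    // map each letter to its own pitch, roughly one octave spread over the alphabet
    float freq = map(key, 'a', 'z', 220, 440);
    tone.freq(freq);
    tone.play();
  }
}

void keyReleased() {
  tone.stop();
}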

Idea 3: Covid-19 Music&Animation Disinfecting System

Target group: Household residents

It’s a household use system which can make people happily disinfect themselves after returning home. This project is inspired by the smart restroom system which I mentioned in my final preparatory blog. The smart restroom system allows people to play small games while using the toilet. In our project, It’s a semi-game device that can help people to clean and disinfect before entering the room in COVID-19 situation through Arduino and Processing. We will use Arduino to create devices that can automatically open and close the lid of the trash can, and automatically squeeze the bottle of disinfectant and hand sanitizer. The devices will be used through the ultrasonic sensor to measure the distance between the human hand and become automatic. At the same time, we will connect these devices to Processing. By using Processing, we can create animations to remind people to use the devices in sequence when they enter the room. Each time a step is completed, the Processing animation will switch to a reminder of the next step. And it will be accompanied by music that will change from happy to sad if the cleaning and the disinfecting process is not done in order or missing. We also want to connect a led cube that can show the animation of the process in a three-dimensional way, such as the animation of hand sanitizer dripping or the animation of disinfectant spraying. The completion of the action can be detected by a photoresistor. When the participants use hands to block the photoresistor on the devices, it means they have used these devices already.

Such a project is also educational, because it reminds people to disinfect themselves when they enter their home and helps prevent the spread of the virus during the COVID-19 epidemic. The animation and music reminders are also fun: they make participants want to experience the whole disinfection process and encourage them to disinfect themselves consciously when entering the room.

Filed Under: Interaction Lab F20 Tagged With: Project Documentation

Recitation 7: Functions and Arrays by Tina(Tianyu Zhang)

November 19, 2020 by Tianyu Zhang

Question 1: In your own words, please explain the difference between having your for loop from Step 2 in setup() as opposed to in draw().

Having the for loop in setup() means that once the sketch starts, the loop runs only once and all of its output appears on the canvas immediately. When the for loop is in draw(), on the other hand, it runs again on every frame after the sketch starts; even if we cannot always see an obvious difference, the output is actually being redrawn, or rather added on top of the previous frame, all the time.
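A tiny sketch makes the difference visible; the numbers here are arbitrary. With the loop in setup() all the circles appear at once, and moving the commented copy into draw() makes new circles pile up every frame.

// Tiny illustration of the difference described above: the same for loop either
// runs once in setup() (all circles appear immediately) or runs every frame in
// draw() (circles keep piling up, one batch per frame).
void setup() {
  size(400, 400);
  background(0);

  // version 1: runs once, so 20 circles show up the moment the sketch starts
  for (int i = 0; i < 20; i++) {
    circle(random(width), random(height), 20);
  }
}

void draw() {
  // version 2: uncomment this loop and it runs about 60 times per second,
  // so new circles keep being added on top of the old ones
  //for (int i = 0; i < 20; i++) {
  //  circle(random(width), random(height), 20);
  //}
}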

Question 2: What is the benefit of using arrays? How might you use arrays in a potential project?

By using arrays we can keep many different values organized in one place. Without an array, if we want two circles moving in different directions or at different speeds, we have to create every variable twice; with an array, we can create one array for each property instead and get the same result.

In my project, I will probably use arrays to store the stream of values coming from the sensors on the Arduino, so that the users receive continuously changing feedback and the experience feels richer.
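For instance, here is a small sketch of the two-circle example from above, written with arrays instead of duplicated variables; the positions and speeds are arbitrary, and the same loop would handle a hundred circles just as easily.

// The two-circle example mentioned above, written with arrays instead of
// duplicated variables (x1, x2, speed1, speed2, ...): one pair of arrays
// handles both circles, and the same loop would handle many more.
float[] x = { 100, 300 };
float[] speed = { 2, 5 };

void setup() {
  size(400, 200);
}

void draw() {
  background(0);
  for (int i = 0; i < x.length; i++) {
    circle(x[i], height / 2, 30);
    x[i] += speed[i];                 // each circle keeps its own speed
    if (x[i] > width) x[i] = 0;       // wrap around at the right edge
  }
}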

Below are my code and video-documentation: 

int n = 100;
float[] x = new float[n];
float[] y = new float[n];
float[] d = new float[n];
int[] r = new int[n];
int[] g = new int[n];
int[] b = new int[n];
float speedX[] = new float[n];
float speedY[] = new float[n];

void setup(){
  size(1000,600);
  background(255);
  for(int i =0; i<n; i=i+1){
    x[i] = random(width);
    y[i] = random(height);
    d[i] = random(10,200);
    r[i] = int(random(0,255));
    g[i] = int(random(0,255));
    b[i] = int(random(0,255));
    speedX[i] = random(2,10);
    speedY[i] = random(6,12);
  }
  
}

void draw(){
   for (int i=0; i<n; i++){
    display(x[i], y[i], d[i],r[i],g[i],b[i]);
    //face (posX[i], posY[i], color(random(255),random(255),random(255)),size[i])
  }
  move();
  bounce();
}

void display(float x, float y, float d, int r, int g, int b) {
  // use these parameters to draw one shape cluster
  fill(r, g, b);
  noStroke();
  circle(x, y, d);
  fill(r, g, b + d);
  noStroke();
  rect(x - d, y - d, x + d, y + d);
  fill(r, g, b + d);
  noStroke();
  triangle(x - d, y + d, x + d, y - d, x - d, y - d);
}

void move(){
  for (int i=0; i<n; i++){
  x[i] = x[i] + speedX[i];
  y[i] = y[i] + speedY[i];
  }
}
void bounce(){
  for (int i=0; i<n; i++){
  if(x[i]>(width-10/2) || x[i]<(0+10/2)){
    speedX[i] = -speedX[i];
  }
  if(y[i]>(height-10/2) || y[i]<(0+10/2)){
    speedY[i] = -speedY[i];
  }
  }
}

https://wp.nyu.edu/tina_zhang/wp-content/uploads/sites/17387/2020/11/Screen-Recording-2020-11-13-at-12.08.03_1605802041148175.mp4

 

Below are the code and video-documentation for the additional homework:

//how many circles there are
int col=25, row=25;
int totalNum = col*row;
int padding = 140;
int defaultSize = 10;
//original (resting) positions of the circles, kept so we can see how far they have moved
float [] xPositionsOriginal = new float [totalNum];
float [] yPositionsOriginal = new float [totalNum];
//currently changing positions of the circles
float [] xPositions = new float [totalNum];
float [] yPositions = new float [totalNum];

float [] sizeFactors = new float [totalNum];

//lerp value
float lerpFactors = 0.05;

void setup(){
 size(800,800);
 
 for (int i = 0; i<col; i++){
   for (int j = 0; j< row; j++){
     //i and j each run from 0 to 24 (col-1 / row-1)
     int index = i + j * row;
     xPositionsOriginal [index] = map (i, 0, col-1, padding, width-padding);
     yPositionsOriginal [index] = map (j, 0, row-1, padding, height-padding);
     //set positions to the original ones
     xPositions [index] = xPositionsOriginal [index];
     yPositions [index] = yPositionsOriginal [index];
     sizeFactors[index] = 1.0;
   }
 } 
}

void draw(){
 background(255);
 for (int i= 0; i<totalNum; i++){
   //distance between the mouse and circle positions
   float offset = dist(mouseX, mouseY, xPositions[i], yPositions[i]);
   //make sure offset will not be 0
   offset = max(offset, 0.001);
   
   //how to change the offset based on the mousePosition
   //offset value based on the mouse position >> for further calculation
   float newX = xPositionsOriginal[i] - (25/offset)*(mouseX-xPositionsOriginal[i]);
   float newY = yPositionsOriginal[i] - (25/offset)*(mouseY-yPositionsOriginal[i]);
   
   xPositions[i] = lerp(xPositions[i], newX, lerpFactors);
   yPositions[i] = lerp(yPositions[i], newY, lerpFactors);
   
   //dist smaller > closer > size bigger
   sizeFactors[i] = 100/offset;
   
   //narrow sizeFactors down so calculation won't be affected
   sizeFactors[i] = max (sizeFactors[i], 0.4);
   sizeFactors[i] = min (sizeFactors[i], 2);
   
   //draw circles
   drawCircles(xPositions[i], yPositions[i], sizeFactors[i]);
 }
    
}

void drawCircles(float x, float y, float size){
  pushMatrix();
  translate(x,y);
  scale(size);
  fill(0);
  noStroke();
  ellipse(0, 0, defaultSize, defaultSize);
  popMatrix();
  
}

https://wp.nyu.edu/tina_zhang/wp-content/uploads/sites/17387/2020/11/step2_1605802031800314.mp4
https://wp.nyu.edu/tina_zhang/wp-content/uploads/sites/17387/2020/11/step3_1605802031973039.mp4
https://wp.nyu.edu/tina_zhang/wp-content/uploads/sites/17387/2020/11/step4_1605802032147053.mp4

Filed Under: Interaction Lab F20 Tagged With: Weekly Mini-Project

#10: Articulate Learning – Reflection Essay

November 18, 2020 by Tianyu Zhang

Key of Service-Learning

By: Tina (Tianyu Zhang)

Student Learning Goal Area: Approaches, Personal, Academic

What did I learn?

How did I learn it?

Why does it matter?

What might / should be done in light of it? 

In terms of approaches, I learned that the hyphen in "service-learning" should be interpreted as reflection, and that communicating with a community is never an easy task. Before meeting the community members, we first have to estimate which direction the conversation might go and brainstorm a list of questions we could ask. When we encounter shy participants, it usually does not work to ask directly about their needs; instead, we can start by chatting with them to help them relax, ask about their daily schedules, and look for problems or questions worth exploring further. I learned this gradual approach from my first meeting with the cafeteria staff: one of them was eager to speak, while the other simply nodded, so we chatted with her a little and tried to discover what she really thought. This method matters not only for service-learning but for any kind of negotiation or communication in which you need to deeply understand the other person's thoughts before making a specific proposal.

On a personal level, I discovered that I am sometimes too aggressive and too hasty in expressing myself and convincing others, which I had not sensed was a problem before. I am not extremely extroverted; sometimes I am even a little introverted. But perhaps because I have so often ended up as the leader of a group, I got used to everyone agreeing with my decisions. This time I worked with a girl who has as strong a personality as mine, and because of our different values and understandings we often disagreed and could not convince each other. It made me feel that I may have stayed in my comfort zone for too long and forgotten what it feels like to be convinced by someone else. So I reflected on myself and reminded myself to be patient and to listen actively to others' opinions. I think this was a meaningful experience that helped me understand myself better and grow into a better person.

Last but not least, in terms of academics, I learned that making a product is never easy. It involves many steps, including background research, material experiments, prototyping, gathering user feedback, and polishing the product. Following the steps the professor assigned us, I experienced how hard but exciting the whole process is, and with this experience I will be able to build more of the products I want to make.

Filed Under: Remade in China F20 Tagged With: Project Documentation

#10: Question Guide

November 18, 2020 by Tianyu Zhang

Question Guide (Both English and Chinese version attached, since our targeted customers only speak Chinese)

General Questions

What is your first impression of this chair back support?

您好,方便简单说下您第一眼看到这个座椅后背的感受么?

Can you briefly describe what excites or confuses you at first sight of it?

您能简单描述一下看到这个座椅后背时,有什么地方是您激动或者疑惑的么?

Specific Questions

Actually, we designed this back support to make short rest breaks more comfortable. Do you feel it would improve your current resting conditions?

其实,我们设计这个座椅后背的初衷是为了能够让你们在短暂的休息时间里休息得更舒服一些。想请问您觉得这个设计能够改善你们的休息条件么?

This back support will be attached to the stools downstairs. Do you feel like that will be useful for you?

这个座椅后背会被安装在b1食堂工作区域的那个小板凳上,你们觉得这个设计对你们来说会有使用价值么?

Open-up Questions

Right now, this back support is simply woven from used plastic bags. Do you think it would be comfortable for you, or do you have any other suggestions or functions you hope it could have?

这个座椅后背现在只是简单的用废弃塑料袋编织而成,你们觉得这个靠背对你们来说会舒服么,或者你们有没有什么其他的需求或建议?

Filed Under: Remade in China F20 Tagged With: Project Documentation

Final Project Step 1

November 12, 2020 by Tianyu Zhang

https://www.youtube.com/watch?v=02z_yyHAm-U   first clip: 3:54-4:20; second clip: 6:00-6:55

First of all, I want to restate the definition I wrote in my previous research blog post: interaction between humans and machines means that the machine can react to whatever humans say, do, or otherwise output, and that humans can continuously react back to the machine. This continuous and changeable process between humans and machines is what should be called interaction.

With that definition in mind, I looked for projects that fit it. Since these would be inspirations for my final project, I searched online for previous projects made by ITP/IMA students, and in the YouTube video of the ITP/IMA Winter Show 2018 I found two projects that really drew my attention.

The first is a theremin that can neither be seen nor physically touched. The makers used AI and AR technology to create a virtual theremin: a user simply stands in front of the desk and makes movements or gestures in a certain area, and through that easily produces music. The most striking point of this project is that it lets two users interact with the machine simultaneously and receive different feedback. Besides the user who produces music with their movements, another user can sit in front of the display screen, actually see the virtual theremin, and listen through earphones to the music the first user is producing. This way of interacting with several users and giving them distinct feedback is a perfect illustration of the "changeability" in my definition. It also aligns with the concept Ernest Edmonds mentions in the article Art, Interaction and Engagement: "Each action leads to a response that, in turn, encourages or enables another action."

The second example is about raising people's awareness of our reliance on smartphones. The project consists of several small wooden puppets surrounding a spot where a smartphone can be placed. When a smartphone is put in that spot, the surrounding puppets are immediately drawn to it and start bobbing their heads up and down repeatedly; when the smartphone is taken away, the puppets return to normal. The process vividly shows the relationship between humans and smartphones. It has a social impact, and users can keep playing with the prototype, which fits my idea of "continuity."

Filed Under: Interaction Lab F20 Tagged With: Project Documentation

Recitation 6: Processing Animation by Tina(Tianyu Zhang)

November 12, 2020 by Tianyu Zhang

During this recitation, I mainly learned about basic animation functions, practiced the basics we covered in the lectures, and tried to create some small projects by myself.

At the very beginning, I wanted to make an animated icon in Processing. However, simply positioning each part and drawing the whole icon took most of my time, so I had very little time left to develop my ideas further, and even my first attempt at a push effect, which should have been triggered by clicking the mouse on the keys, ended in failure.

Later on, however, I found out that we can actually import pictures into Processing instead of drawing them step by step in code.

Here are some functions I found in the Processing reference that may help me polish my project further:

Import Images:
image()
Examples

PImage img;

void setup() {
  // Images must be in the "data" directory to load correctly
  img = loadImage("laDefense.jpg");
}

void draw() {
  image(img, 0, 0);
}

PImage img;

void setup() {
  // Images must be in the "data" directory to load correctly
  img = loadImage("laDefense.jpg");
}

void draw() {
  image(img, 0, 0);
  image(img, 0, 0, width/2, height/2);
}

PShader
Examples

PShader blur;

void setup() {
  size(640, 360, P2D);
  // Shaders files must be in the "data" folder to load correctly
  blur = loadShader("blur.glsl");
  stroke(0, 102, 153);
  rectMode(CENTER);
}

void draw() {
  filter(blur);
  rect(mouseX-75, mouseY, 150, 150);
  ellipse(mouseX+75, mouseY, 150, 150);
}

mouseDragged()
Examples

// Drag (click and hold) your mouse across the
// image to change the value of the rectangle

int value = 0;

void draw() {
  fill(value);
  rect(25, 25, 50, 50);
}

void mouseDragged() {
  value = value + 5;
  if (value > 255) {
    value = 0;
  }
}

PFont
Examples

PFont font;
// The font must be located in the sketch's
// "data" directory to load successfully
font = createFont("LetterGothicStd.ttf", 32);
textFont(font);
text("word", 10, 50);

applyMatrix()
Examples

size(100, 100, P3D);
noFill();
translate(50, 50, 0);
rotateY(PI/6);
stroke(153);
box(35);
// Set rotation angles
float ct = cos(PI/9.0);
float st = sin(PI/9.0);
// Matrix for rotation around the Y axis
applyMatrix( ct, 0.0, st, 0.0,
             0.0, 1.0, 0.0, 0.0,
            -st, 0.0, ct, 0.0,
             0.0, 0.0, 0.0, 1.0);
stroke(255);
box(50);

DIY Animation (Code):

float posX;
float posY;

void setup() {
  size(400, 400);
  background(0);
}

void draw() {
  // posX and posY were meant for the push effect triggered by clicking on a key,
  // which I did not manage to finish in time
}

void keyPressed() {
  // redraw the icon with random colors every time a key is pressed
  float r = random(255);
  float g = random(255);
  float b = random(255);
  fill(r, g, b);
  noStroke();
  circle(width/2, height/2, 300);
  fill(b, r, g);
  stroke(0);
  strokeWeight(5);
  rectMode(CENTER);
  rect(width/2, height/2, 50, 160);
  rect(width/2 - 50, height/2, 50, 160);
  rect(width/2 - 100, height/2, 50, 160, 10, 0, 0, 10);
  rect(width/2 + 50, height/2, 50, 160);
  rect(width/2 + 100, height/2, 50, 160, 0, 10, 10, 0);
  fill(0);
  noStroke();
  rect(width/2 - 25, height/2 - 35, 30, 90);
  rect(width/2 - 75, height/2 - 35, 30, 90);
  rect(width/2 + 75, height/2 - 35, 30, 90);
}
    

Video Documentation: 

https://wp.nyu.edu/tina_zhang/wp-content/uploads/sites/17387/2020/11/reci6横屏compressed.mp4

Additional Homework (Code) :

float posX = 300;
float posY = 300;
float speedX = 2;
float speedY = 2;
float s = 100;
boolean state = true;
void setup() {
  size(600, 600);
}

void draw() {
  background(360);
  strokeWeight(20);
  noFill();

  circle(posX, posY, s);

  if (state == true) {
    s = s + 4;
  } else if (state == false) {
    s = s - 4;
  }
  if (s == 400 || s == 100) {
    state = !state;
  }
  colorMode(HSB, 360, 100, 100);
  stroke(s, 100, 100);
  //colorMode(RGB, 255, 255, 255);
  //float r = random (360);
  //float g = random (100);
  //float b = random (100);


  if (keyPressed == true) {
    if (keyCode == UP) {
      posY = posY - speedY;
    }
    if (keyCode == DOWN) {
      posY = posY + speedY;
    }
    if (keyCode == LEFT) {
      posX = posX - speedX;
    }
    if (keyCode == RIGHT) {
      posX = posX + speedX;
    }
  }
}

Filed Under: Interaction Lab F20 Tagged With: Weekly Mini-Project
