
Jade's Document Blog

Archives for March 2022

Recitation 5

March 24, 2022

We tried the tilt switch for the first time and used the Serial Plotter to observe some of its characteristics.

Tilting the switch nearly 90 degrees makes it go from ON to OFF.

Tapping the switch makes the signal flicker.

Tilting it with just my fingers and tilting my whole forearm produced similar results.

If I hold the wires rather than the switch itself and tilt it, the signal becomes much less stable: there are lots of spurious transitions between tilts.

If I shake it while holding the wires, it misses many signals. If I grab it by the top and shake it, the waveform is more accurate.

Workout STARTS

We debounced the switch first, so unintended transitions didn't affect the counts.


Biceps curls

When I finished the first version of the code, I found that the system counted every transition (e.g., 1 on the way up and 2 on the way down), but I wanted it to count each up-and-down, i.e., one full curl. Andy helped me modify the code: I added "if (reading == HIGH)" so it only counts when the tilt switch turns on.

const int SENSOR_PIN = 6;
int tiltVal;
int prevTiltVal;
int curls = 0;
unsigned long lastDebounceTime = 0;  // the last time the output pin was toggled
unsigned long debounceDelay = 50; // the debounce time; increase if the output flickers


void setup() {
  pinMode(SENSOR_PIN, INPUT);    // Set sensor pin as an INPUT pin
  Serial.begin(9600);
}

void loop() {
  int reading = digitalRead(SENSOR_PIN);
  // if the tilt sensor value changed, print the new value
  if (reading != prevTiltVal) {
    lastDebounceTime = millis();
    //Serial.println(tiltVal);
  }
  if ( (reading != tiltVal) && (millis() - lastDebounceTime) > debounceDelay) {
    // whatever the reading is at, it's been there for longer than the debounce
    // delay, so take it as the actual current state:
    tiltVal = reading;

    if (reading == HIGH) {
      curls = curls + 1;
      Serial.println(curls);
    }
  }
  
  if (curls == 8) {
    Serial.println("Yay, you've done a set of curls");
    curls = 0;
  }
  
  
  prevTiltVal = reading;
  // for Serial Plotter use:
  //Serial.println(tiltVal);
  //delay(1);
}
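The debounce-and-count logic can be checked outside the Arduino environment by replaying recorded (time, level) samples through the same state machine. The sketch below is a plain C++ model written only for illustration — `countCurls` and the sample format are my own invention, not part of the Arduino API:

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Offline model of the sketch's logic: each (time_ms, level) pair stands in
// for one digitalRead() at a given millis() value. A curl is counted only on
// a debounced LOW-to-HIGH transition, mirroring `if (reading == HIGH)`.
int countCurls(const std::vector<std::pair<unsigned long, int>>& samples,
               unsigned long debounceDelay) {
  int tiltVal = 0, prevTiltVal = 0, curls = 0;
  unsigned long lastDebounceTime = 0;
  for (const auto& s : samples) {
    unsigned long now = s.first;
    int reading = s.second;
    if (reading != prevTiltVal) {
      lastDebounceTime = now;  // any change restarts the debounce timer
    }
    if (reading != tiltVal && (now - lastDebounceTime) > debounceDelay) {
      tiltVal = reading;       // stable long enough: accept the new state
      if (reading == 1) {
        curls = curls + 1;     // count a full curl on the rising edge only
      }
    }
    prevTiltVal = reading;
  }
  return curls;
}
```

With debounceDelay = 50, a bouncy rise like 0→1→0→1 that settles at 1 still counts as a single curl, and the fall back to 0 adds nothing — which is exactly the behavior we wanted.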

Jumping jacks

The previous code counted jumping jacks correctly, but when I sped up, it didn't count accurately. We can see in the Serial Plotter that while the tilt switch is on, the signal is not stable. So I tried decreasing the debounce delay, and the count was able to keep up with the faster movement; however, it sometimes included extra counts. If I increased the debounce delay instead, it missed counts. The best debounce delay turned out to be 70 ms.

Timing the workout


const int SENSOR_PIN = 6;
int tiltVal;
int prevTiltVal;
int curls = 0;
unsigned long duration = 0;  // elapsed time; unsigned long to match millis()
unsigned long lastDebounceTime = 0;  // the last time the output pin was toggled
unsigned long debounceDelay = 50; // the debounce time; increase if the output flickers


void setup() {
  pinMode(SENSOR_PIN, INPUT);    // Set sensor pin as an INPUT pin
  Serial.begin(9600);
  delay(1000);
  Serial.print("Start your workout");

}

void loop() {
  int reading = digitalRead(SENSOR_PIN);
  duration = millis();
  // if the tilt sensor value changed, print the new value
  if (reading != prevTiltVal) {
    lastDebounceTime = millis();
    //Serial.println(tiltVal);
  }
  if ( (reading != tiltVal) && (millis() - lastDebounceTime) > debounceDelay) {
    // whatever the reading is at, it's been there for longer than the debounce
    // delay, so take it as the actual current state:
    tiltVal = reading;
    if (duration <= 20000) {
      if (reading == HIGH) {
        curls = curls + 1;
        Serial.println(curls);
      }
    } else {
      Serial.println("STOP, time is up");
      curls = 0;
    }

    //if (curls == 8) {
    //  Serial.println("Yay, you've done a set of curls");
    //  curls = 0;
    //}
  }


  prevTiltVal = reading;
  //for Serial Plotter use
  //Serial.println(tiltVal);
  //delay(1);
}

Filed Under: Interaction Lab, Uncategorized

Recitation 4

March 17, 2022

We made a simple drawing machine.

Step 1

First we connected a stepper motor and an H-bridge to the Arduino. The connections seemed complicated, but it was actually clear which cables go to which pins.

https://wp.nyu.edu/jadewen/wp-content/uploads/sites/23875/2022/03/5cdb6d5c70a0bc8fddd2c9cfef19fe02.mp4

Step 2

Then we added a potentiometer to the circuit to control the motor. The mapping in the code was the hard part. I understand what mapping means and how it works, but it's hard to remember how to write the call: value = map(value, fromLow, fromHigh, toLow, toHigh)
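For reference, map() does plain integer arithmetic; the Arduino documentation gives the formula below. Here it is as a standalone C++ copy (renamed mapValue so it compiles outside the Arduino environment), just to show what the call computes:

```cpp
// Same integer formula as Arduino's map(), per the official documentation.
// Renamed mapValue here so it builds as plain C++ (no Arduino.h needed).
long mapValue(long x, long fromLow, long fromHigh, long toLow, long toHigh) {
  return (x - fromLow) * (toHigh - toLow) / (fromHigh - fromLow) + toLow;
}
```

For example, mapValue(512, 0, 1023, 0, 180) scales the midpoint of an analogRead() range onto a 0–180 motor range; integer division truncates, so it returns 90.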

https://wp.nyu.edu/jadewen/wp-content/uploads/sites/23875/2022/03/3fcba497dd494393802eccbac509b811.mp4

Step 3

We used laser-cut arms to hold the paper and attached them to the motors (one arm per motor). It was difficult to draw what we wanted, though, since it requires coordination between the two arms.

https://wp.nyu.edu/jadewen/wp-content/uploads/sites/23875/2022/03/8d5bbf52f90916d3b7240d785ae7a661.mp4

 

Question 1

I always wish there were a showering machine: people simply step into the machine and are automatically cleaned, like at an automatic car wash. It needs "arms" and "hands" that can move around, translating program instructions into movements like an actuator. Based on the height and size of the user, a fixed formula controls where they move. The fluffy hands produce soap foam when pressed lightly against the user's body and move around the skin to clean it, and water comes out of shower heads in the hands to rinse off the soap.

Question 2

Kobito: Virtual Brownies by Aoki Takafumi et al. is a design that connects the physical world and the virtual world. A person holds a tea caddy while looking through a computer screen, where small virtual characters stand next to the caddy. If the tea caddy is moved, it can touch the characters; conversely, when the characters push the tea caddy, it also moves in physical space. Although the trick has been revealed (they used magnetic actuators), I still wonder how it corresponds to the digital creatures. I guess the magnet attached to the tea caddy is tracked in x-y spatial position and connected to the computer. In other words, the computer knows the spatial relationship between the virtual characters and the magnet, so it can decide their reactions when their positions coincide. It is similar to our circuit, as both include a device that converts signals into motion instructions: our motors are controlled by the potentiometers and rotate, while the position-aware program tells the magnet to move.

 

 

Reference: https://drive.google.com/file/d/1MH0D7mTve4KQVn9FJ0KqXWa2TgGuJkPR/view

 

Filed Under: Interaction Lab, Uncategorized

Reflect

March 11, 2022

1.

Dance is a form of art, but it usually requires skill and talent. We hope people can express their emotions through art, or simply perform, without limitations. Our group's invention is a device that creates a choreography based on the emotion of the person who touches it and can lead that person in dancing.

To come up with the idea, we first met as a group and introduced the inventions we each created in the RESEARCH phase. All of us found Robbin's very interesting: robots dancing along with the Naxi music band that the main character watches with the woman in The Fish of Lijiang. We made an advanced version of his proposal:

The invention is mainly a pair of artificial hands. When a person places his/her hands on them, a projector automatically turns on and an image of the robot's body appears. Meanwhile, the device detects the person's physiological and mental conditions to recognize his/her emotion. It then creates a choreography with moves reflecting that emotion. Holding the hands, people can dance along with the robot without learning or memorizing the choreography in advance.

In our performance, we used a cardboard projector and a cardboard helmet for the robot. When the projector was turned on, the robot took off the helmet, representing her appearance.

We also used masks with mouths drawn on them to represent the users' emotions. The first user was sad, and they danced in a sad mood 🙁, while the other was happy because her friend gave her a gift, and they danced happily together 🙂. And because we performed without talking, we had a little cardboard instrument that played the beats for the dance.

We distributed our work evenly. Since most of the scenes were between two actors, we discussed the story and the dance in pairs. As the first, sad character myself, I discussed with Vivian the trigger that made me sad, and with Tracy, the robot, I created the choreography that showed the sadness. The group members who had participated in the cardboard workshop made some of the props there, and the rest we did as a whole group. We also rehearsed several times.

Our performance in class:

https://wp.nyu.edu/jadewen/wp-content/uploads/sites/23875/2022/03/perform1.mp4
https://wp.nyu.edu/jadewen/wp-content/uploads/sites/23875/2022/03/perform2.mp4
https://wp.nyu.edu/jadewen/wp-content/uploads/sites/23875/2022/03/perform3.mp4

One thing I like about our invention is that it is very interesting: imagine being able to dance in a way you never thought you were capable of. It also allows people to express their feelings, which is vital for people dealing with stress and depression.

It's interactive because it aligns with how we define interaction: a conversation between two subjects who react to each other. The robot doesn't activate until the user touches it, and the user in turn receives output from the robot, namely the guidance of the dance.

However, there are also limitations to our invention. When we came up with the idea, we didn't specify how the emotions are detected, that is, what kinds of signals are sensed and taken into account. Nor did we specify how people are guided by the robot through just the connection between hands.

2.

The group that presented a subjective clock and an objective clock was really impressive. Their story was set around a student taking GPS in school. A helmet on the student's head could detect his brainwaves, and there were two clocks: an objective one showing the time running in the real world, and a subjective one showing what the student experiences personally. When the student feels bored, his subjective time slows down. This relates to the concept in The Fish of Lijiang, where time can be distorted and felt differently from person to person. But from my perspective, rather than relating to the fictional concept the story presents, they were describing an experience we go through every day: time seems to pass slowly when we feel bored and quickly when we wish the moment would last forever.

The performance was very interesting. They gave the two clocks personalities that let them talk to each other, which helped convey the idea to the audience. It reminded me of the animated movie Inside Out, in which the emotions and other mental activities live in a real-life setting with a system running the community.

But I doubt they met the criteria for making it interactive. There was only a one-way process: the brain output information, the device processed it, and the clocks displayed it. The student didn't receive any feedback to react to in turn. One thing they could consider is that after the two clocks appear, the student also receives the information and is able to manage or play with it.

Filed Under: Interaction Lab, Uncategorized

Recitation 3

March 2, 2022

We first set up only the sensor, an ultrasonic ranger, which can detect distance. Then we added an output: a buzzer. Initially we intended to make the buzzer sound louder as the distance got smaller, but we were told that this buzzer can only play at one volume. So we changed our goal: the buzzer makes a sound when the distance is greater than 20 cm. The coding was the hardest part. Even though we could copy and paste code from the website and the Arduino examples, we needed to figure out how to combine them correctly. There is also something we want to improve: the delay between the buzzer switching on and off is quite long. I'd like to make it more responsive if I have time.

https://wp.nyu.edu/jadewen/wp-content/uploads/sites/23875/2022/03/reci3-inout.mp4

Question 1

We had a buzzer as an output and an ultrasonic ranger as an input in the circuit. The buzzer starts to sound once the sensor detects a distance greater than 20 cm. This kind of assembly could be used by a group of hikers. For example, if they are venturing across dangerous terrain, they could attach this technology to their bodies. If one person gets separated from his/her partners by a certain distance, the alarm goes off and reminds them that a member is falling behind the team.

Question 2

The one who reads the code is the computer. It only reacts to the right instructions in the right combination. So a programmer needs to be careful when coding, following the rules of the language the way a cook follows a recipe to get a perfect meal.

Question 3

As we get used to computers, we become lazy, because computers can do a lot of things for us. They also shape different habits for each generation as they develop over time. For example, my grandparents and parents got used to keyboards and mice when using computers, but my generation is used to screens.

The code:

// ---------------------------------------------------------------- //
// Arduino Ultrasonic Sensor HC-SR04
// Rewritten by Arbi Abdul Jabbaar
// Using Arduino IDE 1.8.7
// Using HC-SR04 Module
// Tested on 17 September 2019
// ---------------------------------------------------------------- //

#include "pitches.h"
#define echoPin 2 // attach pin D2 Arduino to pin Echo of HC-SR04
#define trigPin 3 //attach pin D3 Arduino to pin Trig of HC-SR04

// defines variables
long duration; // variable for the duration of sound wave travel
int distance; // variable for the distance measurement
int notes[] = {
  NOTE_A4, NOTE_B4, NOTE_C3
};

void setup() {
  pinMode(trigPin, OUTPUT); // Sets the trigPin as an OUTPUT
  pinMode(echoPin, INPUT); // Sets the echoPin as an INPUT
  Serial.begin(9600); // Serial communication starts at a 9600 baud rate
  Serial.println("Ultrasonic Sensor HC-SR04 Test"); // print some text in Serial Monitor
  Serial.println("with Arduino UNO R3");
}
void loop() {
  // Clears the trigPin condition
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  // Sets the trigPin HIGH (ACTIVE) for 10 microseconds
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  // Reads the echoPin, returns the sound wave travel time in microseconds
  duration = pulseIn(echoPin, HIGH);
  // Calculating the distance: travel time (us) times the speed of sound
  // (0.034 cm/us), divided by 2 because the wave goes out and back
  distance = duration * 0.034 / 2;

  if (distance > 20) {
    // play a note on pin 8 when the target is farther than 20 cm
    tone(8, notes[1], 20000);
  }

  // Displays the distance on the Serial Monitor
  Serial.print("Distance: ");
  Serial.print(distance);
  Serial.println(" cm");
}
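The distance line is the heart of the sketch: pulseIn() reports the round-trip echo time in microseconds, and sound travels about 0.034 cm per microsecond. The same arithmetic, isolated as a plain C++ helper for illustration (echoToDistanceCm is my own name, not part of the sketch):

```cpp
// Round-trip echo time (microseconds) to distance (cm): multiply by the
// speed of sound (~0.034 cm/us) and halve it, since the ping travels out
// and back. Matches `distance = duration * 0.034 / 2;` in the sketch.
int echoToDistanceCm(long durationMicros) {
  return (int)(durationMicros * 0.034 / 2);
}
```

An echo of 1176 µs therefore reads as 19 cm, which keeps the buzzer silent; it takes roughly 1250 µs (21 cm) before the distance > 20 check triggers the tone.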

 

Filed Under: Interaction Lab, Uncategorized
