A Humanistic Art of Interaction: Loneliness Killa – Junyi Li – Rudi

In my previous research, I found that the interactive media artifacts that gained great success were large in scale and highly sensitive to human actions, whereas the things we were assigned to do during recitations were simple, involved only minor interactions, and, most fatally, could hardly even be called interactive media art. Thus, I decided to make something with a greater degree of interaction. The project was dedicated to solving a particular problem, and the first thing that popped into my mind was to help with the loneliness people feel, because it is often the case that partners have to leave one another to mind their own business. The lack of company can be frustrating and unbearable. As Rudi suggested, the best way to eliminate loneliness is not to give artificial accompaniment but to tell someone "I am really here for you," so we decided to design something that could give lonely people real company, a sense that there really is someone out there for them. We then thought of the most characteristic signs of a human being: one's heartbeat, expressions, and temperature. The artifact we set out to make would show the other person one's heartbeat, temperature, and a few of the simplest messages.

To acquire the input of one's heartbeat, a heartbeat sensor was needed; to send messages, a couple of buttons had to work together with an 8×8 NeoPixel matrix, so that when the user pushed a button, the matrix would display the corresponding message; to read one's temperature, a sensor more sensitive than the one in our Arduino kit was needed. The heartbeat sensors available to us were ear clips and the temperature sensors were poor in sensitivity, so we bought our own materials.

With cardboard, we made the shell of our prototype and connected the cables inside. After uploading the code to the Arduino Uno, it worked just fine. We succeeded in making a very early prototype of the project, containing only the heartbeat sensor and the message sender. During the user test, things worked fine, but it struck us that the design allowed only one user, because the input and output devices were packed into the same cardboard box: the user gave instructions while receiving all of the machine's outputs. It was not a loneliness killer but a closed loop of messages to oneself. To make it into a real loneliness killer, the output and input devices had to be distant from one another, because they should be held separately by two remote partners. We therefore took the prototype apart, clearly separated the input and output parts, made new cardboard shells for both, and connected them with long cables. The coding was not much of a problem, but the cable connections for input, output, power, and ground were highly complex. We tried connecting the different devices to the circuits of three Arduino Uno boards many times, failing over and over, until we finally made it work. Finally, I colored the paper shells green and blue to give the project a serious and deep tone. We recorded the usage of the machine, and I edited it into a video clip, which can be seen below along with my conclusions.

         

  

IMG_2676

IMG_2690

IMG_2691

And here are the code listings:

  1. Button & Sensor:

#include <Adafruit_NeoPixel.h>

#define LED_PIN   6

#define pressure A0

#define button1 13

#define button2 12

#define button3 8

#define button4 7

int pressure_val;

int button1_state = 0;

int button2_state = 0;

int button3_state = 0;

int button4_state = 0;

int prev_pressure_val = 0;

 

#define LED_COUNT 64

 

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

 

void setup() {

  // set up the 8x8 NeoPixel matrix

  strip.begin();          

  strip.show();            

  strip.setBrightness(255);

  Serial.begin(9600);

 

  pinMode(pressure, INPUT);

  pinMode(button1, INPUT);   // buttons are assumed to be wired with external pull-down resistors, so a press reads HIGH

  pinMode(button2, INPUT);

  pinMode(button3, INPUT);

  pinMode(button4, INPUT);

}

 

void loop() {

  pressure_val = analogRead(pressure);

  button1_state = digitalRead(button1);

  button2_state = digitalRead(button2);

  button3_state = digitalRead(button3);

  button4_state = digitalRead(button4);

  Serial.println(pressure_val);

 

if((prev_pressure_val == 0) && (pressure_val > 200)){

    rainbow1(10);

    delay(500);

    HI(100);

    wo(100);

    zai(100);

    zhe(100);

    prev_pressure_val = 2 ;

  }

  if(pressure_val > 250){

    rainbow1(10);

  }

 

  if(button1_state == 1){

    // "I love you"

    wo(100);

    love(100);

    ni(100);

  }

  else if(button2_state == 1){

    // smiley face

    happy(100);

  }

  else if(button3_state == 1){

    // crying face

    sad(100);

  }

  else if(button4_state == 1){

    // "goodbye"

    byezai(100);

    jian(100);

  }

}

 

// functions used in the project

void HI(int wait) {

  int list_HI[34] = {0,1,2,3,4,5,6,7,11,20,12,19,31,30,29,28,27,26,25,24,47,48,46,49,44,43,42,41,40,51,52,53,54,55};

  int firstPixelHue = 20000;

  for(int i=0; i<34; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_HI[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

  }

  delay(400);

  strip.clear();  

  strip.show();

}

 

void I(int wait) {

  int firstPixelHue = 0;  

  for(int i=24; i<40; i++) { // For each pixel in strip…

      for(int c=0; c<16; c += 1) {

        // hue of pixel ‘c’ is offset by an amount to make one full

        // revolution of the color wheel (range 65536) along the length

        // of the strip (strip.numPixels() steps):

        int      hue   = firstPixelHue + c * 65536L / 10;

        uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

        strip.setPixelColor(i, color); // Set pixel ‘c’ to value ‘color’

      }

      strip.show();                // Update strip with new contents

      delay(wait);                 // Pause for a moment

      firstPixelHue += 65536 / 90; // One cycle of color wheel over 90 frames

    }

  delay(100);

  strip.clear();

  strip.show();
}

 

void AM(int wait) {

  int list_AM[36] = {7,6,5,4,3,2,1,15,16,30,29,28,27,26,25,24,12,19,39,38,37,36,35,34,33,32,46,49,63,62,61,60,59,58,57,56};

  int firstPixelHue = 0;

  for(int i=0; i<36; i++){

    int      hue   = firstPixelHue + 65536L / 36;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_AM[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

    firstPixelHue += 65536 / 90; // One cycle of color wheel over 90 frames

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void HERE(int wait){

  int list_HERE[44] = {0,1,2,14,17,31,30,29,32,47,48,63,33,34,46,49,62,45,50,61,4,11,20,27,26,21,10,5,6,7,22,24,36,43,52,59,37,42,53,58,38,41,54,57};

  int firstPixelHue = 0;

  for(int i=0; i<44; i++){

    int      hue   = firstPixelHue + 65536L / 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_HERE[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

    firstPixelHue += 65536 / 90; // One cycle of color wheel over 90 frames

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void wo(int wait){

  int list_wo[30] = {16,31,13,18,29,34,45,50,61,30,28,27,26,25,24,23,9,21,35,47,46,44,43,42,54,56,38,52,48,62};

  int firstPixelHue = 30000;

  for(int i=0; i<30; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_wo[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void zai(int wait){

  int list_zai[27] = {13,18,29,34,45,50,32,33,28,20,10,6,21,22,23,26,37,42,53,44,43,41,24,39,40,55,56};

  int firstPixelHue = 40000;

  for(int i=0; i<27; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_zai[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void zhe(int wait){

  int list_zhe[27] = {14,17,12,19,20,21,22,8,24,39,40,55,56,47,46,29,34,45,50,61,35,36,42,54,51,52,38};

  int firstPixelHue = 50000;

  for(int i=0; i<27; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_zhe[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void love(int wait){

  int list_love[36] = {2,3,14,13,12,11,16,17,18,19,20,21,30,29,28,27,26,25,33,34,35,36,37,38,47,46,45,44,43,42,49,50,51,52,61,60};

  int firstPixelHue = 50000;

  for(int i=0; i<36; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_love[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void happy(int wait){

  int list_happy[30] = {1,2,3,4,5,6,8,23,24,39,40,55,57,58,59,60,61,62,48,47,32,31,16,15,18,45,21,25,38,42};

  int firstPixelHue = 50000;

  for(int i=0; i<30; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_happy[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void sad(int wait){

  int list_sad[30] = {1,2,3,4,5,6,8,23,24,39,40,55,57,58,59,60,61,62,48,47,32,31,16,15,18,45,22,26,37,41};

  int firstPixelHue = 50000;

  for(int i=0; i<30; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_sad[i], color); // Set pixel ‘c’ to value ‘color’

    strip.show();                // Update strip with new contents

    delay(wait);                 // Pause for a moment

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void byezai(int wait){

  int list_byezai[38] = {15,16,31,32,47,14,13,12,11,10,9,8,14,17,30,33,46,49,50,51,52,53,54,55,40,39,29,28,27,26,19,35,44,5,21,37,42,58};

  int firstPixelHue = 50000;

  for(int i=0; i<38; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue));

    strip.setPixelColor(list_byezai[i], color);

    strip.show();                

    delay(wait);                

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void jian(int wait){

  int list_jian[28] = {16,17,18,19,20,21,31,32,47,48,49,50,51,52,53,34,35,36,37,25,23,38,39,40,55,56,57,58};

  int firstPixelHue = 50000;

  for(int i=0; i<28; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue));

    strip.setPixelColor(list_jian[i], color);

    strip.show();              

    delay(wait);
  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void ni(int wait){

  int list_ni[30] = {16,17,13,3,12,11,10,9,8,31,30,29,19,33,46,49,62,50,45,44,43,42,41,40,39,24,27,21,52,58};

  int firstPixelHue = 50000;

  for(int i=0; i<30; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue));

    strip.setPixelColor(list_ni[i], color);

    strip.show();                

    delay(wait);                

  }

  delay(500);

  strip.clear();  

  strip.show();

}

 

void rainbow1(int wait) {

  for(long firstPixelHue = 0; firstPixelHue < 2*65536; firstPixelHue += 256) {

    strip.rainbow(firstPixelHue); // fill the pixel buffer with a rotating rainbow
    strip.show();                 // push the frame to the matrix

    delay(wait);

  }

  strip.clear();

  strip.show();

}

2. Heartbeat Sensor:

#include <Adafruit_NeoPixel.h>

#define LED_PIN   6          // data pin of the NeoPixel matrix

 

const int heartbeat = 4;

int heartbeat_val;

int old_val = 0;

 

#define PIXEL_COUNT 64

 

Adafruit_NeoPixel strip(PIXEL_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

 

void setup() {

  // put your setup code here, to run once:

  pinMode(heartbeat, INPUT);

  strip.begin();           // initialize the NeoPixel matrix before using it
  strip.show();            // start with all pixels off

  Serial.begin(9600);

}

 

void loop() {

  //put your main code here, to run repeatedly:

  heartbeat_val = digitalRead(heartbeat);

  Serial.println(heartbeat_val);

  if((heartbeat_val == 1) && (old_val == 0)){   // a rising edge means a new pulse was detected

    colorWipe(strip.Color(255,   0,   0),300);

  }

  old_val = heartbeat_val;

  delay(10);

}

//

void colorWipe(uint32_t color, int wait) {

  for(int i=0; i<64; i++) { // For each pixel in strip…

    strip.setPixelColor(i, color);         //  Set pixel’s color (in RAM)

  }

  strip.show();

  delay(wait);

  strip.clear();

  strip.show();

}

3. Temperature Sensor:

#include <Adafruit_NeoPixel.h>

#define LED_PIN    6

#define LED_COUNT 64

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

 

const int temp = A0;

int temp_val;

 

void setup() {

  strip.begin();          

  strip.show();            

  strip.setBrightness(255);

  pinMode(temp, INPUT);

}

 

void loop() {

  temp_val = analogRead(temp) * (5000 / 1024.0) / 10;   // reading in millivolts divided by 10 mV per degree gives Celsius

  if(temp_val == 35){

    num35();

  }

  else if(temp_val == 36){

    num36();

  }

  else if(temp_val == 37){

    num37();

  }

  else if(temp_val == 38){

    num38();

  }

  else if(temp_val == 39){

    num39();

  }

  else if(temp_val == 40){

    num40();

  }

  delay(100);

}

 

void num34() {

  int list_34[30] = {0,15,16,31,30,29,28,27,26,25,24,23,8,7,3,12,19,63,62,61,60,59,58,57,56,49,45,35,44,51};

  int firstPixelHue = 8200;

  for(int i=0; i<30; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_34[i], color); // Set pixel ‘c’ to value ‘color’

  }

  strip.show();

  delay(300);

}

 

void num35() {

  int list_35[34] = {0,15,16,31,30,29,28,27,26,25,24,23,8,7,3,12,19,63,48,47,32,33,34,35,44,51,60,59,58,57,56,55,40,39};

  int firstPixelHue = 8200;

  for(int i=0; i<34; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_35[i], color); // Set pixel ‘c’ to value ‘color’

  }

  strip.show();

  delay(300);

}

 

void num36() {

  int list_36[37] = {0,15,16,31,30,29,28,27,26,25,24,23,8,7,3,12,19,63,48,47,32,33,34,35,36,37,38,39,40,55,56,57,58,59,60,51,44};

  int firstPixelHue = 8200;

  for(int i=0; i<37; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_36[i], color); // Set pixel ‘c’ to value ‘color’

  }

  strip.show();

  delay(300);

}

 

void num37() {

  int list_37[28] = {0,15,16,31,30,29,28,27,26,25,24,23,8,7,3,12,19,32,47,48,63,62,61,60,59,58,57,56};

  int firstPixelHue = 8200;

  for(int i=0; i<28; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_37[i], color); // Set pixel ‘c’ to value ‘color’

  }

  strip.show();

  delay(300);

}

 

void num38() {

  int list_38[42] = {0,15,16,31,30,29,28,27,26,25,24,23,8,7,3,12,19,32,47,48,63,62,61,60,59,58,57,56,32,33,34,35,36,37,38,39,47,48,44,51,40,55};

  int firstPixelHue = 8200;

  for(int i=0; i<42; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_38[i], color); // Set pixel ‘c’ to value ‘color’

  }

  strip.show();

  delay(300);

}

 

void num39() {

  int list_39[36] = {0,15,16,31,30,29,28,27,26,25,24,23,8,7,3,12,19,32,33,34,35,44,51,60,47,48,63,62,61,59,58,57,56,55,40,39};

  int firstPixelHue = 8200;

  for(int i=0; i<36; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_39[i], color); // Set pixel ‘c’ to value ‘color’

  }

  strip.show();

  delay(300);

}

 

void num40() {

  int list_40[34] = {0,1,2,3,12,19,28,29,30,31,27,26,25,24,32,33,34,35,36,37,38,39,10,55,56,57,58,59,60,61,62,63,48,47};

  int firstPixelHue = 8200;

  for(int i=0; i<34; i++){

    int      hue   = firstPixelHue + 100;

    uint32_t color = strip.gamma32(strip.ColorHSV(hue)); // hue -> RGB

    strip.setPixelColor(list_40[i], color); // Set pixel ‘c’ to value ‘color’

  }

  strip.show();

  delay(300);

}

In conclusion, the midterm project was an overall success, although during the final exhibition the temperature sensor did not work well because some cables came loose without our noticing. The project matched the definition of interaction I had proposed, namely that the machine can take in human actions and reply in a way humans can understand. The project did not contain just a small amount of interaction; it contained three kinds, so it was almost certain that the circuits we connected would be complicated, and keeping the project in one piece was highly difficult. We therefore needed more protection on the circuits so that they would not fail the way the temperature sensor did during the presentation. Moreover, it would fit the idea of our project better to replace the long cables with wireless signals, so that the circuits would not be so affected by our movements and the usable distance of the device would increase. It would be even better if we could make two identical machines, so that the two partners could communicate with each other instead of relying on the one-directional information flow we presented.
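As a rough illustration of the wireless idea, here is a minimal transmitter sketch, assuming an nRF24L01 radio module on pins 9 and 10 (CE, CSN) and the open-source RF24 library; the pin choices, pipe address, and packet layout are all assumptions rather than part of our actual build:

#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                      // CE and CSN pins (assumed wiring)
const byte address[6] = "00001";        // shared pipe address, chosen arbitrarily

struct Packet {                         // one heartbeat flag plus a message code
  int heartbeat;
  int message;
};

void setup() {
  pinMode(4, INPUT);                    // pulse signal pin, as in the heartbeat sketch
  radio.begin();
  radio.openWritingPipe(address);
  radio.stopListening();                // this board only transmits
}

void loop() {
  Packet p = {digitalRead(4), 0};       // send the current pulse state, no message for now
  radio.write(&p, sizeof(p));           // push the packet to the partner's device
  delay(20);
}

The receiving board would mirror this with radio.openReadingPipe(), radio.startListening(), and radio.read(), and then drive its own NeoPixel matrix and buzzer from the received packet.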

Recitation 4: Actuators and Mechanisms

Question 1:

The art of kinetics and robotics lies not only in their productivity in replacing human labor but also in the creativity and talent that had long been a human monopoly. As our physical world develops, we modern people have an increasing need for visual realities, which kinetics and robotics can satisfy. The circuit we built was almost nothing compared to the art produced by the artists mentioned in this chapter of the book, as it followed no reality-concerned disciplines, nor could it interact with users. The only similarity to be found was the force it could exert to make itself run under the computer's orders. The artifacts shown in the article were of high artistic value and human wisdom, and they match the argument the author tried to make: to recreate visual reality with robotics and kinetics.

Question 2:

Devices that enable people to communicate better.

When one person puts her hand on the heartbeat sensor of her Killer, the other Killer starts reacting: its light and buzzer glow and buzz with her heartbeat, and she can press the buttons to choose which words to send to the other Killer. If someone happens to be with that other Killer, she can respond to her remote partner in the same manner, so the two can hear each other's heartbeat and receive simple messages from one another, letting the warmth of company be felt beyond words.

Documentation:

We cut the cardboard pieces and glued them together in sequence as instructed.

 

We ran the sample program on it, which we found quite boring, so we started programming on our own. Eventually we found that the maximum value for the angular velocity of the machine was approximately 275.

Midterm Project: Individual Proposal

The product we thought of was the Loneliness Killer, so the topic we focus on is devices that enable people to communicate better. The target users would be lonely lodgers, because it sometimes happens that someone is alone when the others who rent the same accommodation are away, and such loneliness can be so depressing that it prompted us to make something to kill it.

Stranger Things, Season 4 (Duffer, M. & Duffer, R., 2022)

When one person puts her hand on the heartbeat sensor of her Killer, the other Killer starts reacting: its light and buzzer glow and buzz with her heartbeat, and she can press the buttons to choose which words to send to the other Killer. If someone happens to be with that other Killer, she can respond to her remote partner in the same manner, so the two can hear each other's heartbeat and receive simple messages from one another, letting the warmth of company be felt beyond words.

Group Research Step 4: REPORT

 Script:

Scene 1

Stephen:  (pretends to be crying)

Kenneth: 

*Walks past Stephen and sees him crying*

*stops and switches the button to the "logical" side*

*changes attitude* 

Kenneth: Hey, I see you’re upset, do you want to talk about what happened?

Stephen: Yes, I can’t finish my recitation blogs. They are too long. It was like at least 800 words for each blog. I can’t take it any more. 

Kenneth: I finished mine a really long time ago. You should catch up, you’re so slow.

Stephen: You're so mean. *switches the knob to the emotional side*

Kenneth: oh, I can relate. It’s all going to be fine, I can help you do it!

Stephen: thanks bro *hugs me*

Scene 2

Flora: *trips Freddie*

Freddie: *falls and cries*

Tawan: *puts the helmet on*

*switches the button to the logical side*

*changes attitude* 

“Next time you should calculate the possibility of that happening so this wouldn’t happen to you again.”

Freddie: How am I supposed to know what she’s going to do?

Tawan: “There’s a high probability that your friend would prank you, 70% to be precise. Your mental capability is just not high enough to see that coming”. 

Freddie: “You know that most people would be helping me instead of lecturing me about what to do right?”

Tawan: “Emotions are temporary. Logics are permanent. People can change emotions, but they can’t change facts. I am telling you what to do so you can think more rationally instead of crying like a baby”.

Freddie: “I don’t like you”.

Tawan: *switches the button to the emotional side*

“I’m so sorry bro I don’t know what came over me, you good?”

Freddie: “Yea I’m okay bro, thanks for asking”

Tawan: *holds out his hand to Freddie and helps him get up*

    

 Scene 3

Stephen : *puts the helmet on with the emotional mode*

Freddie: Why don’t we go out tonight?

Stephen: Yay, lez goooo! Where are we going? Clubbing?

Freddie: Yeah, but I really don’t feel like clubbing, maybe dinner?

Stephen: *switches his helmet to the logical side*

Stephen: Yes, that sounds better. We have a lot of interaction lab homework to do. It’s a Wednesday night, Freddie.

Freddie: You’re right, we should finish all of our work today. You are so logical!!

Recording & Reflection:

Our invention was the Logic/Emotion Head Band, which derived from the second article, "The Ones Who Walk Away from Omelas" by Ursula K. Le Guin. We first read through the invention proposals by everyone in the group. Within the group, we voted on which inventions were the most interesting and which were the most practical, namely the ones most feasible to turn into prototypes with cardboard only. It turned out that my third invention, a capsule to isolate people from the outside world, was innovative but not as practical as Kenneth's and Tawan's second inventions, which were similar to one another: two helmets. Kenneth's helmet was to create a virtual reality inside the helmet for users, but its functions overlapped with the already-invented VR contacts or lenses. Thus, we chose the more advanced invention, Tawan's Logic/Emotion Head Band. In the given story, emotions in Omelas are officially defined by the knowledgeable and the experienced. With the band, people could control the ratio of emotional to logical thinking: when they need to think logically, they can turn the logic switch to a larger value, and when they need to handle things emotionally, say when empathy is called for, they can turn the emotion switch to a larger portion. The interaction of the machine rests not only on the special switches but also goes deeper into the neural system of the human brain, the objective command of the brain's processing pattern, changing with the brain's changing input. It is not a one-direction export of information but an inter-relation, an interaction.

As described by the inventor, "this helmet will allow you to adjust your logical thinking and your emotional thinking by adjusting a knob." So far, no machine has been able to bring changes to human thinking patterns, apart from those that destroy certain parts of our brains. However, a helmet might be too common a form for this project, because too many different functions could be assigned to one and it is not that convenient to carry around. Thus, we decided to produce a head band, easier to carry and easier to make. Using my head as a sample, they first cut out a cardboard band to wrap around my head.

 

For it to sit stably on the head while in use, we needed to leave space for the ears on the band so that it could hook over them. As all of us had different head sizes, I proposed that the band's length be made adjustable, which meant adding a link on the forehead part to make its length flexible. This was the very first prototype of the prototype. To get into its details, we needed a blueprint, so Kenneth did his part and made the sketches we had agreed on using his iPad.

Later on, we started to decide on the details of the band: how the adjusting link and the two switches should be designed to make up the whole band. Here is the adjusting link for the size of the band:

Here are the switches that we designed. We thought for a long time about how to make them and finally agreed to use a short stick going through the band to hold each round switch in place while giving it enough freedom to turn. However, they fell off easily at first, so we had to think of another way to make them stay. We stuck two round pieces of cardboard on the other side to keep the switches on the band. It took us three days to think of this and finish it. Then came a rather big task, to show without telling, namely the group performance.

I came up with the first scenario, which was to show someone crying over some matter while the user of the band came to react. What I needed to show was the change: what would happen if logic conquered the mind and what would happen if emotion filled the mind. I played the role of the crying guy. Reflecting on it, I realized quite a problem: while I was showing the function of the band, I was also showing its weakness, namely that the adjustment of the switches depends on both the user and others, yet people who use it might not be able to tell how they should adjust the switches in different situations, and might be trapped in the same thinking pattern forever.

But we showed the bright sides of it later on with Scene 2 and Scene 3. Logic and emotion, empathy and discipline, each showed its own strengths and shortcomings in different situations, but being able to jump between logic and emotion according to the situation could improve decision-making by a large margin. In Scene 2, Tawan and Flora showed the contrasting effects of the two modes, predicting the future versus empathizing with the unfortunate. In Scene 3, I played the role of the pleasure-seeking student, but with the band changing my thinking pattern, I could push myself back onto the right track, putting academics before clubbing and drinking. With the band, I changed not only myself but also the people around me, for instance Freddie, who sat on the fence between pleasure-seeking and studying and was led to do his homework as he was supposed to.

The group I would like to analyze and assess is the second group, which made a surveillance and crime-detecting glasses device. The invention emerged from the first story, which centered on a futuristic, delusional VR device that got people drowned in the very scenes it created. The invention was relevant to the story because the lenses could record what the user had seen and create a deja-vu for the user to rewatch, and the scene created was similar to the delusions created by the device in the story. It was interactive because it captured the user's vision; moreover, it could detect crime with AI algorithms operating under human-defined criteria. Its interaction relies not only on the users but also on the outside world and its different scenarios. The invention was very advanced and innovative, but what was even more innovative was how the group performed. The members did a slow-motion sequence at the end to show how the crime was captured, which was never seen in any other peer performance. Films can do that by slowing the flow of pictures, but they actually slowed down their own motions to make it happen, which struck me as quite a surprise for a live performance. What I found to be a slight problem with the intention of the product is that surveillance systems have already been well developed, so it does not seem very necessary to have a pair of lenses capture what happened. Thus, the real advance of the product would lie in its algorithms for surveillance and crime detection.

Recitation 3: Workout

  1. The challenge was kind of hard, and we attempted to code the circuit so that it would buzz to celebrate the completion of the training, but we did not know how to choose a pin on the Arduino Uno board for the buzzer (a possible approach is sketched right after the code below). And here is the video for step 4.
  2. Code:

int SENSOR_PIN = 2;
int tiltVal;
int prevTiltVal;
int x = 0;            // number of completed curls in the current set

void setup() {
pinMode(SENSOR_PIN, INPUT); // Set sensor pin as an INPUT pin
Serial.begin(9600);
}

void loop() {
// read the state of the sensor
tiltVal = digitalRead(SENSOR_PIN);
// if the tilt sensor value changed, print the new value
if (tiltVal != prevTiltVal && tiltVal == 1) {
Serial.println("Bicep curl completed");
Serial.println(x);
x = x+1;
}
if (tiltVal != prevTiltVal) {
prevTiltVal = tiltVal;
}
if (x == 8){
Serial.println("Yay, you've done one set of curls!");
x = 0;
}
delay(10);
}
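As for the celebration buzz we could not wire up, a minimal sketch could look like the following, assuming a passive buzzer module on pin 9; tone() works on any free digital pin, so the pin number and note frequencies here are only assumptions:

const int BUZZER_PIN = 9;          // assumed pin for a passive buzzer module

void celebrate() {
  tone(BUZZER_PIN, 880, 200);      // short 880 Hz beep
  delay(250);
  tone(BUZZER_PIN, 1175, 300);     // a higher note to finish the celebration
}

Calling celebrate() inside the if (x == 8) block, right before x is reset, would make the circuit beep at the end of every set.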

  1. The angle for the tilt to be sensed was approximately 60 to 90 degrees. One serious problem we had was that when I dropped my arm the counter also stepped up by one, instead of incrementing only when I raised and dropped my arm as one set. So we had to change the code to count the way we wanted, that is, to treat a "1" followed by a "0" as one set (one way to do this is sketched below). One interesting fact we found was that rotating and whipping the sensor also counted.

     And when we held the wires several centimeters away and tilted the sensor, its sensitivity seemed to drop and it counted rather slowly. I think it could be used by anyone, and not only for bicep curls; it could be applied to any physical exercise that involves bending a joint.
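One way the counting fix could look, assuming the tilt sensor reads 1 while the arm is raised; this fragment would replace the two if-blocks in loop() above, and the 50 ms debounce delay is just a guess:

if (tiltVal != prevTiltVal) {
  if (prevTiltVal == 1 && tiltVal == 0) {   // the arm went up and has now come back down
    x = x + 1;                              // count the full raise-and-lower as one curl
    Serial.println("Bicep curl completed");
    Serial.println(x);
  }
  prevTiltVal = tiltVal;
  delay(50);                                // crude debounce for the bouncing tilt ball
}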

Recitation 2: Arduino Basics

Question 1:

The rudimentary mechanism of the pressing device was to judge which of the two contestants pressed the button in front of them ten times the fastest. Meanwhile, we were using button switches, which are basically a simple connect and disconnect of the current flow. In this way, we were catering to the machine, communicating with the computer on its terms. Using the analogy from "How the Computer Sees Us" (2004) by Dan O'Sullivan and Tom Igoe, where desktop computers perceive their users as creatures with one finger, two eyes, and ears, the low-level game we built might sculpt the players' image down to a single organ, namely a finger, with barely enough strength to press buttons. Thus, to broaden the computer's perception of human movement, and so increase the human-machine interactivity, we could replace the button with a switch that can capture more of the changes produced by human activity. In this case the switches are the "transducers to convert between the physical energy appropriate for your project and the electrical energy used by the computer" (O'Sullivan & Igoe, 2004), and improving that transduction process could bring the game to a higher level. Hence, I propose a new switch that takes the sound of patting as its "input" (O'Sullivan & Igoe, 2004). As the players pat on a random hard object, the switch would hear the sound and close the circuit it controls. With the new switches, the game turns into a race of speed-patting: whoever pats most quickly wins.
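A very rough sketch of that patting switch, assuming a generic analog sound-sensor module wired to A0; the threshold and the 150 ms lockout are made-up values that would need tuning against real pats:

const int SOUND_PIN = A0;       // hypothetical analog sound-sensor module
const int THRESHOLD = 600;      // assumed level that a pat on the desk exceeds
int patCount = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (analogRead(SOUND_PIN) > THRESHOLD) {
    patCount = patCount + 1;    // one pat detected
    Serial.println(patCount);   // whoever reaches ten first wins the race
    delay(150);                 // ignore the tail of the same pat
  }
}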

Question 2:

The resistor was used as a pull-up resistor. The +5V supply is connected through it so that the pin reads a high level by default; when the switch is pressed, the pin is pulled to ground and reads a low level.
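A minimal sketch of that arrangement, assuming the switch is on pin 2 with an external resistor (around 10 kΩ) from the pin to +5V and the switch going from the pin to ground; Arduino's built-in INPUT_PULLUP mode would do the same job without the external resistor:

const int BUTTON_PIN = 2;

void setup() {
  pinMode(BUTTON_PIN, INPUT);               // the external pull-up keeps the pin HIGH by default
  Serial.begin(9600);
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {     // pressing the switch pulls the pin to ground
    Serial.println("pressed");
  }
  delay(10);
}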

Question 3:

On reading "The Arduino Way," I found that the core of its approach to producing electronic artifacts is efficiency. The Arduino way is the pursuit of the lowest cost in time, material, and money, and one of its main strategies is to combine ready-made products from others to produce your own. A quintessential example I could find is Shyu Ruey-Shiann's artifact One Kind of Behavior. The artist was inspired by "within the landscape of nature, languid movements of opening and closing of the hermit crab's shells, a stark contrast to contemporary society where things move at high speed" (2014). The artist utilized iron buckets, circuits, and sensors, as well as other ready-made equipment. It detected natural movements and generated corresponding sounds with the drumming of the iron buckets. It was an artifact of low cost while being creative and interactive enough, which is in the spirit of the Arduino way.

Documentation:

The first two tasks were rather easy and I met no trouble, because all I had to do was follow the steps, building the circuits and finding the corresponding sets of code.

Recording for Fading

IMG_2346

Recording for NEVER GONNA GIVE U UP

IMG_2348

IMG_2347

Recording for the Race

IMG_2353

In building the race circuit, we ran into trouble: the breadboard in the presented blueprint was not the same size as ours. Thus, we had to rebuild the circuit based on the logical connections between the different elements. It was hard, but following the left-to-right rule, we made it.

Schematic:

Shyu, Ruey-Shiann. "One Kind of Behavior" (2014). Hello Circuits. https://hellocircuits.com/2014/08/03/one-kind-of-behavior-by-shyu-ruey-shiann/

How I define interaction

For me, interaction is the perception and execution of indications from one another. The closest analogy would be a dialogue: by understanding the other's words, the subject gives corresponding responses, and vice versa. In short, as Crawford proposed, it should be "a cyclic process in which two actors alternately listen, think, and speak" (1). So in the case of human-machine interaction, unlike the common understanding, I consider that people, after having the machine process their indications, should respond in a way the machine can understand. Even at the end of a conversation, we humans wave each other goodbye; to machines, we should do the same, presenting them with rudimentary respect by responding in a comprehensible manner. Sadly, a negative example against my definition is what we made in the recitation class, a circuit accepting one-off commands from a human, as shown here: IMG_2285. Though I have to admit, I have defined interaction in a rather narrow sense.

As our technologies, say new media, have developed, the definition of human-machine interaction has shown a rather obvious tendency toward Crawford's theory as well as mine. A decade ago, such definitions might still have been out of reach in certain fields, but nowadays we are able to satisfy their standards. Manovich, in The Language of New Media, also pointed out sci-tech's ongoing impact on the human-machine relation, namely that "a new media object is subject to algorithmic manipulation" (27), since new media can be represented mathematically (27). He put forward the quintessential case where media access is subject to increasing automation, which results in machines' growing ability to comprehend human activities (34), for example the NLP (natural language processing) used by search engines. As technologies developed, the inputs and outputs of machines developed too, which can be seen in the artifact "We Spoke of the Same Bright Moon" (《我们共诉一轮明月》, https://www.manamana.net/video/detail?id=1930184#!zh). It is a virtual reality game in which a system is created as another world in media, a poetic craft from the authors' imaginations. Everything that should react as in real life reacts the same way in this world that exists only in the digital dimension: the water ripples as the user steps on it; the guiding goldfish goes the way you go; the light changes with different angles and times. (临Lyn, n.d.)

Moreover, the game even includes specific guidance for each step, floating in front of the user as they proceed; certain moves of the user trigger the guidance to appear. The machine, or more specifically the program of the game, understands the moves made by the human and can react to them in an understandable way, thanks to certain principles from the physical world that the game follows, while the human, on receiving the emerging information, say the guidance, also reacts in a way understandable to the game program. The case could not illustrate my definition of interaction better.

To sum up, interaction consists of a triggering move by the subject, multiple responses from both sides, the subject and the object, and, importantly, an ending indication from either side.

Group Research Project: Read

  1. The Veldt:

In the fiction, George and Lydia, a couple with children, had drowned themselves in an automated world, where labour was done by machines and delusions created by machines replaced physical traveling (Bradbury, 1950). Thus, they decided to press themselves back into real life (Bradbury, 1950), which I consider was caused by the lack of physical contact with real-life matters. What impressed me most was the delusional imagery, consisting of the veldt, the heat, the lions, and even forests. The closest products in today's visual electronics industry are VR contacts or lenses. Both indicate that in the future people will be living in a fake world whose emphasis is on visual production. Thus, to make that world more fake, or should I say closer to the real world, I want to propose a new supplement to the VR equipment mentioned above: the "Real Gloves," which apply to the hands physical sensations corresponding to the scenes in the visual and sound devices. The expected problem with this product is that people would consequently be wholly drowned in the world of the fake, because once physical feeling accompanies the visual art, the fake world could finally no longer be separated from the real one.

2. “The Ones Who Walk Away from Omelas”:

In this fiction, Le Guin describes an imagined city, Omelas, where people's values and emotions are strictly defined by common moral standards; individuals are defined by others as happy, instead of being happy in the way they themselves feel. This made me think of an air projector used for communication, as a node for a social platform like Instagram or WeChat. With this device, people can create their own image to show to others and say whatever they want without limit. Before it starts projecting, it emits fog and smoke into the air to reflect the light from the projector and create a physical communication scene in front of the users. As more people join a conversation, the number of images increases correspondingly. In this way, people socialize in whatever image they want, and since they can say anything they want, there are no standards imposed on anyone's emotions or mindsets. It is a world with no limitations. The closest product or concept to it would be the Metaverse, another cyber world where people say whatever they want and dress however they like, with only minor cultural, religious, or physical limitations; the most popular way to realize it nowadays is through VR, and I would define my product as another way of realizing that concept. The problem with such a platform is that, since it has no limitations, it might provide an environment for pornography and other illicit industries to thrive.

3. The Plague:

In this fiction, the main character works as a "corpse" cleaner who has spent two years cleaning up infected people who are regarded as corpses. The infectious virus is silicon-based, so infected people gradually turn into stone; yet even after turning into stone they are not dead. Instead, they carry on normal life operations in slow motion as a different form of life, and as the virus spreads further, the whole human society changes into a world of creatures of this form. The slow movements of this new form of life struck me hard, because our life here in Shanghai is so rushed and intense. What if we could put ourselves in a space where we could be totally cut off from the outside world and enjoy life at a comfortable pace? Hence, I suggest the Moving Space. It is a large bag with two layers of plastic, between which is an interlayer vacuumed by a pump at the bottom. When the bag is pumped up, the person inside hears no sound from the tense world outside. Also, the inside of the bag can be painted the way the user likes, so they see nothing but an environment catered to them. So far, the closest things I could find with similar functions are noise-cancelling earbuds or hotel bedrooms, as they, to a certain degree, provide an environment partially cut off from the familiar world outside. There is still a serious problem: there might be emergencies, say natural disasters, in the environment where people use the Moving Space, and in such situations danger could strike the users directly.

Stephen’s Blog for Recitation 1: Electronics & Soldering

(The videos are in those short Links starting with IMG. Click & Find)

Before we proceeded to construct the circuits, we tried to understand the rudimentary structure of the breadboard distributed to us beforehand. With the help of both the professors and the assistants, we grasped that the board is built symmetrically and that each side is made of two parts divided by a blue line: one for plugging in the power supply and the other for building up the circuit.

Then came the first task, to build the simplest circuit that makes the buzzer buzz under human control. Simple as it was, figuring out how to cascade the whole circuit did not make us sweat. However, after finishing the circuit, the buzzer did buzz but was out of control: it buzzed forever once the power was on, however hard we pressed our fingers on the switch. Thus, we called an assistant over and she explained to us how the switch is built. Basically, it has four legs, whose upper parts are connected to one another in an intersecting way. Following this special structure, we rebuilt the circuit and managed to make the buzzer buzz whenever we wanted.

IMG_2285

After completing the first task, the second task was rather easy. As shown by the schematic of the circuit, we needed to add a parallel branch to the circuit from the first task. Two crucial details required our extra attention. First, the head of the added loop had to be connected before the switch on the power-supply side. Second, the LED would only work if its legs were oriented correctly: the longer leg to the positive pole and the shorter leg to the negative pole. Thus, the second task was done.

Task 3 was comparatively difficult, as it had the most complicated circuit, with three parallel routes of current. The difference in the third branch added to the second circuit was that it had a potentiometer between the LED and the usual resistor. By turning the screw on the potentiometer, the resistance of the whole branch changed, which resulted in a change in the LED's brightness. As the second LED and the buzzer started from the same node, the light shone as the buzzer screamed, while the third LED was the one whose brightness kept changing. Moreover, we built an even more special piece of equipment to replace the switch, an aluminum clip: by pressing it, we could simply control the circuit.

IMG_2286

Question 1

The resistance of the potentiometer, freshly plugged in, is unknown, while the LED does need a resistor to limit the current in the circuit. R1 acts as insurance so that the LED does not get burnt out.

Question 2

Question 3

I had to press the button on the switch to keep the buzzer buzzing, and by turning the nut on the potentiometer, the brightness of the new LED in Task 3 could be changed. I consider this low-level interactivity. Judging by its degree and its Boolean, on/off nature, I think it belongs to one-directional human-appliance interaction, where the human has overall control of the circuit while the circuit only responds as it is told and wired to, instead of catching the movements of the human body.

Question 4

Interactive design manages to create the basic blueprint of an artwork by finding out which human organs the electronic devices need to capture the movements of, what types of devices can fully respond to and express the intention of the design, and how much space or freedom can be given to artists of specific fields to manage the expression of their work. Physical computing is also of great significance, because it handles the input and output of interactive information, and both of those procedures demand high accuracy. Lieberman, in his talk, told the story of a graffiti artist who was paralyzed but still managed to produce his work precisely through eye movements, and there are cases where people can move the things they draw at will (2010). Interactive media art needs not only the unlimited imaginative ideas of great artistic minds but also the very precise capturing and expressing of movements from physical computing.