Final Blog Post of Interaction Lab: 侬说啥? – Sarah Armstrong – Marcela Godoy

侬说啥? (What Are You Saying?)

CONCEPTION AND DESIGN:

In the original plan, there wasn't going to be a puzzle, but when the idea of focusing the project on kids was brought up in the class discussion, I think that became a key turning point for how this project ended up being designed.

Going into the class discussion, I wasn't sure how I was going to make this project truly interactive, but the idea of kids playing with it made total sense to me. When I was a kid, I went to my local Boys & Girls Club and eventually started working there in the summers and after school. With that background in making programs kids enjoy, I knew what to design to catch their eyes: I wanted all of the pictures to be bright and colorful, and the character they could move around on screen to be someone they would recognize (it's Russell from Up).

The puzzle was another good idea brought up in the class discussion, because it gave kids a hands-on way to stay engaged. If we had only used a screen, I don't think kids would have wanted to play with the project as much as they did at the final IMA show.

FABRICATION AND PRODUCTION:

I think the most significant steps in the production process were making sure people cannot move the character once they start listening to one province's dialect, and creating our own "buttons" with a capacitive sensor. Locking movement while a dialect plays contributed to the significance of the project: these dialects are often forgotten or overlooked, so forcing people to stop and pay attention is the opposite of what normally happens. Creating our own buttons was also an important step, because the capacitive sensor detects the touch of a human body (its natural capacitance) and uses that to trigger the translation on screen.
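As a rough illustration of how such a touch "button" can work, here is a minimal Arduino sketch using the CapacitiveSensor library; the pin numbers, resistor wiring, and threshold are assumptions for the example, not our exact circuit.

#include <CapacitiveSensor.h>

// Hypothetical wiring: a high-value resistor between pin 4 (send) and pin 2 (receive),
// with a foil pad (or conductive puzzle piece) connected to pin 2
CapacitiveSensor touchPad = CapacitiveSensor(4, 2);

const long TOUCH_THRESHOLD = 1000;  // tune this by watching the printed readings

void setup() {
  Serial.begin(9600);
}

void loop() {
  long reading = touchPad.capacitiveSensor(30);  // sample 30 times and sum the result
  // A touching finger raises the reading; send 1/0 so Processing can trigger the translation
  if (reading > TOUCH_THRESHOLD) {
    Serial.println(1);
  } else {
    Serial.println(0);
  }
  delay(100);
}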

From user testing, we learned to use headphones instead of speakers, because the environment was too noisy for anyone to actually listen. We also learned that diagonal motion had not been programmed into the joystick, so the character could only move side to side or up and down. Finally, we learned that Shanghai is drawn too small for Russell to fit inside it, so there needed to be some other way to trigger the sound once Russell crosses over its border (a rough sketch of that kind of check follows).
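For illustration, a simple bounding-box check in Processing (with made-up coordinates, not our actual map data) is one way to start a province's recording as soon as the character reaches it:

// Hypothetical bounds of the Shanghai region on the map image
float shanghaiX = 620, shanghaiY = 350, shanghaiW = 30, shanghaiH = 30;
boolean shanghaiPlaying = false;

void checkShanghai(float charX, float charY) {
  boolean inside = charX > shanghaiX && charX < shanghaiX + shanghaiW
                && charY > shanghaiY && charY < shanghaiY + shanghaiH;
  if (inside && !shanghaiPlaying) {
    shanghaiPlaying = true;   // lock movement and start the Shanghainese recording here
  } else if (!inside) {
    shanghaiPlaying = false;  // reset once the character leaves the region
  }
}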

      

CONCLUSIONS: 

I think this project fit my definition of interaction because it involved not only a human user but also the communication between Arduino and Processing that made it work. At the same time, it doesn't fully align with my definition, because most of the time the human user is not directly interacting with the Processing side; they use the Arduino to interact with Processing, so there is no direct contact. Ultimately, my audience definitely interacted with my project, as it caught their eye and kept them engaged. If I had more time, I would make sure the translations could be triggered by a puzzle piece being either put in or picked up and taken out, because I couldn't get that working the way I wanted and ultimately had to use the capacitive sensor. I really enjoyed the freedom to work with my hands and build my project around anything I wanted. So many people asked if I was a Global China Studies major when they saw my project, but no, it was just something I was interested in; I don't have to be in that major to have an interest in it.

I think this project was significant because many of the Chinese visitors who interacted with it thought deeply about and reflected on their knowledge of their own dialect; one parent even reflected on why they felt they should not teach their children the dialect of their hometown. It was also interesting to hear my friends' childhood stories and the cherished memories that made those dialects feel like home. For me, it was striking to hear such stark differences between the dialects, since most of the time I only compare Mandarin to a single dialect. Through this comparison, I now understand why the Chinese government wanted to push the teaching of one unified language: the dialects really are that different.

Photos from the IMA Show

   

Translations and Recordings

             

Link to recordings because I couldn’t upload audio: Recordings

Workshops

For this recitation, I chose to go to Young's workshop on Serial Communication because I wanted to get better at it; the communication between Arduino and Processing is integral to my final project's success.

Young did not have a formal exercise for this workshop, so I used his advice and knowledge to contribute to my final project. Below is the code with the mapping function described in Jessica and Eszter's helpful workshop, together with the serial communication used in my final project. This is only the Arduino part, since the Processing part is just the basic Arduino-to-Processing serial communication that was given as an example.

  int xDirection = analogRead(X_pin);              // raw joystick readings (0-1023)
  int yDirection = analogRead(Y_pin);
  xDirection = map(xDirection, 263, 759, 0, 10);   // rescale to a 0-10 range before sending
  yDirection = map(yDirection, 261, 761, 0, 10);
  Serial.print(xDirection);
  Serial.print(",");                               // comma so Processing can split the two values
  Serial.println(yDirection);
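For completeness, these lines would sit inside loop(); a minimal surrounding sketch might look like the following (the A0/A1 pin assignment is an assumption about my wiring):

// Hypothetical joystick wiring: X axis on A0, Y axis on A1
const int X_pin = A0;
const int Y_pin = A1;

void setup() {
  Serial.begin(9600);   // must match the baud rate used in the Processing sketch
}

void loop() {
  // ...the reading, mapping, and Serial.print lines shown above go here...
  delay(100);           // short pause so Processing is not flooded with data
}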

Media Controller

Recitation

// IMA NYU Shanghai
// Interaction Lab
// For receiving multiple values from Arduino to Processing

/*
 * Based on the readStringUntil() example by Tom Igoe
 * https://processing.org/reference/libraries/serial/Serial_readStringUntil_.html
 */

import processing.serial.*;
PImage photo;

String myString = null;
Serial myPort;


int NUM_OF_VALUES = 2;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;      /** this array stores values from Arduino **/


void setup() {
  size(800,500);
  background(0);
  setupSerial();
  photo = loadImage("May.jpg");
}


void draw() {
  updateSerial();
  printArray(sensorValues);
  image(photo, 0, 0, width, height);
  //ellipse(sensorValues[0], sensorValues[1], 100,100);
  // use the values like this!
  // sensorValues[0] 

  // add your code

  //
}



void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 9600);
  // NOTE: Serial.list()[0] may not be the Arduino on your machine.
  // Check the list of ports printed above, find the port named
  // "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----",
  // and replace the index 0 above with that port's index number.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}


Reflection

I think my project was a very low-level version of anything discussed in Computer Vision for Artists and Designers. It feels like nothing in comparison, because I only did one simple thing with an image. If I had wanted to get closer to what they were doing, I should have worked with Capture (a minimal example is sketched below).
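For reference, here is a minimal sketch of how Capture from the Processing video library could be used; the default camera and sketch size are just placeholders, not something I tested in this project.

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);  // open the default camera at the sketch size
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();                            // grab the newest frame
  }
  image(cam, 0, 0);                        // draw the live camera feed
  // computer-vision style work (e.g. reading cam.pixels[]) would go here
}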

Serial Communication

Recitation


// IMA NYU Shanghai
// Interaction Lab
// For sending multiple values from Arduino to Processing


void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensor1 = analogRead(A0);
  int sensor2 = digitalRead(13);
  /*int sensor2 = analogRead(A1);
  int sensor3 = analogRead(A2);*/

  // keep this format: comma-separated values on one line, ending with a linefeed,
  // so the Processing sketch can split each line into NUM_OF_VALUES values
  Serial.print(sensor1);
  Serial.print(",");        // put comma between sensor values
  Serial.println(sensor2);  // linefeed after sending the last sensor value
  // too fast communication might cause some latency in Processing
  // this delay resolves the issue.
  delay(100);
}

// IMA NYU Shanghai
// Interaction Lab
// For receiving multiple values from Arduino to Processing

/*
 * Based on the readStringUntil() example by Tom Igoe
 * https://processing.org/reference/libraries/serial/Serial_readStringUntil_.html
 */

import processing.serial.*;
PImage photo;

String myString = null;
Serial myPort;


int NUM_OF_VALUES = 2;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;      /** this array stores values from Arduino **/


void setup() {
  size(800,500);
  background(0);
  setupSerial();
  photo = loadImage("May.jpg");
}


void draw() {
  updateSerial();
  printArray(sensorValues);
  image(photo, 0, 0, width, height);
  //ellipse(sensorValues[0], sensorValues[1], 100,100);
  // use the values like this!
  // sensorValues[0] 

  // add your code

  //
}



void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 9600);
  // NOTE: Serial.list()[0] may not be the Arduino on your machine.
  // Check the list of ports printed above, find the port named
  // "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----",
  // and replace the index 0 above with that port's index number.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Reflection

Before this, I had only worked with Processing or Arduino separately, so this was an interesting level of interaction between the two programs and very different from what I was used to. Most of the time, the only hardware I could use with Processing was the keyboard or the mouse, but in this case I was able to use other inputs, which was pretty cool.

Final Project: Final Blog Post (December 18, 2019) – Jackson McQueeney

Vocal Art – Jackson McQueeney – Marcela Godoy

CONCEPTION AND DESIGN:

After discussing our six ideas, my partner and I decided that "Vocal Art" would not only be the most realistic project to execute, but would also best fit our definitions of interaction. The conception of this project was simple, especially in the way we foresaw users interacting with it. Since the main component of the project was a microphone, we decided that its physical form should be an intuitively designed microphone.

Additionally, we decided to add three potentiometers, each controlling one RGB color value, for another layer of interactivity. Since the overarching project idea was simple, the design was simple as well: we housed the microphone and potentiometers in a laser-cut wooden box, with basic instructions and its function etched into the box. We tried to connect my partner's computer to a large TV screen in order to display the output to a large audience, but due to technical difficulties, we were not able to achieve this. At one point, we affixed a cardboard cone to the microphone in an attempt to better receive sound input, but the cone covered the instructions, so we ultimately removed it. One suggestion made after we finished and presented the project was to put colored caps on the potentiometers, each corresponding to the red, green, or blue value it controls, but since this suggestion came after our final presentation, we could not implement it.


Potentiometers and buttons soldered for the circuit


Complete circuit before being housed


Two angles of the Arduino circuit housed in the box

Top of the wooden housing (Showing instructions, buttons, and potentiometers)

Video of the Processing code running after multiple microphone inputs

FABRICATION AND PRODUCTION:

After the initial conception and design phase, we built our project. The most significant steps of the process were constructing the Arduino circuit, writing the code for Arduino and Processing, and digitally fabricating the project's physical components. In the last step, I believe the project could have been improved by fabricating a housing that elevates the microphone and by adding colored caps to the potentiometers.

Although I could not be present for the user testing session, my partner relayed some of the feedback the project received. Two major aspects of the project were developed from this feedback: the placement of the output circles on the Processing screen and the use of potentiometers to control their RGB values. Based on the feedback, we made the placement of the circles random and incorporated potentiometers to set their color. I believe these changes were effective, since they added more dimensions of interactivity while keeping the overall concept simple and concise (a rough sketch of the drawing logic follows).
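As an illustration of that drawing logic only, the snippet below is a Processing sketch with assumed variable names and value ranges, not our actual project code.

// Hypothetical helper, called once for each detected voice input.
// potR/potG/potB are raw potentiometer readings (0-1023) from Arduino,
// and volume is the microphone/sound-sensor reading for that input.
void drawVoiceCircle(int potR, int potG, int potB, int volume) {
  float r = map(potR, 0, 1023, 0, 255);           // potentiometers set the RGB color
  float g = map(potG, 0, 1023, 0, 255);
  float b = map(potB, 0, 1023, 0, 255);
  float diameter = map(volume, 0, 1023, 20, 200); // louder voices make bigger circles
  noStroke();
  fill(r, g, b);
  // random placement; because the background is never redrawn,
  // circles from previous users stay on screen
  ellipse(random(width), random(height), diameter, diameter);
}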

CONCLUSIONS:

The broad goal of this project was to facilitate interaction between multiple users and the project. Specifically, the goal was to translate the users’ voices into a visual output that preserved outputs from previous uses after each subsequent use. Through this course and an ever-evolving definition of interaction, I currently define interaction as: “the communication between two or more organic or machine actors, facilitated by an interchange of inputs and outputs between actors”.

I believe that our project aligns with this definition in that it required the inputs of many human actors to achieve its full effect in the form of its Processing output; this translation from input to output was facilitated by the sound sensor and the Arduino. The project does not align with my definition in that the inputs and outputs were not necessarily interchangeable, but I think that is a problem with my definition rather than with the project's interactivity. Based on this definition and my expectations of the audience's response, I believe the audience interacted with the project exactly as my partner and I expected: they provided an input in the form of their voice, and the project generated an output based on each individual user's volume.

Based on audience feedback after our final presentation, I believe a few ergonomic and intuitive improvements could be made to the physical design of the project, had we more time. These include using a more sensitive microphone to pick up more nuanced volume differences between users, elevating the box so that users do not have to bend over to speak into the microphone, using a different LED so that its color more accurately represents the color of the generated circles, and putting colored caps on the potentiometers so that users can more easily determine their function.

One idea that I thought would be interesting (though difficult) to attempt would be to implement some method of determining the user's mood and basing the output on that rather than on volume. Though this might be too advanced for our current ability, I think our project could still be improved by changing outputs based on pitch as well as volume, since pitch is easier to determine than mood. With the completion of this project and course, I feel more confident in my coding and circuit-building abilities with Arduino. I think the most consequential aspect of this project was its simplicity. Many other projects, both at midterms and finals, had a big overarching moral message or attempted to solve a major global issue, and in doing so lost some of their interactivity. I believe our project was successful because it stayed focused on the interactivity itself.