Recitation 8: Serial Communication by Nathalie White

During the recitation, I was only able to complete the first exercise. I will attempt the second in my own time. 

Here is the code I used in Processing:

// For receiving multiple values from Arduino to Processing
int previousx;
int previousy;

import processing.serial.*;

String myString = null;
Serial myPort;


int NUM_OF_VALUES = 2;   
int[] sensorValues;     

void setup() {
  size(500, 500);
  background(255);
  setupSerial();
}


void draw() {
  updateSerial();
  printArray(sensorValues);
 

 //ellipse (sensorValues[0],sensorValues[1],100,100);
 
 stroke(0);
 line(previousx,previousy, sensorValues[0],sensorValues[1]);
 
 previousx= sensorValues[0];
 previousy= sensorValues[1];
 
}

void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[ 2 ], 9600);
  
  myPort.clear();
  
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}

Here is the code I used in Arduino:

// the setup routine runs once when you press reset:
// the setup routine runs once when you press reset:
void setup() {
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
}

// the loop routine runs over and over again forever:
void loop() {
  // read the inputs on analog pins 0 and 1:
  int sensorValue1 = analogRead(A0);
  int sensorValue2 = analogRead(A1);

  // print out the values you read, mapped to the 500 x 500 canvas:
  Serial.print(map(sensorValue1, 0, 1023, 0, 500));
  Serial.print(",");
  Serial.print(map(sensorValue2, 0, 1023, 0, 500));
  Serial.println();

  delay(1); // delay in between reads for stability
}

This is the result:

It was not very precise, and it was very hard to draw with.

Recitation 8 Documentation

Exercise 1: Make a Processing Etch A Sketch

For exercise 1, we were to use serial communication between Arduino and Processing to send two analog values from Arduino to Processing, first building a circuit with two potentiometers that would control the x and y values of the Etch A Sketch. One potentiometer had to control the "x" value while the other controlled the "y" value, so that turning the potentiometers would create the drawing. Starting from the serial_multipleValues_AtoP file from class, I modified the code slightly to make sure the mapped values corresponded to the correct potentiometers. The circuit was fairly easy to assemble; the most challenging part was the connection between Arduino and Processing. I had trouble getting the line to be stable, but discovered it was due to loose wiring between the potentiometers on the breadboard.

Arduino Code:
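The Arduino code was embedded as an image in the original post. Below is a minimal sketch of what it likely contained, reading the two potentiometers on A0 and A1 and sending the mapped values as comma-separated text; the 0-800 output range is an assumption based on the 800 x 800 Processing canvas.

// Minimal sketch (assumed): read two potentiometers and send "x,y\n" to Processing
void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensor1 = analogRead(A0);   // x potentiometer
  int sensor2 = analogRead(A1);   // y potentiometer

  // map the 0-1023 readings onto the 800 x 800 Processing canvas (assumed range)
  Serial.print(map(sensor1, 0, 1023, 0, 800));
  Serial.print(",");
  Serial.println(map(sensor2, 0, 1023, 0, 800));

  delay(1);  // short delay between reads for stability
}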

Processing Code:

import processing.serial.*;

int NUM_OF_VALUES = 2;
int[] sensorValues;

int prevX;
int prevY;

String myString = null;
Serial myPort;

void setup() {
  size(800, 800);
  background(0);
  setupSerial();
}

void draw() {
  updateSerial();

  printArray(sensorValues);
  stroke(250, 250, 250);
  line(prevX, prevY, sensorValues[0], sensorValues[1]);
  prevY = sensorValues[1];
  prevX = sensorValues[0];
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[4], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----"
  // and replace PORT_INDEX above with the index number of the port.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}

void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Exercise 2: Make a musical instrument with Arduino

For exercise 2, we were to send values from Processing to Arduino to control a buzzer, which plays a sound while the mouse is pressed. We were to map the position of the mouse to the buzzer's tone. We could start from the multiple-values Processing-to-Arduino code from class, modifying it to set the duration and frequency of the tones.

import processing.serial.*;

Serial myPort;
int valueFromArduino;

int High;
int Med;
int Low;

void setup() {
  size(500, 500);
  background(0);

  printArray(Serial.list());
  // this prints out the list of all available serial ports on your computer.

  myPort = new Serial(this, Serial.list()[4], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----" 
  // and replace PORT_INDEX above with the index number of the port.
}


void draw() {
  // to send a value to the Arduino
  High = height;
  Med = 2*height/3;
  Low = height/3;
  if (mousePressed && mouseY > 0 && mouseY < Low) {
    myPort.write('L');
  } else if (mousePressed && mouseY > Low && mouseY < Med) {
    myPort.write('M');
  } else if (mousePressed && mouseY > Med && mouseY < High) {
    myPort.write('H');
  } else {
    myPort.write('N');
  }
  //if (mouseX > width/2) {
  //  myPort.write('H');
  //} else {
  //  myPort.write('L');
  //}
}
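The Arduino side of this exercise is not shown above. A minimal receiver sketch, assuming a buzzer on pin 9 and the single-character protocol the Processing code sends ('L', 'M', 'H', 'N'), might look like this; the pin number and frequencies are assumptions, not the code actually used.

// Minimal receiver sketch (assumed): play a low, medium, or high tone on a buzzer
// depending on the character sent from Processing.
const int buzzerPin = 9;   // assumed buzzer pin

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == 'L') {
      tone(buzzerPin, 262);   // low tone (frequency chosen for illustration)
    } else if (c == 'M') {
      tone(buzzerPin, 440);   // medium tone
    } else if (c == 'H') {
      tone(buzzerPin, 880);   // high tone
    } else if (c == 'N') {
      noTone(buzzerPin);      // mouse not pressed: silence
    }
  }
}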

 

 

Recitation 8 – Tao Wen

Exercise one:

Regarding the code, there is one aspect to notice: the map() function. Without it, the circle will go out of the frame.

Exercise one- Etch A Sketch:

The difference between this and the last step is that this one uses the line() function. I tried to build the line out of tiny ellipses, but not all of the trajectory would show up on the canvas. And when using line(), it is not enough just to connect (x1, y1) and (x2, y2); otherwise nothing would be drawn. You have to use the previous coordinate, just like we do with pmouseX and pmouseY.

Concerning the interaction experience, I don't like it at all, probably because I'm not used to drawing by thinking about horizontal and vertical coordinates. However, I think it's a useful way of algorithmic thinking when designing drawing-related projects.

Code that matters(Arduino):

void setup() {
  Serial.begin(9600);
}
 
void loop() {
  int sensor1 = analogRead(A0);
  int sensor2 = analogRead(A1);
  int mapped1 = map(sensor1, 0, 1023, 0, 500);
  int mapped2 = map(sensor2, 0, 1023, 0, 500);
  Serial.print(mapped1);
  Serial.print(",");
  Serial.print(mapped2);
  Serial.println();
}

Code that matters(Processing):

// p1 and p2 store the previous x and y coordinates (declared globally as ints)
void draw() {
  updateSerial();
  printArray(sensorValues);

  println("sensor1 =", sensorValues[0]);
  println("sensor2 =", sensorValues[1]);

  line(p1, p2, sensorValues[0], sensorValues[1]);
  p1 = sensorValues[0];
  p2 = sensorValues[1];
}

Exercise two:

The idea is to create a piano keyboard, which requires two variables: the on-and-off state and the tone, controlled by the mouse press and by mouseX respectively. Since I wasn't familiar with the coding, the interaction experience is not ideal. The five sounds, put together, are quite jarring to play. I would like to expand on this idea: the user can choose a music style (e.g. Japanese, Chinese pentatonic, Arabian), the tones of the keys will switch accordingly, and then the user (ideally an amateur) can explore and create their own melodies.

Code that matters (Processing):

// values[], sendSerialData(), and echoSerialData() come from the class
// example for sending multiple values from Processing to Arduino
void draw() {
  background(0);
  values[0] = pmouseX;
  if (mousePressed) {
    values[1] = 1;
  } else {
    values[1] = 0;
  }
  printArray(values);
  sendSerialData();
  echoSerialData(200);
}

Code that matters (Arduino):

int melody[] = {
  262, 349, 196, 440, 4186, 2093, 3136, 175
};
int freq;
 
void setup() {
  Serial.begin(9600);
  pinMode(13, OUTPUT);
}
 
void loop() {
  getSerialData();  // fills values[] with the data sent from Processing

  // map mouseX (0-500) onto the eight notes; constrain keeps the index in range
  int index = constrain(map(values[0], 0, 500, 0, 7), 0, 7);
  freq = melody[index];
  if (values[1] == 1) {
    tone(13, freq);
  } else {
    noTone(13);
  }
}

Final Project: Essay by Isabel Brack

Fragmented Faces

Our project is titled "Fragmented Faces" and aims to tackle the complex issue of identity and humans' connections with others.

 PROJECT STATEMENT OF PURPOSE 

Our project aims to demonstrate the complexity of human identity and how identity can be represented through faces, emotions, and expressions. It also aims to show how complex and disorienting understanding identity becomes as a person interacts with more and more people. The project entails using different images of people's faces, cutting the images into three horizontal sections, and then randomizing the different pieces to create new faces out of many mouths, eyes, and noses.

Our audience can be generalized to include everyone, as the project is focused on inclusion and understanding the identities around us, but the audience can also be a more focused group of people who are trying to understand the identities of those who surround them and who are putting effort into understanding the complexity of identity. This project was mostly inspired by a few art pieces: one, a video of an interactive simulation about identity crises and the complexity of self-identity, and the other focusing on the fragmentation and complexity of identity.

fragmented face, compilation face

Inspirational identity artwork

 PROJECT PLAN 

This project aims to create a dialogue and questions about identity and the connections people make based on identity, represented by light-up pressure pads, each with a hand silhouette showing the connection. The simulation or game entails the user standing in front of the Processing screen with 5-6 pressure pads, each with a hand print that lights up. The different handprints will light up in a sequence that gets faster and faster, and people will try to keep up with the lights, pressing their hand against the lit-up hand. Each time a button is pressed, the faces will randomize. At first, when the simulation is easy to follow, it will be easy to connect with each hand and see each identity change and create a new face, but as the simulation gets faster and more complex, people will be unable to keep up with each individual interaction and the identities appearing on the screen. Once the simulation becomes impossibly complex, all the hands will blink in unison and then turn off except the center hand, which will remain lit; once the user presses it, all the individual original images of the faces will appear together on the screen.

To complete this project we will first start with the Processing side and create a simple random face generator with one button and different images. I have already created the base code for that, which has three images of faces correctly proportioned but is not yet randomized to include multiple options. After we create the basic Processing code for random images, we will add multiple buttons to control the randomization of the faces. This will be the prototype for the simulation. Next we will create the LED lights flashing separately and simultaneously in Arduino, and after figuring the two pieces out separately we will combine the codes and add LEDs to the buttons, which we will build out of cardboard or wood and plastic so the LEDs can show the hands lighting up. We would like to finish the basic Processing code for the random faces this weekend, and add the serial communication and LED sequence by the end of next week. After the LEDs and the Processing animation controlled by buttons are combined, the work will mostly focus on creating the environment and setup of the game, including the hand buttons.

face randomizer code

The work-in-progress code for assembling random faces in processing.
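The actual work-in-progress code was embedded as an image in the original post. As a rough illustration only, a minimal sketch of one way a three-strip face randomizer could work in Processing is below; the image filenames, canvas size, and mouse-press trigger are placeholders, not our actual code.

// Minimal sketch: each face image is split into three horizontal strips
// (forehead/eyes, nose, mouth) and a random strip is drawn for each row.
PImage[] faces = new PImage[3];
int[] choice = new int[3];   // which face supplies each horizontal strip

void setup() {
  size(400, 600);
  for (int i = 0; i < faces.length; i++) {
    faces[i] = loadImage("face" + (i + 1) + ".jpg");  // hypothetical filenames
    faces[i].resize(width, height);                   // keep every face the same size
  }
  randomizeFace();
}

void draw() {
  int stripHeight = height / 3;
  for (int row = 0; row < 3; row++) {
    PImage pick = faces[choice[row]];
    // copy the matching horizontal band from the chosen face onto the canvas
    copy(pick, 0, row * stripHeight, width, stripHeight,
         0, row * stripHeight, width, stripHeight);
  }
}

void mousePressed() {
  // stands in for the physical button: each press re-randomizes the face
  randomizeFace();
}

void randomizeFace() {
  for (int row = 0; row < 3; row++) {
    choice[row] = int(random(faces.length));
  }
}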

light up hands

Examples of the light-up hand prints that will also serve as buttons/pressure plates to cause the faces to change.
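As a rough sketch of the accelerating light sequence described in the plan above, a minimal Arduino loop might look like the following; the pin numbers, number of pads, and timing values are placeholders rather than our final design.

// Minimal sketch of an accelerating LED sequence (pins and timing are assumptions)
const int ledPins[] = {3, 5, 6, 9, 10, 11};   // one LED per hand pad (assumed)
const int numLeds = 6;
int interval = 1000;         // time each hand stays lit, in ms
const int minInterval = 150; // fastest the sequence is allowed to get

void setup() {
  for (int i = 0; i < numLeds; i++) {
    pinMode(ledPins[i], OUTPUT);
  }
}

void loop() {
  // light one random hand at a time, then speed up the sequence
  int lit = random(numLeds);
  digitalWrite(ledPins[lit], HIGH);
  delay(interval);
  digitalWrite(ledPins[lit], LOW);

  if (interval > minInterval) {
    interval -= 50;          // the sequence gets faster and faster
  }
}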

CONTEXT AND SIGNIFICANCE 

Originally I researched projects on light and sound design, including a storytelling animated wall. That research led me to think about the animation in Processing, which has gone through different variations but ultimately involves fragmented pieces coming together to create a whole. Originally, I wanted to animate a process I had seen artists do in still form: people's faces breaking into small pixel-like units as the art piece becomes more and more abstract. But after talking it over, an easier and more straightforward animation to represent this fragmentation creating a whole would be to use three horizontal panels to create a whole image. Our project was more influenced by non-interactive art pieces that represented identity, as it was harder to find interactive art exhibits that focus on issues of identity and human connection. Our project was also in part influenced by the Piano Dancer project proposal I suggested, as our user interface will be quite similar, with light-up hand prints acting as buttons, similar to the light-up piano keys.

This project follows our expanded definition of interaction beyond the simple understanding of two actors communicating through input, processing, and output as discussed in What Exactly is Interactivity?. Interaction involves the entire experience, including people thinking and possibly creating a dialogue about a project. As I researched, I also drew on ideas like those from TATE: interactive art was also a method for artists to make connections with the environment and their audience, enhancing both the interactive elements coming from the audience and those coming from the machine/art piece. Each art piece is created by an artist who has an intention for their piece and the audience, whether that is completely understood and achieved or not. Combining all the definitions of interaction researched, a successful interactive experience consists of two actors who communicate using a series of inputs, processing, and outputs; however, the overall experience created enhances the interaction, including pushing the audience to think differently and more specifically about an issue or thoughtful concept. Every interactive experience exists at a different level of interaction. Some interactions are simple exchanges between two parties that meet the basic requirements of input, processing, and output, and some are closer to human interactions that are always changing and evolving in response to each actor's last action, including physical but also complex mental interaction.

With our midterm project and now also the final project "Fragmented Faces", we hope to push our audience's interaction to a more complex level, including contemplation of the work and its meaning, to hopefully create a dialogue or thought process about what identity means and how easy it is to stop understanding others' identities the more interactions and connections you have with people. The confusion of the hands and the light/button sequence represents the chaos of understanding identity. Hopefully, this project's uniqueness will come from creating a new dialogue or prompting people to think about identity and the connections they make with people. Although the face swap and randomizer has been done before in other projects, we have centered our focus on what identity is and what it means to connect with people.
We added the simulation/game of light-up hands to create a more complex and ever-changing interaction between the audience and the project. Many of the projects we looked at when researching identity artwork showed the fragmentation and confusion of self-identity and of others, and much identity artwork is left up to interpretation. Our project creates an interactive and changing interface for people to engage with a changing and confusing identity, building off of that artwork around identity. I especially appreciated the chaos and complexity of self-identity represented in the simulation mentioned and linked above. Our main goal with the completed project is to spark either physical conversation and dialogue or internal thought and contemplation about what identity means to different people; with how chaotic life is and how many people one meets, we don't truly understand the identity of most people because we don't take the time to think about each individual. This project could in the future be built upon to further explore and complicate the concept of self-identity if it also involved capturing people's faces and including them in the random face generator. In particular, our message fits an audience of people who are interested in exploring the concept of identity and understanding the people around them, so they can better talk and interact with others in the future. This goal especially fits the mindset of many NYUSH students, who come to this school to make connections and meet people from all over the world with very different backgrounds. But the longer these students stay complacent talking to the same people from the same backgrounds, the less they think about people from different backgrounds with different identities.

Final project essay – Katie

  1. PROJECT TITLE

Reconsider Human-Nature Relationship

PROJECT STATEMENT OF PURPOSE *

This project aims to address the question of the "human-nature" relationship. Our insights come first from an artwork in our school's gallery called the 72-relations with a goldenrod, and second from an internet interactive art project called "way to go". These two works narrowed our idea and made it tangible. Our project provides an experience for users to explore different ways that one can interact with nature and then reflect back on their own behavior in daily life.

PROJECT PLAN *

Our project is a screen-based VR experience. Users interact physically with multiple sensors, which results in different visuals on the screen. We will gather our footage by Nov. 23 and edit it by Nov. 24. On Nov. 25-27, we will start to figure out the code in Arduino and Processing with the help of professors and fellows as well as online tutorials. We need to know, first, what kinds of sensors we need in terms of the "relation between humans and nature"; second, how to add animations onto a pre-shot video; and third, how to switch between the different scenes we've shot. On Nov. 28-30, the Thanksgiving holiday, since we will already have a basic structure in our code, we will try to modify it on our own. By Dec. 2, we will manage to run through the whole project and make Arduino and Processing communicate with each other. For the rest of the week, we will decide which parts we want to build with fabrication tools, as well as do user testing and further improvements.

CONTEXT AND SIGNIFICANCE *

In our preparatory research and analysis, we defined interaction as a kind of conversation in which two or more actors are involved. They listen (receive the information), think (process the information), and speak (give out the processed information). In the projects we've looked at, two things inspired us. First is how to trigger the user's motivation to interact. We found that users are more likely to interact when they see themselves on the screen (for example, through mirrors). So, we decided that during the transitions between scenes, the users can see themselves through a webcam in our project. The second thing is that when more people interact together, it gets more fun. So, with different sensors, our project lets multiple users explore together. After successful completion, our project will lead the users to think about their relationship with nature and to reflect on their everyday behavior. The idea of this project is inspired by our humanities course, whose topic is "the question of the Anthropocene". We have begun to think about how we can better coexist with nature and keep the harmony between humans and the environment. I guess this is the biggest significance of our project. This project aims at users of all ages, because the question of humans and nature is a question that involves all of us.
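As a small illustration of the webcam idea described above, a minimal Processing sketch using the video library might look like this; the window size is a placeholder, and this is only a sketch of the idea, not our final code.

// Minimal sketch: show the user on screen through the webcam,
// as we plan to do during the transitions between scenes.
// Requires the Processing Video library to be installed.
import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  // draw the live camera image so users see themselves during the transition
  image(cam, 0, 0, width, height);
}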