Recitation 9: Media Controller (Katie)

I just went to my favorite singer's concert, and I thought the stage lighting would suit the tint() function well. I set the blue and green values to random and used one potentiometer to control the red value. An important step is to map the values from the potentiometer to the range (0, 255).

This is the final result:

This is my processing code:

import processing.serial.*;
import processing.video.*;
Movie myMovie;

String myString = null;
Serial myPort;


int NUM_OF_VALUES = 1;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;      /** this array stores values from Arduino **/


void setup() {
  size(1280, 1080);
  //myMovie = new Movie(this, "dancing.mp4");
  myMovie = new Movie(this, "hcy.MOV");
  myMovie.play();
  setupSerial();
  
}


void draw() {
  updateSerial();
  printArray(sensorValues);

  float hcy = map(sensorValues[0], 0, 1023, 0, 255);

  // use the values like this!
  // sensorValues[0]

  if (myMovie.available()) {
    myMovie.read();
  }
  tint(hcy, random(255), random(255));
  image(myMovie, 0, 0);
}



void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[ 1 ], 9600);
  // WARNING!
  // You will definitely get an error here.
  // Change the PORT_INDEX to 0 and try running it again.
  // And then, check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----" 
  // and replace PORT_INDEX above with the index number of the port.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}





This is my Arduino code:

[code] // IMA NYU Shanghai
// Interaction Lab
// For sending multiple values from Arduino to Processing

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensor1 = analogRead(A0);
  //int sensor2 = analogRead(A1);
  //int sensor3 = analogRead(A2);

  // keep this format
  Serial.print(sensor1);
  //Serial.print(","); // put a comma between sensor values
  //Serial.print(sensor2);
  //Serial.print(",");
  //Serial.print(sensor3);
  Serial.println(); // add a linefeed after sending the last sensor value

  // too fast communication might cause some latency in Processing;
  // this delay resolves the issue
  delay(100);
}
[/code]

Reflection about the ways technology was used in your project:

After reading the texts, I realized that technology is just one approach to convey my meaning in the project and to enhance the interactive experience. Technology itself is not the main part of the project; the most important things are, first, the message behind it and, second, the user's experience.

The text introduces Rafael Lozano-Hemmer's installation Standards and Double Standards (2004). I like this project because, first, it involves full-body movement, and second, it is thought-provoking. Are those belts representative of humans? The first time I encountered it, I was not impressed by how fancy the technology in the installation was, but was instead led to think about the deeper meaning behind the project.

In our final project, we also want users to have full-body interaction, so we plan to use multiple sensors. We also want users to think about the relationship between humans and nature while experiencing the project.

Recitation 9: Controlling Media by ChangZhen from Inmi’s Session

Preview of project:

The player draws a picture by turning potentiometers. Arduino analogReads the two potentiometer values and sends them over serial to Processing, which uses them as X and Y coordinates. Processing loads an image, JuHunKim.jpg, samples colors at random offsets around (X, Y) of the image, and draws circles of random size in those colors onto the canvas at the corresponding coordinates. Moreover, the player can only draw on the left half of the canvas, since the right half is mirrored from the left.

Arduino:

void setup() {
  Serial.begin(9600);
  pinMode(A0, INPUT);
  pinMode(A1, INPUT);
}

void loop() {
  int sensor1 = analogRead(A0);
  int sensor2 = analogRead(A1);
  Serial.print(sensor1);
  Serial.print(",");
  Serial.println(sensor2);
}

Processing:

import processing.serial.*;
String myString = null;
Serial myPort;
int[] sensorVal = new int[2];
int x;
int y;

PImage img;

void setup() {
  printArray(Serial.list());
  size(870, 674);
  noStroke();
  img = loadImage("JuHunKim.jpg");
  setupSerial();
}

void draw() {
  updateSerial();
  printArray(sensorVal);
  x = sensorVal[0]*435/1023;
  y = sensorVal[1]*674/1023;
  for (int i=0; i<50; i++) {
    int var = int(random(-50, -1));
    int dev = int(random(-50, 50));
    color c = img.get(x+var, y+dev);
    fill(c);
    int diam = int(random(1, 10));
    circle(x+var, y+dev, diam);
    circle(870-x-var, y+dev, diam);
  }
}

void mousePressed() {
  saveFrame("JuHunna.png");
}

void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 );  // 10 = '\n', linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == 2) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorVal[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[ 3 ], 9600);
  myPort.clear();
  myString = myPort.readStringUntil( 10 );
  myString = null;
}

Electrical Diagram:

Final Effect:

The player may press the mouse to save the drawn image as JuHunna.png, a higher-order art piece derived from 김주훈's model photo.

Recitation 9: Media Controller by Ian (You Xu)

During this recitation, I intended to make a form of distortion of the live camera capture. Therefore, I referred to the capture sample code and the Processing website, and hooked up a potentiometer to achieve this goal.

  1. Green tint and low frame rate

When I tried to run the capture sample code, the screen had a green tint and refreshed very slowly.

Green tint

Since the same code usually works on other people's devices, I was very confused. I thought it might be because my computer, a Surface Pro, has multiple cameras that serve different functions. I referred to the Processing reference for Capture, https://processing.org/reference/libraries/video/Capture.html, and displayed all the available cameras to choose from.

cameras

However, I tried nearly all of them, and none worked. I looked it up online and found an issue in Processing's GitHub repository: https://github.com/processing/processing-video/issues/72. Many people using a Surface with Windows 10 encounter the same problem, and there is no solution yet; they believe it is a software problem of Windows 10 on the Surface. Therefore, for now, I cannot fix it.

  2. Moving

By applying the capture sample code to the serial communication sample code, I easily got the data from the potentiometer into Processing and used it to adjust the position of the capture. I used this step as a test of combining the video functions with serial communication. Again, one thing I noticed is that I need to be careful about refreshing the background in the draw loop. This time, I did not want the previous images to remain on the screen, so I decided to draw the background in every loop.

  3. Scaling

I checked the capture mirror sample code and found that it uses the scale() function to create the mirror effect. That gave me the insight to use the input from the potentiometer to control the scaling factor. At first, I simply added "scale(sensorValues[0]);" to the code. However, I found that the data from the potentiometer ranges from 0 to 1023, which could scale the capture over 100 times bigger, far too large.

Too big Scaling

Then I realized I needed to map it into a reasonable range first. I chose 0 to 2, where 0 to 1 shrinks the capture and 1 to 2 enlarges it.
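That remapping can be sketched in plain Java (the map() helper below reproduces the formula of Processing's built-in map(); the 0 to 2 output range follows the text, but the exact call is my assumption):

```java
public class ScaleMap {
    // linear remap, same formula as Processing's built-in map()
    static float map(float v, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (outMax - outMin) * (v - inMin) / (inMax - inMin);
    }

    public static void main(String[] args) {
        // a potentiometer midpoint (about 512) lands near a scale of 1.0
        System.out.println(map(512, 0, 1023, 0, 2));   // about 1.001
        System.out.println(map(0, 0, 1023, 0, 2));     // 0.0
        System.out.println(map(1023, 0, 1023, 0, 2));  // 2.0
    }
}
```

In the sketch itself this would simply be `scale(map(sensorValues[0], 0, 1023, 0, 2));`.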

  4. Translate

However, this was still not what I expected: I wanted the capture to always stay in the middle of the screen. Then I recalled another function, translate(), which I had used before to define a new origin. This required some math, which I worked out geometrically.
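The geometry behind that centering can be sketched as follows (plain Java; the 1280 and 640 pixel sizes are hypothetical, since the post does not state the canvas or capture dimensions). The offset that keeps content of width w, scaled by s, centered on a screen of width W is (W - w*s)/2:

```java
public class CenterOffset {
    // offset that keeps content of size `content`, scaled by `s`,
    // centered on a screen of size `screen`
    static float centerOffset(float screen, float content, float s) {
        return (screen - content * s) / 2f;
    }

    public static void main(String[] args) {
        // a 640-px-wide capture at scale 2 on a 1280-px canvas needs no offset
        System.out.println(centerOffset(1280, 640, 2.0f)); // 0.0
        // at scale 1 it sits 320 px in from the left edge
        System.out.println(centerOffset(1280, 640, 1.0f)); // 320.0
    }
}
```

In a Processing sketch this would translate to something like `translate(centerOffset(width, cam.width, s), centerOffset(height, cam.height, s)); scale(s); image(cam, 0, 0);`.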

  5. Two dimensions

It works well. So the final step was to hook up another potentiometer and have one of them scale the x-axis and the other scale the y-axis.

2 potentiometer

Reflection

In the article "Computer Vision for Artists and Designers: Pedagogic Tools and Techniques for Novice Programmers," Levin describes "new practitioners with an abundance of new application ideas, and the incorporation of computer vision techniques into the design vocabularies of novel artworks, games, home automation systems, and other areas." I found two projects, Cheese and LIMBOTIME, that use the capture function to build an authentic, interactive experience in which users can see themselves. Therefore, I think using capture in an interactive project can increase inclusiveness, inviting the audience to engage and communicate with the project, which corresponds to my definition of interaction as constant impact within an inclusive communication environment. My work in this recitation would entertain the audience: they can see their distorted face in whatever dimension by interacting with the potentiometers. The only pity is the green-tint bug that I was unable to fix.

Code: https://github.com/xuyou1999/InterLab_Fall_19/tree/master/Recitation_9_Code

Works Cited

Levin, Golan. “Computer vision for artists and designers: pedagogic tools and techniques for novice programmers.” AI & SOCIETY 20.4 (2006): 462-482.

Final Project Essay–Vivien Hao

Our Planet

From the research Vivien did for the first stage, she realized that unsustainability and climate change are among the most vital issues the United Nations has addressed lately. Vivien wants to work on something that illustrates the outcomes of unsustainability, but in the project we would also like to show how humans can participate in lessening its harms. Ian is highly interested in doing something that requires collaboration among people. He proposed encouraging four people to make movements simultaneously for the project to function: if only one person participates, the project provides no outcome, which encourages collaboration among people, even among strangers. We are both highly interested in each other's ideas, and we worked together to combine them into one doable concept. We figured out that we want to create a project that requires movement from several people to lessen the unsustainability issue. For the movement part, we want to make it very specific: the participants need to physically use a shovel and plant trees to push back the time of the earth's explosion. The faster they plant trees, the longer the earth stays away from the explosion. However, if a participant chooses to work alone, there is no way for them to be fast enough to beat the accumulated speed of several people; we would set the number in our code ridiculously high to make sure no individual could accomplish this goal alone. In this project, we want to raise people's awareness of the harmful outcomes of unsustainability, but we would also like to highlight the positive effect we can bring forth through collaboration. Our target audience is everyone on this planet. After all, the earth is everyone's planet, and we all have to take part in solving the unsustainability issue.
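A minimal sketch of that cooperative mechanic in plain Java (every name and number here is a hypothetical placeholder, since the essay has not fixed any values yet): the combined planting rate only pushes back the countdown when it clears a threshold set above what any single player can reach.

```java
public class PlantingThreshold {
    // threshold deliberately set above any single player's maximum rate (hypothetical values)
    static final int REQUIRED_RATE = 100;
    static final int MAX_SOLO_RATE = 40;

    // the explosion is only pushed back when the combined rate clears the bar
    static boolean pushesBackExplosion(int[] playerRates) {
        int total = 0;
        for (int r : playerRates) {
            total += Math.min(r, MAX_SOLO_RATE);  // cap each player at their physical limit
        }
        return total >= REQUIRED_RATE;
    }

    public static void main(String[] args) {
        System.out.println(pushesBackExplosion(new int[]{40}));          // false: one player is never enough
        System.out.println(pushesBackExplosion(new int[]{40, 35, 30}));  // true: collaboration succeeds
    }
}
```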

         We need to pin down the details of the design by prototyping and testing with users; user testing should be part of every step. The details include: whether to include a mask in our project; what types of sensors to use; what materials to use; what the user interface will look like; and the sound, start and end, layout, and position. We will test the comfort and cleanliness of the mask, and we aim to finish discussing all the details and have a blueprint by Sunday, Nov. 24. Next, we need to build a circuit and a program that are both sound and humanized. This will be very practical work, and we may encounter many problems; we will consult online references and ask for help. Once we finish the user interface, we will have someone test it to make sure our initial design reads clearly to the audience, to be finished before Dec. 1. We also need to get the materials and build the basic functional prototype; this is practical work as well, and we only need to make sure it can support the project. It can be done by Dec. 1 too. Then we need to test the accuracy of counting the shovel movements and the breathing, along with bug fixing. This process needs constant revision and testing among ourselves and our users; we will list all the bugs and fix them one by one until the project works exactly as intended, by Dec. 6. Finally, we will beautify the project and improve the user experience and accessibility: decorate everything, hide the wires, and test the projection. After we think it works, we will invite many groups of people to interact with it, and based on their feedback we will make small changes to keep it user-friendly and accessible for as many people as possible. This should be done by Dec. 8.

Final Project Essay by Barry Wang

Final Project Essay – Aircraft War
A. PROJECT STATEMENT OF PURPOSE

The final project we would like to create is an aircraft shooting game called Aircraft War, in which the user controls an aircraft and attacks enemies to reach a high score or finish a specific task. We would like to create a strong sense of participation for players who enjoy this kind of aerospace shooting game, and for all users who simply want to enjoy the thrill of destroying and winning.

Preparatory Research:

Sky Force Reloaded

This is a popular game that runs across platforms, from PC and mobile devices to the Nintendo Switch. It is also the game we would like to re-create. But the flaw of this game is that, though it is a thrilling flying and shooting game, the player always controls it with a joystick, buttons, or a keyboard. Here, another research project enlightened us.

Kinect Flying Game

We would like to strengthen this experience by introducing a higher level of interaction, built on the whole body: the user controls the banking of the aircraft, the shooting actions, and other features by moving their body. Flight lovers have always wanted the experience of flying; though we cannot make it fully realistic, we will still try to improve that experience. This way of controlling adds fun and challenge to the game, which we believe will serve players better. Though Kinect and Leap Motion are banned, we will try to figure out another way.

Besides, we would like to add some new elements to the game. Most shooting games are just composed of different levels for the user to pass, while we would like to add a story, so that the user not only gets an experience from the achievements made in the game but also follows a narrative. By doing so, we would like the game to become a medium that conveys emotions other than joy: perhaps sorrow, relief, or being moved.

B. PROJECT PLAN

The plan goes in parallel for me and Joseph. Joseph will try to write a story that fits the theme of the game and prepare pictures, video, and other forms of media. The preparation of the media can last longer, but we would like to finish the basic frame of the story by the end of next week. On my side, I need to figure out a substitute for Kinect. I need to check the accelerometer, gadgeteer, and distance sensors, and see whether some combination of these sensors can create a good user experience. This should also be finished by the end of next week. In the following week, we will work together to finish the coding and fabrication process. We are going to help each other and cooperate, but basically I will be in charge of the coding while he focuses on fabrication. Since we are considering the aesthetic value of our project, it will also take some time to polish it. In the last week, we will make final adjustments to the project according to feedback from users and other new ideas.

C. CONTEXT AND SIGNIFICANCE

I would like to restate my definition of interaction: a successful interaction should be continuous, interesting, and strongly user-involved. The most significant impact of the preparatory research is that we need to give users a continuous and strong interaction experience. To make the interaction continuous, we decided to make a game, so that the user constantly focuses on the input and output of the interaction. To make the game interesting, we tried to add different elements to it. To make the experience stronger, we believe the game should be controlled by the whole body instead of just fingers and keys. It is these ideas that guide us to create a unique experience that users cannot get from a similar game. The unique points we would like to realize are an improved, strongly user-involved, body-controlled game experience and the story mode, which can bring users other feelings and emotions while playing. If this project works successfully, more improvements can be made, such as using a better motion-capture system or a relatively advanced game engine (like Unity) to add complicated features and cool effects, so that the user experience improves in a deep and comprehensive way.