
Celia's Documentation Blog

NYU Shanghai IMA

Interaction Lab

Final Project: Individual Reflection

May 16, 2021 by Celia Forster

ENGAGE YOUR SENSES – CELIA FORSTER – COSSOVICH

CONCEPTION AND DESIGN:

I recall visiting a science museum when I was younger and being enthralled by an exhibit that relied on the user’s sense of smell to have its desired effect. As someone with a particularly strong sense of smell, my daily life always seems to involve it in some way. So while having a conversation with a friend of mine who suffers from colorblindness, my mind immediately went back to my fascination with scents. In my preliminary research for the final project, I discovered some art installations that involve scent, most notably ‘While nothing Happens’ by Ernesto Neto. Seeing this made me realize that although harnessing scents in a project may be difficult, it would not be impossible. These moments were the inspiration behind the concept of Engage Your Senses. 

A major part of designing Engage Your Senses was making sure it accomplished the task of blowing a scent at the user. Because scent is relatively hard to control, there were many doubts as to how to make this possible. I decided on using fans that activate when a button is pressed for the selected color. During user testing, the main concern was that the opening for the scent to exit was not at an accessible height or angle for users to experience it. This prompted the final design to feature tall columns that held the fans at a greater height. Another concern was that using so many different scents at once would overwhelm the user, who would lose the ability to differentiate between the scents. After doing some research, I learned more about the use of coffee beans as an olfactory palate cleanser, which inspired the addition of an “eraser” function that simultaneously blows the aroma of coffee towards the user. I think the adaptations made after user testing reflected constructive feedback and brought about a more effective and user-friendly final design.

FABRICATION AND PRODUCTION:

The original design of Engage Your Senses was not far off from the final design, but it underwent some extreme changes along the way. One major decision during the fabrication process was the question of how to successfully carry an aroma to the user. I originally intended to use small DC motors and fans to push the scent to the user as they drew with a selected color. But doubting the effectiveness of these fans, I changed my plan to use a small perfume bottle and a servo motor to spray a mist at the user. While this would have certainly been effective, it proved unworkable, as the small servo motors we had were not powerful enough. This took me back to my original design of using fans, and it proved successful in the end. After switching the setup from Processing-to-Arduino communication to Arduino-to-Processing communication, the implementation of buttons was necessary to control color selection. I decided that making a box with buttons would be the most user-friendly way of doing this. One thing I wish I had changed was making the buttons act as a toggle switch. In my final design, the user had to hold down the button to make the fan and drawing functions work. This seemed to confuse many users during the final presentation, and in hindsight should have been changed.
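In hindsight, that toggle behavior is just a small state machine: latch the output on the press edge instead of mirroring the raw button reading. A minimal sketch of the logic in plain Java (the class and method names here are my own, not from our actual code):

```java
// Toggle-button logic: flip a latched state once per press
// (on the rising edge of the reading), rather than only while held.
class Toggle {
    private boolean lastReading = false; // previous raw reading
    private boolean state = false;       // latched output

    // Feed one debounced reading; returns the latched state.
    boolean update(boolean reading) {
        if (reading && !lastReading) {
            state = !state; // flip only on the press edge
        }
        lastReading = reading;
        return state;
    }
}
```

On the Arduino side, the equivalent would be feeding each debounced digitalRead() into an update step like this and driving the fan from the latched state, so one press turns the scent on and a second press turns it off.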

Another major fabrication component of Engage Your Senses was how to create strong enough scents. My first instinct was to use essential oils in some way, as they are very concentrated liquid aromas, and this ended up working quite well when spraying a cotton pad and positioning it in the fan’s path. Overall, the fabrication of this project was not overly complicated. I did not end up using any sensors, as the buttons were the main control component paired with a wireless mouse to draw on the screen. The fabrication tasks were very evenly divided between myself and my partner, Derek. I was mainly responsible for writing the code and designing the visual aesthetics of the project. Derek worked hard to build a functioning circuit and utilized the laser cutter to create wooden boxes that would create a stable structure for the project. Because we have different strengths, we were able to work well together to construct a fully functional project. 

 

SKETCHES

Very rough sketches of the overall design

 

CONCLUSIONS:

The main goal for this project was to make a creative, interactive device that would let anyone involve another sense in a basic process for fun, but that would also be of use to people who cannot differentiate colors. I think that through my hard work and consideration of others’ suggestions, I achieved this goal by creating a successful project. My expectation when making this project was to see users drawing on the screen while also experiencing interesting scents. I hoped that this combination of senses would be something new, exciting, and fresh. When watching users interact with Engage Your Senses during the final presentation and during the IMA show, I was delighted to see people genuinely enjoying the time they spent with my project. Many users had smiles on their faces as they experimented with the project and discovered how it worked. As the semester concludes, I think the definition of interaction I used throughout the semester truly fits the way users engage with Engage Your Senses. Without the user’s selection of colors or movement of the mouse, nothing would happen. It requires the user to physically interact with various components to truly demonstrate its features. If given more time, the main thing I would do is develop the drawing component to have more of an objective. Many users seemed confused as to what to draw, and while I thought just “drawing for fun” was a valid activity, I now think I would have made it more of a game if I had more time.

The process of making this project also presented many challenges, which taught me the value of patience. Sometimes silly errors occur, and rather than become overwhelmed, it might just take adjusting the position of a wire to return a circuit to its functioning state. Another thing I learned about myself in the design and production process was that I can be slightly uptight. Working with a partner, I learned the importance of compromise and of understanding that perfection is not the most important thing in the end. Because this project required many hours of work, seeing its successful completion made me appreciate my own hard work and that of my partner. While it is a relief that it is completed, I really did find enjoyment in its creation.

I think this project truly lived up to my expectations and, by going outside the box of traditional creativity, presents a fresh and exciting alternative to traditional digital drawing interfaces. By combining different human senses to enhance the user experience, I can see this concept being developed and replicated in the future for more inclusive product design. Whether a user has a visual or hearing impairment, I think utilizing things like scent could allow everyone to feel included and still enjoy simple experiences that are so accessible to able-bodied people.

TECHNICAL DOCUMENTATION:

ARDUINO:

// IMA NYU Shanghai
// Interaction Lab
// For sending multiple values from Arduino to Processing


void setup() {
  Serial.begin(9600);
}

void loop() {
  // to send values to Processing assign the values you want to send
  //this is an example
  int sensor1 = analogRead(A0);
  int sensor2 = analogRead(A1);
  int sensor3 = analogRead(A2);
  int sensor4 = analogRead(A3);

  // send the values keeping this format
  Serial.print(sensor1);
  Serial.print(",");  // put comma between sensor values
  Serial.print(sensor2);
  Serial.print(",");  // put comma between sensor values
  Serial.print(sensor3);
    Serial.print(",");  // put comma between sensor values
  Serial.print(sensor4);
  Serial.println(); // add linefeed after sending the last sensor value

  // too fast communication might cause some latency in Processing
  // this delay resolves the issue.
  delay(100);

  // end of example sending values
}


PROCESSING:
// IMA NYU Shanghai
// Interaction Lab
// For receiving multiple values from Arduino to Processing

/*
 * Based on the readStringUntil() example by Tom Igoe
 * https://processing.org/reference/libraries/serial/Serial_readStringUntil_.html
 */

import processing.serial.*;
import processing.sound.*;

PImage photo;

int NUM_OF_VALUES = 4;   // number of values sent by the Arduino
int[] sensorValues;

String myString = null;
Serial myPort;

void setup() {
  fullScreen();
  strokeWeight(30);
  noStroke();

  photo = loadImage("final backgrounda.jpg");
  image(photo, 0, 0, width, height);

  setupSerial();
}



void draw() {
  getSerialData();
  printArray(sensorValues);

  if (sensorValues[3] > 100) { // A3: coffee "eraser" resets the canvas
    image(photo, 0, 0, width, height);
  }

  // Check the multi-button combinations before the single colors;
  // otherwise the single-color branches always match first and the
  // combined-color branches can never be reached.
  if (sensorValues[0] > 400 && sensorValues[1] > 400 && sensorValues[2] > 400) {
    setBrush(color(214, 138, 144)); // all three scents
  } else if (sensorValues[0] > 400 && sensorValues[1] > 400) {
    setBrush(color(193, 108, 187)); // strawberry + lavender
  } else if (sensorValues[1] > 400 && sensorValues[2] > 400) {
    setBrush(color(191, 144, 151)); // lavender + orange
  } else if (sensorValues[0] > 400 && sensorValues[2] > 400) {
    setBrush(color(232, 121, 77));  // strawberry + orange
  } else if (sensorValues[1] > 400) {
    setBrush(color(169, 121, 211)); // A1: lavender
  } else if (sensorValues[0] > 400) {
    setBrush(color(226, 109, 109)); // A0: strawberry
  } else if (sensorValues[2] > 400) {
    setBrush(color(247, 174, 77));  // A2: orange
  } else {
    fill(255);                      // no color selected: white indicator
    noStroke();
    circle(1126, 462, 75);
  }

  delay(100); // match the Arduino's send rate

  // only draw inside the canvas area of the background image
  if (mouseY > 545 && mouseY < 1300 && mouseX > 295 && mouseX < 1950) {
    line(mouseX, mouseY, pmouseX, pmouseY);
  }
}

// Show the selected color in the indicator circle and set it as the
// stroke color used for drawing.
void setBrush(color co) {
  noStroke();
  fill(co);
  circle(1126, 462, 75);
  noFill();
  stroke(co);
}




void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 9600);

  myPort.clear();
  myString = myPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}



void getSerialData() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n'  Linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}


 

The setup at the IMA show
Users trying out the project at the IMA show!
Users trying out the project at the IMA show!
The setup during final presentations
Laser cutting boxes!
Experimenting with 3D printing a servo attachment to trigger a mist bottle (not successful)
The design and setup during user testing

 

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/05/vid1.mp4
https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/05/us.mp4
A diagram of the circuit; the piezo discs are in place of the buttons
 

Filed Under: Interaction Lab

05.04.2021 In-Class Exercise: Pixels!

May 5, 2021 by Celia Forster

In class, we built on our previous understanding of adding images to Processing by learning more about pixels. After understanding what exactly a pixel is, we saw different ways that pixels can be individually manipulated to change the overall appearance of an image. This can be done with arrays that take each pixel and alter it in some way within seconds, no matter how large the image. Finally, we learned ways to manipulate pixels in real time using the webcam. The in-class exercise was to load the array of pixels from a captured image and manipulate the pixels depending on each pixel’s x or y value. I decided to just play around with different effects and see the results I could create. I did this by manipulating the color values of the pixels: I kept each pixel’s red and blue values unchanged, but used a random number up to 20 to determine the green value, with more green as the x-value increases. This resulted in many green pixels glittering on the screen because of the fast frame rate. Although it was a rather abstract effect, I thought it looked very interesting.
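The core of the effect can be sketched outside Processing as plain arithmetic on a packed-ARGB pixel array. The exact way the green term grows with x below is my guess at reproducing the effect, not the original sketch:

```java
import java.util.Random;

// One reading of the effect: keep each pixel's red and blue channels,
// and replace green with a random value up to 20 that gets boosted
// toward the right edge of the image (the boost formula is a guess).
class GreenGlitter {
    static int[] apply(int[] pixels, int width, Random rng) {
        int[] out = new int[pixels.length];
        for (int i = 0; i < pixels.length; i++) {
            int x = i % width;                 // column of this pixel
            int a = (pixels[i] >>> 24) & 0xFF;
            int r = (pixels[i] >> 16) & 0xFF;
            int b = pixels[i] & 0xFF;
            int g = Math.min(255, rng.nextInt(20) + x * 200 / width);
            out[i] = (a << 24) | (r << 16) | (g << 8) | b;
        }
        return out;
    }
}
```

Calling this once per frame on a fresh Random gives the glittering look, since every pixel's green channel is re-rolled each time.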

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/05/sketch_210504b-2021-05-04-14-48-53.mp4

Filed Under: Interaction Lab

Interaction Lab Final Project: Progress Entry #1

May 5, 2021 by Celia Forster

As of 05/05, here is the progress of our final project:

We have built off of our original sketch and made more sketches of the elements of the project and how they will function. We decided that instead of using a fan to blow the scents, we would use small misting spray bottles. The question, then, was how to trigger these bottles to spray. After conducting some research, we decided that a servo motor could potentially press down and spray when needed. This idea will require further testing, but we are currently limited by when certain pieces arrive. We ordered the scents, servo motors, and spray bottles. We also made a lot of progress with the code. The drawing element of the Processing code is nearly finished. After learning the capabilities of using images in Processing, I decided to use an image as the background instead of manually creating the color boxes in Processing, which lessens the complications of clearing the screen. The background display is shown below. Part of making sure this worked properly required me to calculate the exact coordinates of each box so that the “color selection” would only be bound to the color box. I did this by making a line of text with X and Y coordinates appear when I clicked on the screen. We planned out a daily schedule that would allow us to finish the project by user testing day. We had to choose priorities and decided that some of the visual elements might need to be implemented later, before the final presentation.

The background that will be used in Processing
My notes when determining the coordinates of the boxes
Very rough sketches of the overall design

Filed Under: Interaction Lab

Recitation 10: Media & Controls by Celia Forster

May 3, 2021 by Celia Forster

In this recitation, I built on previous work connecting Arduino and Processing by implementing visual and audio functions. The first exercise was to use the Arduino-to-Processing reference code to make a physical controller manipulate the display of some media in Processing. I chose to use a potentiometer to control how clear an image would appear. At first, I tried using some reference code from the previous in-class work, but it was not working properly. The potentiometer seemed not to be sending its values to Processing, so I decided to rewrite the code and try again. Using the blur filter, it began working: the photo became more or less blurry depending on the position of the potentiometer. A video is shown below:

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/05/WeChat_20210502170011.mp4
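Behind the blur control is just a linear re-map of the potentiometer reading. A sketch of the equivalent of Processing's map() in plain Java (the 0 to 10 blur range here is illustrative, not our exact numbers):

```java
// Linear re-map, equivalent in spirit to Processing's map():
// rescales a value from one range to another, e.g. a 0..1023
// potentiometer reading to a 0..10 blur radius.
class SensorMap {
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
    }
}
```

In the sketch itself this would look something like `filter(BLUR, map(sensorValues[0], 0, 1023, 0, 10))`, with the output range tuned by eye.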

The second exercise was to use the Processing to Arduino reference code and make a servo motor move according to input from the microphone. I struggled for a while, trying to get the microphone input to send from Processing to Arduino. I found that I could use the map function and the sample code for audio analysis. A video is shown below. I was running out of time, so it was hard to tell how accurate it was. But as people were speaking around me, the servo motor was moving according to the volume.

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/05/WeChat_20210502170046.mp4

In “Computer Vision for Artists and Designers”, the author discusses various ways in which human-computer interaction can be created in the world of art. Some interactive technological art installations are introduced, many of which require movement of the human body that is tracked by a digital program. One that is particularly interesting is Rafael Lozano-Hemmer’s Standards and Double Standards (2004). It features belts hanging from the ceiling which turn with servo motors towards the user walking through the room. Although pieces like these are on a much larger scale than those made in class, I see the connection between the two. Similar to the works mentioned in this article, the recitation features human-computer interaction when physical devices act according to movements or sounds made by the user, and conversely when the movement of a physical device manipulates an image or sound.

Filed Under: Interaction Lab

Recitation 9: Serial Communication by Celia Forster

April 28, 2021 by Celia Forster

In this recitation, I used the code resources provided in the serial communication folder. The first exercise was to create an Etch a Sketch by sending the Arduino values from two potentiometers to Processing. While it was rather simple to do this with a circle, I wanted a smoother line to make it as close to a real Etch a Sketch as possible. After doing some research online for reference code, I learned that in order to get the line function to work properly, I had to set up variables for a previous X and Y. The video of it working properly is shown below:

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/serial_AtoP_processing_rec-2021-04-23-14-28-37.mp4
A schematic of this circuit

The second exercise in this recitation was to make the Arduino control servo motors on each side of the computer screen while a ball moves from one side to the other. Creating the code to move the ball from one side of the screen to the other was rather simple. However, I met difficulty in making the servo motors act according to the code. One of the motors kept vibrating rather than moving to the 90-degree angle my code directed it to. As it turns out, this was due to an issue with the delay. Once I added a delay, it moved properly. The result was not perfect; due to time constraints, I was unable to make the movement of the motors smoother. The result is shown below:

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/WeChat_20210425214822.mp4
A schematic of this circuit

Homework

The homework for this recitation was to make a circuit that has two buttons and make them interact with the code in Processing to make rotating shapes appear and disappear by pressing the buttons. I first set up the buttons on my breadboard and wrote the code to produce the rotating shapes in Processing. I decided to make these components separate initially to ensure that both sides of the project were working properly. Connecting the two proved to be much more difficult, as I was able to get the shapes to appear for the duration of the button being pressed down, but coding it to be a toggle switch was harder than I had expected. I finally got it to work properly after using a combination of if-statements and while-loops. It is intermittent at times, but I consider it to be mostly a success. A video of this project is shown below:

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/WeChat_20210425214852-1.mp4
A schematic of this circuit

Filed Under: Interaction Lab

Final Project: Essay

April 18, 2021 by Celia Forster

PROJECT TITLE:

Engage your Senses

 PROJECT STATEMENT OF PURPOSE:

Since the rise of technology, digital painting apps have become a popular choice for users old and young to express their creativity, no matter their skill level. But it got me thinking about how one-dimensional this medium is. While it does involve the senses of touch and vision, it does not go any further. What if someone was colorblind, but still wanted to express their creativity in this way? The idea behind Engage your Senses is to create a more interesting and inclusive approach to digital painting. I wanted to design an interactive piece that would not just release the same scent every time the user interacts. In my definition of interaction, which is based on Chris Crawford’s definition in The Art of Interactive Design, an interaction cannot be considered as such if it does not produce varying reactions depending on the user’s behavior. Engage your Senses would do just that, as the scent combination produced by the user would be unique to the ratio of colors used in their painting.

 PROJECT PLAN :

This project aims to create an elevated experience for users who enjoy digital painting apps by engaging more senses. It will consist of a digital screen and a fabricated set of boxes in their designated colors. The screen will feature a white canvas with small colored squares in the corner to show the user which colors they can choose from. When users select a color, the action of swiping their finger around the screen will trigger the door of the corresponding color box to open and a fan to turn on and blow the scent towards the user. These scents will be spices or essential oils. When presenting this idea, many peers expressed the concern that spraying so many scents into the air would overwhelm users’ sense of smell and render them unable to differentiate between scents. After conducting research into how perfume shops combat this issue, I found that using coffee beans might help cleanse the olfactory palate. To execute this, I want to add an “eraser” function that, just like the other colors, will blow the scent of coffee beans as the user erases the screen. Verifying that this method solves the potential issue will require user testing. To ensure that users have a positive experience creating a symphony of aromas, I will have to take time to make sure the scents smell nice alone and when combined with each other.

The first step is to create the code that produces the initial canvas. Next is the fabrication of the scent boxes, which will involve motors to operate the fans and lift the doors of the boxes. Finally, the code that controls the canvas must be manipulated to connect to Arduino and the boxes. Once this is done, user testing must be conducted to see if everything runs smoothly.

 

CONTEXT AND SIGNIFICANCE:

I was drawn to the world of olfactory art while doing my initial research for this project. A particular piece that inspired my idea is ‘While nothing Happens’ by Ernesto Neto. This piece has various spices stored in lycra netting which hangs at various heights around a room. As the audience walks through, their bodies brush against the netting, releasing different scents depending on their path and size. Building off of the concepts presented in this piece, my project will bring the unique aspect of interaction with technology. This project can have a great impact on many people but will bring a particular advantage to those who are colorblind. Since people who are unable to differentiate colors would have great difficulty selecting colors to make a piece of art, this project would create an inclusive solution that allows them to make a purposeful composition. Further ways to expand the influence of this project include adding more inclusive features, such as sounds that play in conjunction with the colors and scents; for example, green could be associated with nature sounds. By implementing ways to engage the other senses, this project could reach a broader audience within the realm of inclusivity, including those who are visually impaired.

 

Filed Under: Interaction Lab

Recitation 8: Arrays by Celia Forster

April 17, 2021 by Celia Forster

In this recitation, I used the watermelon function created for the previous recitation and enhanced it by implementing an array. Replacing the original code with arrays was not too difficult, as I have prior experience using arrays in Java. This was a nice review, which reminded me how much cleaner code can be when arrays are used! Next, I added some movement, making the watermelons rain from the top of the screen. By removing the line of code that redraws the background, it had the effect of the watermelons “painting” down the screen, which I thought was quite beautiful. But I did not just want them to fall off the screen and disappear, so with some trial and error, I was able to get them to cycle down the screen continuously. My final step was implementing user interaction. I added a mousePressed function, so that when the user presses the mouse the background clears and changes color. To make sure users understand this, I added some instruction text at the bottom of the screen. A video and code are shown below:

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/sketch_210416a-2021-04-16-14-59-01.mp4

 

float[] arX = new float[100];
float[] arY = new float[100];
color[] arCo = new color[100];
float[] speed = new float[100];

void setup() {
  size(1200, 1200);
  background(160, 208, 233);
  for (int i = 0; i < arX.length; i++) {
    arX[i] = random(width);
    arY[i] = random(500);
    arCo[i] = color(255, random(0, 200), random(0, 100));
    speed[i] = random(1, 6);
  }
  textSize(50);
}


void mousePressed() {
  background(random(100, 200), random(150, 250), random(200, 255));
}

void draw() {
  for (int j = 0; j < arX.length; j++) {
    melon(arX[j], arY[j], arCo[j]);
    arY[j] += speed[j];
    if (arY[j] > height) {
      arY[j] *= -1; // wrap back above the top of the screen
    }
  }
  text("Press mouse to clear background", 30, height - 30);
}




void melon(float x, float y, color co) {
  stroke(co);
  strokeWeight(10);
  fill(255);
  arc(x, y - 15, 150, 150, 0, PI);
  line(x - 72, y - 17, x + 72, y - 17);
  noStroke();
  fill(co);
  arc(x, y - 3, 100, 90, 0, PI);
  noFill();
  stroke(103, 146, 103);
  strokeWeight(10);
  arc(x, y, 100, 80, 0, PI);
  noStroke();
  fill(255);
  for (int seed = 0; seed < 75; seed += 15) {
    circle((x - 30) + seed, y + 6, 3);
  }
  for (int seed = 0; seed < 60; seed += 15) {
    circle((x - 22) + seed, y + 17, 3);
  }
}

The Benefit of Arrays:

Arrays serve a very important purpose in code, as they allow for sorting, storage of data, and overall cleanliness of the code. An array holds any number of values under a single variable, which is very efficient when searching the data for a particular value. In a potential project, arrays could be used to gather data via a sensor and return an action if a certain number is read in the array. Or, in the case of a user input device, the words that the user inputs could be sorted to accomplish some task.
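As a toy illustration of the sensor-array idea above (the threshold and readings are invented for the example):

```java
// Scan a buffer of sensor readings and report whether any reading
// crosses a trigger threshold: the "return an action if a certain
// number is read in the array" pattern described above.
class SensorBuffer {
    static boolean anyAbove(int[] readings, int threshold) {
        for (int r : readings) {
            if (r > threshold) return true;
        }
        return false;
    }
}
```

A sketch would then call something like `anyAbove(sensorValues, 400)` once per frame and fire the action when it returns true.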

Homework

The homework for this recitation was to create a grid of squares using arrays and fill them with random colors, make these colors fade between hues, and eventually implement user interaction to have the mouse manipulate the square size. While this task seemed rather daunting, I completed the first part with ease, apart from the colors. I had trouble getting each square to have a different color, as the for-loops kept making entire rows or columns the same color. To solve this, I changed the color array to a 2D array, so that each square would receive its own random color. Next, making the colors change led to many failed attempts. I was able to make the colors change, but they were changing too drastically and not smoothly as shown in the instructions. I saw that it was recommended to use the hue function to do this correctly, so after reading the hue function documentation, I tried a few if-statements and eventually got it to work pretty smoothly. Finally came the part which intimidated me the most: incorporating user interaction to change the sizes of the boxes. Luckily, the dist function very nicely calculated the distance between the mouse coordinates and each square in the grid. I used multiple if-statements so that the closer the mouse was to a square, the smaller that square would be. The video of this project and the code are shown below:

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/sketch_210416a-2021-04-17-19-02-06_Trim.mp4

 

color[][] fill = new color[25][25];
float[] x = new float[25];
float[] y = new float[25];
float dim = 21;
int distance = 235;
void setup() {
  size(1000, 1000);  // size() must be the first call in setup()
  background(0);
  colorMode(HSB, 360, 100, 100);
  for (int k = 0; k < fill.length; k ++) {
    x[k] = distance;
    y[k] = distance;
    distance+=21;
  }
  for (int i = 0; i < fill.length; i ++) {
    for (int h = 0; h< fill.length; h++) {
      fill[i][h] = color(random(100), random(200, 300), random(200, 350));
    }
  }
}

void draw() {

  for (int j = 0; j < 25; j++) {
    for (int h = 0; h< 25; h++) {

      float distance = dist(mouseX, mouseY, x[j], y[h]);
      strokeWeight(5);

      fill(fill[j][h]);
      if (distance<60&& distance>40) {
        rect(x[j]+5, y[h]+5, dim-5, dim-5);
      } else if (distance<40&& distance>20) {
        rect(x[j]+5, y[h]+5, dim-8, dim-8);
      } else if (distance<20) {

        rect(x[j]+5, y[h]+5, dim-10, dim-10);
      } else {
        rect(x[j], y[h], dim, dim);
      }
      if (hue(fill[j][h])<1) {
        fill[j][h] ++;
      } else {
        fill[j][h] --;
      }
    }
  }
}

Filed Under: Interaction Lab

Interaction Lab: Final PROJECT PROPOSAL

April 13, 2021 by Celia Forster

  1. SMELL YOUR ART

STATEMENT OF PURPOSE

Smell Your Art is an interactive painting experience that can upgrade the classic hobby and potentially act as an inclusive design for colorblind individuals. This project entails a digital screen that gives the user a few colors to choose from. Once the user selects a color and begins drawing on the screen, a motor attached to the Arduino opens a panel and a fan blows a specific scent that correlates with the color towards the user, i.e. mint for green, cinnamon for red, etc. While researching interactive art projects, I was particularly intrigued by ‘While nothing Happens’ by Ernesto Neto, an art exhibit that used the sense of smell as the main route of experiencing the art.

A sketch of this design
  2. MUSICAL CHALLENGE

STATEMENT OF PURPOSE

The Musical Challenge falls under the category of music and can be enjoyed by those who have an interest in making music of their own. A popular genre of mobile phone games is piano games, which feature tiles sliding down the screen at an increasingly fast pace, and the user must tap the tiles before they reach the bottom of the screen. As a result, each tap equals a note which is played, and the user will be left with a song. While many mobile games like this already exist, it would be essential to take it in a more unique direction to create a larger impact on users.

  3. BEAT MIXER

STATEMENT OF PURPOSE

The Beat Mixer falls under the category of interactive music. It would feature a screen divided into a grid, and a portion of the screen would feature a toolbox of musical instruments. The user would drag and drop the instruments to different parts of the screen, and they would play simultaneously as a result. This is similar to some features in digital music production software. It could be used by professional musicians and beginners alike, as it offers a fun experience to experiment with different sounds, volumes, tempos, etc. even if you do not have access to all of these instruments.

 

Filed Under: Interaction Lab

Recitation 7: Functions and Interaction by Celia Forster

April 12, 2021 by Celia Forster Leave a Comment

In this recitation, there were three parts, each building off of the previous one. The first task was to create a grid pattern from a graphic made of simple shapes, drawn by a function. I chose to create a function called ‘melon’ that takes three parameters: x-position, y-position, and color. Using nested for-loops, I created a grid, as seen in the photo below, with alternating colors.

 

void setup() {
  size(1200, 1200);
}

void draw() {
  background(160, 208, 233);

  pushMatrix();
  scale(0.8);
  // Nested loops lay out the grid; alternating rows flip and change color.
  for (int i = 0; i < width * 7; i += 201) {
    for (int h = 0; h < height * 7; h += 121) {
      if (h % 2 == 0) {
        rotate(PI);  // flip the coordinate system for every other row
        melon(i / 2, h, color(255, 69, 69));
      } else {
        melon(i / 2, h, color(255, 135, 132));
      }
    }
  }
  popMatrix();
}

// Draws one melon slice centered near (x, y), in the given color.
void melon(int x, int y, color co) {
  stroke(co);
  strokeWeight(10);
  fill(255);
  arc(x, y - 15, 150, 150, 0, PI);       // outer slice
  line(x - 72, y - 17, x + 72, y - 17);  // flat top edge
  noStroke();
  fill(co);
  arc(x, y - 3, 100, 90, 0, PI);         // colored flesh
  noFill();
  stroke(103, 146, 103);
  strokeWeight(10);
  arc(x, y, 100, 80, 0, PI);             // green rind line
  noStroke();
  fill(255);
  // Two rows of seeds.
  for (int seed = 0; seed < 75; seed += 15) {
    circle((x - 30) + seed, y + 6, 3);
  }
  for (int seed = 0; seed < 60; seed += 15) {
    circle((x - 22) + seed, y + 17, 3);
  }
}

 

Next, I was to make a custom pattern using the mouse functions, allowing the user to draw the pattern wherever they click on the screen. In my opinion, this part of the recitation was the simplest, as it only required a mousePressed check with a simple command. The more complicated part was making sure the screen would clear when the delete key was pressed. While I had originally put this if-statement in the draw method, I realized it needed to be placed in its own keyPressed() method to work properly. I also altered the code so that the color of the pattern changes with each click, as seen in the video below.

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/melon_2-2021-04-09-14-04-16_Trim.mp4

 

void setup() {
  size(700, 700);
  background(160, 208, 233);
}

void keyPressed() {
  // Clear the canvas when the delete key is pressed.
  if (keyCode == DELETE) {
    background(160, 208, 233);
  }
}

void draw() {
  // Stamp a melon in a random warm color wherever the mouse is pressed.
  if (mousePressed) {
    melon(mouseX, mouseY, color(255, random(0, 200), random(0, 100)));
  }
}

// Draws one melon slice centered near (x, y), in the given color.
void melon(int x, int y, color co) {
  stroke(co);
  strokeWeight(10);
  fill(255);
  arc(x, y - 15, 150, 150, 0, PI);       // outer slice
  line(x - 72, y - 17, x + 72, y - 17);  // flat top edge
  noStroke();
  fill(co);
  arc(x, y - 3, 100, 90, 0, PI);         // colored flesh
  noFill();
  stroke(103, 146, 103);
  strokeWeight(10);
  arc(x, y, 100, 80, 0, PI);             // green rind line
  noStroke();
  fill(255);
  // Two rows of seeds.
  for (int seed = 0; seed < 75; seed += 15) {
    circle((x - 30) + seed, y + 6, 3);
  }
  for (int seed = 0; seed < 60; seed += 15) {
    circle((x - 22) + seed, y + 17, 3);
  }
}

 

Finally, I needed to use the function to create a random pattern that displays a hundred copies of the shape scattered randomly across the canvas. I again used a for-loop, passing random values for the x- and y-coordinates. The result is shown below:

 

void setup() {
  size(1200, 1200);
  noLoop();  // draw once so the random pattern does not regenerate every frame
}

void draw() {
  background(160, 208, 233);

  pushMatrix();
  // Scatter melons at random positions, rotations, and warm colors.
  for (int i = 0; i < 300; i++) {
    melon(random(0, width), random(0, height), color(255, random(0, 200), random(0, 100)));
    rotate(random(0, PI / 2));
  }
  popMatrix();
}

// Draws one melon slice centered near (x, y), in the given color.
void melon(float x, float y, color co) {
  stroke(co);
  strokeWeight(10);
  fill(255);
  arc(x, y - 15, 150, 150, 0, PI);       // outer slice
  line(x - 72, y - 17, x + 72, y - 17);  // flat top edge
  noStroke();
  fill(co);
  arc(x, y - 3, 100, 90, 0, PI);         // colored flesh
  noFill();
  stroke(103, 146, 103);
  strokeWeight(10);
  arc(x, y, 100, 80, 0, PI);             // green rind line
  noStroke();
  fill(255);
  // Two rows of seeds.
  for (int seed = 0; seed < 75; seed += 15) {
    circle((x - 30) + seed, y + 6, 3);
  }
  for (int seed = 0; seed < 60; seed += 15) {
    circle((x - 22) + seed, y + 17, 3);
  }
}

 

  • In the reading “Art, Interaction and Engagement” by Ernest Edmonds, he identifies four situations in an interactive artwork: ‘Static’, ‘Dynamic-Passive’, ‘Dynamic-Interactive’ and ‘Dynamic-Interactive (Varying)’. From the exercise you did today, which situations do you identify in each part you executed? Explain.
    • In this exercise, part one would likely be classified as static, because it is simply a grid pattern and does not change when the user interacts; it exists solely for viewing. I think that part two would be classified as dynamic-interactive, because it requires the user to interact by either pressing the mouse or the delete key to alter the canvas. Because the color of the melon is a different color with each press of the mouse, the result can never be predicted. Part three could also be considered static, because once the random pattern is generated, it is still just an image to be viewed by the user and cannot be interacted with.
  • In today’s exercises you designed something that allows humans to interact with technology. How do you think that you can re-design what you did today, using Processing and Arduino, to allow humans to interact with humans through technology? Explain.
    • I think that a way for humans to interact with humans through technology, using Processing and Arduino, could be to redesign the preexisting code to incorporate an Arduino and use user interaction to transmit a message to another device. I am not sure of the particular details of how this could work, but implementing a sensor might allow users to send some sort of signal to another user.
  • If you had to connect what you did in Part 2 today with Arduino, what sensor(s) would you use to replace the mouse? Why? How do you think this or these sensors will increase the degree of interaction or make the interaction more meaningful?
    • If we were to replace the mouse with a sensor, we could potentially use a pressure sensor. We could fabricate a “screen” of sorts that would lay flat on a surface. It could be made of some sort of fabric, and as users touch a portion of the fabric, the corresponding area would draw on the screen based on the location of the sensors. We would have to use many sensors to cover every portion of the screen. This might make the interaction more interesting, because the user would have a tactile experience and the logic behind the process would not be so self-explanatory.
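The fabric-screen idea above amounts to mapping each pressure sensor to a region of the canvas. A minimal sketch of that mapping in plain Java follows; the 8×8 sensor layout is an assumption for illustration, while the 700×700 canvas matches the sketch from Part 2.

```java
public class SensorGrid {
    // Hypothetical layout: an 8x8 grid of pressure sensors under the fabric,
    // mapped onto the 700x700 canvas used in Part 2.
    static final int COLS = 8, ROWS = 8;
    static final int CANVAS_W = 700, CANVAS_H = 700;

    // Returns {x, y}: the canvas coordinates at the center of the cell
    // covered by sensor (col, row), where a melon would be stamped.
    static int[] cellCenter(int col, int row) {
        int cellW = CANVAS_W / COLS;
        int cellH = CANVAS_H / ROWS;
        return new int[] { col * cellW + cellW / 2, row * cellH + cellH / 2 };
    }

    public static void main(String[] args) {
        int[] p = cellCenter(0, 0);
        System.out.println(p[0] + ", " + p[1]);  // 43, 43
    }
}
```

A pressed sensor would send its (col, row) over serial, and the Processing sketch would call the drawing function at the returned coordinates.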

 

Research: 

A few weeks ago, when we first started using Processing, I began researching whether there was a way to use voice detection, because I felt that this would be a great addition to the final project. Looking on the Processing official website, I read the article titled “Sound”, and saw that Processing can do quite a few things with sound. One of these that caught my eye was audio analysis. This function takes a sound sample and creates an array that stores frequency values. While this seemed interesting, it did not apply to my intended final project, so I continued on this path to see what other things Processing could do with audio. I discovered that I could download some speech-to-text libraries which could be implemented in Processing. I am fascinated by the different interactions that are made possible with Processing, and hope to use some of these ideas in my final project.
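The frequency array described above is what the Sound library's FFT produces: `FFT.analyze()` fills a float array with one amplitude per frequency band. As a small plain-Java illustration of how a sketch might react to that array, the helper below finds the loudest band; the sample amplitudes are made up.

```java
public class SpectrumPeak {
    // Given an FFT spectrum (one amplitude per frequency band), returns the
    // index of the loudest band -- the kind of value a sketch could map to
    // color, size, or motion.
    static int loudestBand(float[] spectrum) {
        int peak = 0;
        for (int i = 1; i < spectrum.length; i++) {
            if (spectrum[i] > spectrum[peak]) peak = i;
        }
        return peak;
    }

    public static void main(String[] args) {
        float[] sample = {0.01f, 0.20f, 0.85f, 0.33f};  // hypothetical spectrum
        System.out.println(loudestBand(sample));  // 2
    }
}
```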

 

 


04.08.2021 In-Class Exercise – Functions

April 8, 2021 by Celia Forster Leave a Comment

Today in class, we focused on functions. Because I have some prior knowledge of coding, I focused on practicing user-defined functions. Using Richard Bourne’s code titled StarFlight, I started with a star in the center of the screen. Next, I enhanced the code so that the star would spin continuously. In order to practice user-defined functions, I created a function titled ‘smile’, which draws a smiley face according to the size passed through it. Once I successfully made a smiley face, I added some additional code to make the circle continuously change color and move around the screen, bouncing off the borders of the canvas. Watch this project below!

https://wp.nyu.edu/nyushanghai-celiaforster/wp-content/uploads/sites/19366/2021/04/sketch_210408b-2021-04-08-20-14-02.mp4

 

 

float angle;
int h = 0;
int i = 50;
int j = 200;
float xc, yc;
float speedx = 3;
float speedy = 5;

void setup() {
  size(600, 600);
  colorMode(HSB, 255);
}

void draw() {
  background(255);

  // Movement and rotation adapted from StarFlight by Richard Bourne:
  // https://openprocessing.org/sketch/1152776
  pushMatrix();
  translate(xc, yc);
  scale(.7);
  rotate(radians(angle) / 2);
  smile(200);
  xc += speedx;
  yc += speedy;
  popMatrix();

  angle += 5;
  // Bounce off the edges of the canvas.
  if (xc < -2 || xc > width + 2) {
    speedx = -speedx;
  }
  if (yc < -2 || yc > height + 2) {
    speedy = -speedy;
  }
}

// Draws a smiley face of the given size, cycling its HSB fill color each frame.
void smile(int size) {
  noStroke();
  fill(h, i, j);
  h++;
  i++;
  j++;
  if (h == 255) h = 0;
  if (i == 255) i = 50;
  if (j == 255) j = 200;
  circle(0, 0, size);          // face
  fill(0);
  circle(-40, -20, size / 8);  // eyes
  circle(40, -20, size / 8);
  noFill();
  stroke(0);
  strokeWeight(10);
  arc(0, 25, 100, 50, -PI / 30, PI);  // smile
}


