Walking in the Forest by Yiwen Hu (Instructor: Inmi)

CONCEPTION AND DESIGN:

We originally wanted to create a VR experience in which the user feels like they are physically interacting with the virtual world. My definition of interaction is a continuous conversation between several actors. The VR experience encourages users to explore their own way of interacting with the virtual forest. Since our goal was to encourage the user's own way of interacting with the virtual world, we borrowed multiple sensors, ranging from pressure sensors to distance sensors. We decided to test all of them before selecting the specific ones to use, because the midterm taught me that the functionality of the sensors largely determines the outcome. It is better to ensure that a sensor works first (that it can clearly identify a pattern, so that I can monitor its value for my purpose). We also thought about the context of interaction. For example, we later abandoned the sound sensor because it was hard to imagine a context in which human sound generates effects on the environment.

FABRICATION AND PRODUCTION:

We finally chose the distance sensor and the color sensor as our main sensors. The idea was that as people with different intentions approach the forest, different effects appear on the screen. We aimed to make people aware of the effects of their interaction with the forest. We decided to create three kinds of costumes, each symbolizing a human intention. For example, the white clothes with plastic attached represent human-induced plastic pollution. We cut cardboard into several squares and painted them in different colors: green, red, and white. Fortunately, the color sensor could differentiate between these colors. The user wearing the suit would be encouraged to approach the forest. During the user testing session, we were told that the interaction was not obvious: users would not know that they had to run toward the box to trigger the following effects. This was a tricky problem for us, and we pondered it for some time. Later, Professor Inmi told us that the box should be more relevant to what is happening on the screen. A thought suddenly hit me: make the box a physical representation of the forest. Since our box also has a hole on top, I thought of another way of interacting: instead of wearing a suit and approaching, the user could put elements into the box and see the corresponding elements on screen. This completely changed the interaction from running toward the box, and our sensors were reduced to only the color sensor.
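Our actual Arduino sketch is not reproduced here, but the core of using the color sensor this way is deciding which painted square it is seeing. A minimal sketch of that idea, in plain Java: compare the raw RGB reading against reference colors and pick the nearest one. The class name and the reference values are my own placeholders, not real calibration data from our sensor.

```java
// Hypothetical sketch: classify a raw RGB reading from a color sensor
// as the nearest of the painted reference colors (green, red, white).
public class ColorClassifier {
    // Reference colors for the painted cardboard squares
    // (placeholder values, not real calibration data).
    static final String[] NAMES = {"green", "red", "white"};
    static final int[][] REFS = {{40, 180, 60}, {200, 40, 40}, {230, 230, 230}};

    // Return the name of the reference color closest to the reading,
    // using squared Euclidean distance in RGB space.
    public static String classify(int r, int g, int b) {
        String best = NAMES[0];
        long bestDist = Long.MAX_VALUE;
        for (int i = 0; i < REFS.length; i++) {
            long dr = r - REFS[i][0], dg = g - REFS[i][1], db = b - REFS[i][2];
            long dist = dr * dr + dg * dg + db * db;
            if (dist < bestDist) { bestDist = dist; best = NAMES[i]; }
        }
        return best;
    }
}
```

Nearest-color matching like this is more robust than exact thresholds, because sensor readings drift with lighting and distance.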

CONCLUSIONS:

I think our project, though it deviated much from our initial plan, realizes the VR effect in some way. Through interaction with the physical forest, users see effects on the screen, which serves as a connection between the virtual and the real world. However, I think the interaction is not very engaging and a little confusing. From the feedback we learned that users found the throwing act confusing: they felt compelled to throw something and then receive an educational video. It did not come from their intuition and was therefore not engaging. The ideal version in my mind goes like this: the user is placed inside an immersive "forest" with tangible existence, where they are encouraged to interact with the forest in any way they can imagine. For example, when they step on the grass, the grass may generate sounds to inform the user of its existence. Or when they approach a tree and touch it softly, the tree slightly vibrates and makes a welcoming sound; but when the user does too much, touching the tree in a less respectful, aggressive way, the tree becomes unhappy. In this way the user realizes that nature also has feelings and is an equal being that needs to be treated carefully.

Our physical forest:

Recitation 9: Media Controller by Yiwen Hu

In this recitation I applied what we learned in class, the tint() function, which changes the display color of the video. I decided to use a potentiometer to change one value of the RGB.

Step 1: Insert the video. 

As we learned in class, in setup() I created the myMovie object and pointed it to the video file. I put the video file inside the Processing sketch's folder, under a subfolder called "data." Then in draw(), I wrote code like this:

The tint() call appears after the if statement that reads myMovie, and the image() function makes the frame appear in the sketch. As for x, it is the variable I created to change the color; its value varies with the potentiometer reading.

Step 2: Let Arduino communicate with Processing 

For this step I referred to the example code from a previous class. Not much changed apart from the number of values and the port number. On the Processing side, I put setupSerial() in setup() and updateSerial() in draw(), both defined later in the code. The other thing I added is a map() call that translates the potentiometer value into x, whose range must stay under 255.
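The map() function in Processing (and Arduino) is just a linear rescale. A minimal plain-Java re-implementation of the step described above, turning a 0-1023 potentiometer reading into a 0-255 tint value (the helper names are mine):

```java
// Plain-Java version of the linear rescaling that Processing's map()
// performs, used here to turn a 0-1023 potentiometer reading into
// a 0-255 tint channel value.
public class TintMap {
    public static float map(float value, float inMin, float inMax,
                            float outMin, float outMax) {
        return outMin + (outMax - outMin) * (value - inMin) / (inMax - inMin);
    }

    // Convenience: potentiometer reading -> tint channel x.
    public static int toTint(int reading) {
        float x = map(reading, 0, 1023, 0, 255);
        // Clamp in case a noisy reading falls slightly outside the range.
        return Math.max(0, Math.min(255, Math.round(x)));
    }
}
```

The clamp is optional but cheap insurance: tint() expects 0-255, and analog readings at the edge of the range can jitter.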

After building the circuit, I ran the code and it worked!

Final Project Essay by Yiwen Hu

PROJECT TITLE: Reconsidering the Human-Nature Relationship

PROJECT STATEMENT OF PURPOSE *

Our project aims to make people rethink the human-nature relationship. We first gained inspiration from the art gallery on the 1st floor, the exhibition of the 72 relations with a golden rock. We would like to make an interactive project that allows people to rethink the way they can interact with nature. A website called Way To Go narrowed our idea down. Inspired by the website, we decided to project the users onto the screen and encourage them to explore their own way of interacting with nature. Feedback is given as users interact with the nature on the screen, making them realize their impact on nature.

PROJECT PLAN *

We will make our project a kind of screen-based VR. Users will interact with multiple sensors outside the screen, but the effects will be shown visually on the screen. In terms of the schedule, we will finish shooting by Nov. 23 and edit the footage on Nov. 24. On Nov. 25-27 we will work on the Arduino and Processing code. We first need to figure out which sensors we need for users to "interact with nature." Then we will work on projecting the users as well as the animation. On Nov. 28-30 we will modify the code on our own, based on guidance from the fellows, professors, or learning assistants. By Dec. 2 we will have the whole project roughly done and will try to run it. For the rest of the week, we will focus on digital fabrication and on modifying our code based on feedback, such as from user testing.

CONTEXT AND SIGNIFICANCE *

Besides the "Way to Go" website, several other projects also inspired me in preparation for the final project. For example, the interactive IKEA LED table projects the user's physical interaction onto a larger screen and thereby enhances the user experience. Another is the interactive playground project, in which users are encouraged to give input (throwing) continuously; it is also immersive. The idea of immersion is important in our screen-based VR, where the user's input and output are amplified on the screen. Together with the feedback users receive from their input, they will be encouraged to give more input, which aligns with our definition of interaction: that interaction should be continuous. We will base our project on the "Way to Go" website, but we intend to allow more interaction than simply "looking at" nature and walking around in it.

Our project is intended for anyone, because the human-nature relationship concerns all of humanity. We particularly want people who do not care about nature, and who consider it a disposable resource, to rethink what nature is and what our relationship to it should be. After successful completion, we want the project to prompt people to rethink our relationship to nature and to realize the impact we have made on it. Subsequent projects can meaningfully build on that concept and further encourage people to reflect on their perception of nature.

Reference

  1. http://a-way-to-go.com/
  2. Interactive IKEA LED table https://www.youtube.com/watch?v=ptxulCpz6po
  3. Interactive playground. https://www.youtube.com/watch?v=3bCCyGcdNB0

Recitation 8: Serial Communication by Yiwen Hu

Exercise 1: Make a Processing Etch A Sketch

I first opened the example code called multi-value. Then I built the circuit by connecting the two potentiometers to the breadboard. In the Arduino code I changed the original number of values to 2 to match the two potentiometers. I ran the Arduino sketch to check that the changing potentiometer values appeared in the serial monitor. After that worked, I opened Processing and created a circle in draw(), with a background() line at the beginning to avoid leaving traces of the drawing. I also linked the position of the circle to the Arduino sensor values by creating x and y.

circles movement
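Under the hood, the multi-value example sends each set of readings as one comma-separated line over serial, which Processing then splits back into ints. A minimal plain-Java sketch of that parsing idea (the class and method names are mine, not the course example's):

```java
// Hypothetical helper showing the parsing idea behind the multi-value
// serial example: Arduino prints readings as one comma-separated line
// (e.g. "512,803"), and the receiving side splits it back into ints.
public class SerialParse {
    // Returns the parsed values, or null if the line does not contain
    // exactly the expected number of values (incomplete/garbled line).
    public static int[] parseValues(String line, int expected) {
        String[] parts = line.trim().split(",");
        if (parts.length != expected) return null;
        int[] values = new int[expected];
        for (int i = 0; i < expected; i++) {
            values[i] = Integer.parseInt(parts[i].trim());
        }
        return values;
    }
}
```

Rejecting lines with the wrong number of values matters in practice: serial reads often start mid-line, and a half-received line would otherwise crash the parse.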

Then I replaced the circle with a line. To make it continuous, I deleted the background() call at the beginning. The challenge was to avoid drawing one long straight line. Inspired by what we learned in class, I used the idea behind pMouseX and pMouseY to track the previous drawing position. I also used a map() function to keep the lines inside the screen. The effect is shown below. The two potentiometers control the movement along the x and y axes respectively.

line drawing
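pMouseX and pMouseY only track the mouse; for sensor-driven drawing you keep that "previous point" yourself and draw a short segment from it to the new point each frame. A minimal plain-Java sketch of that bookkeeping (the class name is mine):

```java
// Minimal bookkeeping for continuous sensor-driven line drawing:
// remember the previous point so each frame connects (prevX, prevY)
// to the new (x, y), the way pMouseX/pMouseY work for the mouse.
public class LineTracker {
    private int prevX, prevY;
    private boolean started = false;

    // Returns the segment to draw this frame as {x1, y1, x2, y2},
    // or null on the very first point (nothing to connect yet).
    public int[] next(int x, int y) {
        int[] segment = started ? new int[]{prevX, prevY, x, y} : null;
        prevX = x;
        prevY = y;
        started = true;
        return segment;
    }
}
```

In a Processing sketch, draw() would call next() once per frame with the mapped sensor values and pass the returned segment to line().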

Exercise 2: Make a musical instrument with Arduino

I built the circuit first by connecting the buzzer to power and the digital pin. Then, like Exercise 1 but in reverse order, I opened Processing first and changed the port and the number of values to 2, because the variables are the frequency and duration of the buzzer. In Arduino I set the number of values to 2 as well and used the tone() function: tone(pinNumber, frequency, duration). Based on this, I linked the frequency and duration values to the readings of the two potentiometers. It worked like this!
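The step of linking the potentiometers to tone() is again a pair of linear rescales: one reading becomes a frequency in Hz, the other a duration in milliseconds. A plain-Java sketch of that mapping; the output ranges here are my own assumptions, not the values used in the recitation:

```java
// Hypothetical mapping from two potentiometer readings (0-1023) to
// tone() arguments: a frequency in an audible range and a duration
// in milliseconds. The ranges chosen are assumptions for illustration.
public class ToneMap {
    static float map(float v, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (outMax - outMin) * (v - inMin) / (inMax - inMin);
    }

    // 0-1023 -> 200-2000 Hz (comfortably audible on a piezo buzzer)
    public static int toFrequency(int reading) {
        return Math.round(map(reading, 0, 1023, 200, 2000));
    }

    // 0-1023 -> 50-500 ms
    public static int toDurationMs(int reading) {
        return Math.round(map(reading, 0, 1023, 50, 500));
    }
}
```

On the Arduino side the equivalent call would be tone(pin, frequency, duration) with the two mapped values.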

The interaction basically translates the analog values of the potentiometers into various different outputs. It enables interaction in many forms.

Final Project Proposal by Yiwen Hu

PROJECT 1: The Anthropocene

Inspired by my PoH class (and my partner Katie's as well), we want to put the topic of the class, the Anthropocene, into practice. The Anthropocene is a new geological era in which human activity has profound effects on nature, and it calls into question humans' relationship with nature. We want to create a project that prompts people to reflect on their interaction with nature by giving them environmental feedback on their behavior. Above all, we want them to think carefully about the question: how much should humans interfere in nature?

PROJECT 2: Harry Potter around you!

This project's idea derives from my partner's love of Harry Potter, and our target audience is anyone interested in Harry Potter. Users will have the chance to interact with scenes from the movies or even find themselves on the screen. I think it will be an engaging interactive experience because the result depends entirely on the users, such as the movement of the objects. In addition, inspired by the interactive playground and interactive cloud projects, in which many people take part, we want to create a project that involves as many people as possible. We want users to gain a kind of strength that may help them cope with daily challenges.

PROJECT 3: Listen to the non-human

One interactive audio project online intrigues me. It was created by Playtronica Studio, and it aims to "immerse people in a reality that fuses touch, sound, and technology. One where everyday objects take on new identities, challenging us to creatively explore the World around us and our relationship with it." This interested me at first sight because it relates to what I have learned in my PoH class about humans' relationship to their surroundings. I think it would be meaningful and interesting to bring such theoretical exploration into practice. The challenge would be to create an immersive interaction that makes users feel their connection to the objects and challenges their assumption that objects are lifeless.

Reference

  1. https://www.youtube.com/results?search_query=meaningful+interactive+project
  2. https://www.youtube.com/watch?v=3bCCyGcdNB0
  3. https://www.likecool.com/An_Interactive_Cloud_Made_of_6_000_Light_Bulbs–Projects–Gear.html