PROJECT PROPOSAL by Steve Sun

name of the project: Life Line

for our project, based on our research, we came up with two versions built on the same idea.

first things first: unless explained otherwise, anything below in quotation marks refers to a term within the content of this project.

general idea: in this project, we aim to create a virtual world where every “life” has to be “in contact” with other “lives”, or it will easily die or “fade out”. the more contact one “life” makes with other “lives”, the longer it stays “alive”. the thought behind it is that we as humans live in a world where contact, interaction, and communication are important. we wish to emphasize these links among humans.

the first version of the project would be based on a projector and a camera, along with our hand gadget, which contains the arduino and other circuits. the camera is hung on the ceiling and detects the position of anyone who wears and activates the hand gadget, while the projector casts a shade of light onto that person to represent a “new life” being created. when two gadgets come within a certain range, they detect each other and send a signal to the projector, which in turn displays a beam of light representing the “link” or “contact” between the two “lives”. each “life” has a random-length health bar displayed on the gadget, which decreases over time. the more “links” a life has, the slower its health bar decreases. so in order to stay alive, one must keep making new contacts.
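to make the health mechanic a bit more concrete, here is a rough sketch in Processing of how one “life” could be simulated: health drains every frame, and every active “link” slows the drain. all the names, rates, and the keyboard stand-in for “two gadgets coming into range” are placeholders, not actual project code.

// rough sketch of the proposed health mechanic; every value here is a placeholder
float health;         // the random-length health bar of one "life"
int links = 0;        // how many other "lives" this one is currently linked to

void setup() {
  size(400, 100);
  health = random(60, 100);          // each life starts with a random amount of health
}

void draw() {
  background(0);
  float decay = 0.1 / (1 + links);   // base decay per frame, slowed by every active link
  health = max(0, health - decay);
  // draw the health bar; at zero the life has "faded out"
  fill(health > 0 ? color(0, 200, 0) : color(80));
  rect(20, 40, map(health, 0, 100, 0, width - 40), 20);
}

void keyPressed() {
  if (key == '+') links++;                    // stand-in for two gadgets coming into range
  if (key == '-') links = max(0, links - 1);  // stand-in for a link breaking
}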

the technical problems we currently foresee are how the light could follow the movement of the users, how each “life” could be individually identified, and how to record the moment when two lives make contact.

in the second version, rather than displaying the light shades and the connecting lines on the ground, we could make animations and display them on a screen. that would be a lot easier to do, but it could also be a bit confusing whether the users should focus on the screen or on each other. it is up to us to weigh the factors and decide.

the target audience would be the general public, and the goal is to get users to think about how unusual it has become for people to meet and make contact in person rather than through digital media. we want to remind people of the importance of human connection.

Final Project: PREPARATORY RESEARCH AND ANALYSIS

since i missed the field trip and haven’t had time to go to other shows, i will keep this part as is for now. i will update it when i have had time to visit some more shows.

for our project, since we had our idea first, the research we did was goal-oriented. each piece of research gave us ideas for a different part of our project. here are the links to our research.

Interactive Whiteboards Using the Wiimote [Windows]

↑ i would consider this project a successful but boring interactive experience. the reason is that at first glance, the users can already tell how things will work when they use it, probably because the concept of a digital whiteboard is quite old and everyone is familiar with it. however, the illustration in the video is what i found interesting; the technology, or the logic behind it, could help us finish the animation part of the project.

Epilog – Interactive Light and Shadow by Schnellebuntebilder

↑ this is very cooooooool! this, by my definition, is a successful, interesting, and creative project of interaction. it also aligns with my definition of interaction in that the project gives the user a space to create things, rather than just an input, an output, and that’s it. this project also gives us an insight into what the final project could look like.

Interactive Laser Sculpture by Jayson Haebich

personally i would not regard this project as a good interactive project, for the same reason as above; however, it could provide us with a specific method for tracking things under the projected light.

what i would define as a successful interactive project is one that achieves a clear goal, such as exploring how the human body and machines could collaborate in performance art. if, again, the project is merely an input and an output, or if it cannot fully achieve its goal and instead confuses the users, it would not be a successful project.

Recitation 6: Processing Animation by Steve Sun

for this week we built an animation of a moving circle.

the critical thing about this week’s coding is to set up a changing variable and tie it to the color and the radius of the circle.

one thing i noticed about the example is that when it reaches its biggest and smallest radius, the speed of the radius change seems to slow down a bit. so i decided that the changing variable should be sin(another changing variable).

so after calibration it should be:

stroke((sin(i)+1)*127.5, 255, 255);

size = (sin(i)+1.1)*272;

where i is a variable that increases by 0.05 in each draw() loop.
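for reference, here is a minimal self-contained sketch that puts the two lines above together; the canvas size and the HSB color mode are my assumptions, since only those two lines appear in the post.

// minimal consolidated sketch; canvas size and colorMode(HSB) are my assumptions
float i = 0;

void setup() {
  size(600, 600);
  colorMode(HSB, 255);   // so the first stroke() argument acts as hue
}

void draw() {
  background(0);
  stroke((sin(i) + 1) * 127.5, 255, 255);   // hue sweeps smoothly back and forth
  noFill();
  float size = (sin(i) + 1.1) * 272;        // diameter oscillates but never hits zero
  ellipse(width / 2, height / 2, size, size);
  i += 0.05;                                // the changing variable grows every frame
}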

another thing i learned is that keyPressed() is a built-in function that doesn’t need to be called in the draw() function, and also that if the key is coded, i have to check keyCode == ??? or it won’t work.
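a quick illustration of the difference (the specific keys chosen here are just examples, not the ones from the recitation code):

// example only: the specific keys here are illustrative
void setup() {
  size(200, 200);
}

void draw() {
  // draw() has to exist (even if empty) for key events to be delivered
}

void keyPressed() {
  if (key == CODED) {                // arrow keys, shift, etc. are "coded" keys
    if (keyCode == UP)   println("up arrow");
    if (keyCode == DOWN) println("down arrow");
  } else {
    println("regular key: " + key);  // ordinary character keys use key directly
  }
}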

if i had more time i could finish the bonus point. the logic would be: if the distance between the center of the circle and the border is <= the radius, reset the center so that its distance to the border equals the current radius.
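a rough, untested sketch of that logic (variable names and the moving circle are my own; reversing the direction after the reset is also my addition so the circle keeps moving):

// rough sketch of the bonus-point logic; the moving circle and its speed are my own
float x = 50, y = 50, radius = 40;
float dx = 3, dy = 2;

void setup() {
  size(400, 400);
}

void draw() {
  background(0);
  x += dx;
  y += dy;
  // if the center gets closer to a border than the radius,
  // reset it so its distance to that border equals the radius
  if (x < radius)          { x = radius;          dx = -dx; }
  if (x > width - radius)  { x = width - radius;  dx = -dx; }
  if (y < radius)          { y = radius;          dy = -dy; }
  if (y > height - radius) { y = height - radius; dy = -dy; }
  // flipping dx/dy is my addition so the circle keeps moving after the reset
  ellipse(x, y, radius * 2, radius * 2);
}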

final version:

Recitation 5: Processing Basics by Steve Sun

fig.1: Sol LeWitt (1928-2007). Color Bands, set of eight, 2000. Linocuts in colors on Somerset Velvet paper. 

for this week’s recitation i did a immitation of this image above. i used for loop to create three series of circles, each taking different position in the canvas, and each single one of them has a different random color. the final effect looks like this:

a bit messy……

for further development i could investigate the color generation process more. random colors offer less contrast and thus look more “boring”. i could have generated more distinctive colors, just as the original picture did.

it could also be more interesting if i added more interactive elements, such as having the image change color only on mouse hover, or something like that.
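for reference, here is a minimal reconstruction of the kind of loop i used; the counts, positions, and sizes are placeholders rather than the original code.

// minimal reconstruction of the idea: three series of circles, each with its own random color;
// the counts, positions, and sizes are placeholders
void setup() {
  size(600, 600);
  background(255);
  noStroke();
  for (int row = 0; row < 3; row++) {
    for (int col = 0; col < 8; col++) {
      fill(random(255), random(255), random(255));   // a different random color per circle
      ellipse(75 + col * 65, 150 + row * 150, 60, 60);
    }
  }
}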

Interaction lab midterm project blog post


About my project

My project is called Hear the Colors. It consists of two sensors (a color sensor and an ultrasonic distance sensor), an Arduino board, a box with a handle, a set of headphones (or speakers), and a power bank.

The device senses the color of, and the distance to, a material placed in front of it and uses those two readings as input; it then plays a piano note based on them. As the color and the distance change, the pitch and the volume of the note change respectively.
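To make the mapping concrete, here is an illustrative sketch of how the Processing side of such a setup could work, assuming the Arduino sends a color index and a distance reading over serial as comma-separated values, and using a sine oscillator as a stand-in for the piano samples. None of this is the actual project code.

// illustrative only: assumes the Arduino sends lines like "3,25\n"
// (a color index 0-6 and a distance in cm) over serial at 9600 baud
import processing.serial.*;
import processing.sound.*;

Serial port;
SinOsc tone;
// one octave of piano-note frequencies (C4..B4), one per detected color
float[] notes = {261.63, 293.66, 329.63, 349.23, 392.00, 440.00, 493.88};

void setup() {
  size(200, 200);
  port = new Serial(this, Serial.list()[0], 9600);  // adjust the port index as needed
  tone = new SinOsc(this);
  tone.play();
}

void draw() {
  String line = port.readStringUntil('\n');
  if (line != null) {
    String[] parts = split(trim(line), ',');
    if (parts.length == 2) {
      int colorIndex = constrain(int(parts[0]), 0, notes.length - 1);
      float distance = float(parts[1]);
      tone.freq(notes[colorIndex]);                                   // color picks the pitch
      tone.amp(map(constrain(distance, 5, 50), 5, 50, 0.8, 0.05));    // closer = louder
    }
  }
}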

In my group project 1 I came across two projects that triggered my thinking about the definition of interaction. I’m quoting myself on describing the two projects: “The two projects that triggered my thinking are click canvas by KIMBABSTUDIO (https://www.creativeapplications.net/member-submissions/click-canvas/) and Vivienne La – 21st Century Ghost (https://www.creativeapplications.net/maxmsp/sfpc-spring-2019-student-showcase/). The former aligns with my definition because the users, by clicking on the buttons of the canvas, can create certain picture(s) according to their will; to me that means the user’s involvement is high and creativity appears. However, in the second project, as high as the aesthetic value of the whole piece is, the supposedly “interactive” part of the project doesn’t align with my definition. In this project, when you put a certain number of sticks in a certain way in the vases, different images will be shown on the screen accordingly. This to me is simply like using a knife to cut a piece of wood: an input (force), an output (the wood is cut), and anyone informed enough of the rules knows what the output of a single movement like this will be.” As a result, my project aligns with my definition of interaction because it allows the user to experiment with the relationship between the notes and the colors and create something out of it.

The target audience of my project is the general public, because it provides them with a new way of looking at their senses. By connecting colors and sounds, the project combines the visual and auditory senses, giving people a new experience of sensing everyday objects and prompting them to think about human senses and how they play a part in daily life.

CONCEPTION AND DESIGN with FABRICATION AND PRODUCTION:

One big change during the earlier fabrication process was that we got rid of all excess cables and the two breadboards we were using in order to minimize the size of the product.

Our initial thought was simply that users would grab the device and immediately know where the sensors are and how to scan, so we put everything together in a box with the two sensors facing the same direction. However, during the user test session we received some feedback about how to let users know how to use the device without explicitly telling them what to do, since that is not practical. So we re-designed the device box with a pistol-like handle that the user can grab and immediately know which direction to point. We considered that shapes like a torch or a microphone could do the work too, but in real life there are fewer straight-handled objects that are used to receive information, so the pistol-shaped handle gives people more straightforward instruction than the other shapes.

The reason we used 3D printing for the handle was that it makes it easier to create a shape that is easy and comfortable to grab onto.

Another useful piece of feedback we got in the user test was that the response time of the device was so long that it affected the user experience. In general, the users wanted a more active response from the device, and what we saw at the user test (people almost always switched to another color before the speaker could even play the note) confirmed that. So we modified the music files and shortened the delay time to make the response more active.

CONCLUSIONS:

The goal of the product was to explore how human senses could be extended, and by making a link between the visual and auditory senses, the product achieves this goal. Through using our project and their own creativity, the users can create sound pieces to their liking, which aligns with my definition of interaction perfectly.

If I had more time, there are definitely more things we could improve. First, we would add a button or some other trigger to start the system. It would also be nice to hide the power bank in the handle so the device looks less intimidating and less cyborg-ish. Last but not least, the overall shape could be re-designed to look less like a gun and more like a QR code scanner. My biggest takeaway from this whole process is that there is always another way to achieve a simple goal; it is just a matter of whether I ask people and find out.

To look at it more critically, the project did fulfill most of my goals and definitions overall, but due to the time limit and my lack of mature coding skills, it reacts somewhat sluggishly to the users. That is to say, if we really want to extend our perceptions, we would expect a quicker response, which the project lacks.

In conclusion, the ultimate goal of this project is to give people new experiences of sensing the world and to get them to think about how their senses work.