Reflection on Interaction Lab Project
According to Crawford, interactivity means "a cyclic process in which two actors alternately listen, think, and speak" (p. 5, The Art of Interactive Design). From his definition, we think that interaction has to include input (listen), processing (think), and output (speak).
One project I researched is "The Light Barrier, Third Edition – Drawing Volumes in the Air with Light," in which the creators built a device that uses 3D structured light to draw volumes in the air. In this case, the input is the volume, the processing is the algorithmic calculation of the exact paths of the light rays, and the output is the striking 3D image from the projectors. They used "8 video Panasonic 20,000 lumens WUXGA 4-lamp projectors" and many mirrors to reflect the projected light, building the image by calculating the path of every pixel. A sensor can also detect the sound of the environment.
However, a project called "Massive Murals Are Popping Up Around Detroit" is not what I regard as interactive. It is about collecting and making murals on the walls of Detroit. In this project there is only a speaking (output) segment; no input or processing is involved.
In our project, we imagined a watch that uses sensors to monitor the user's heartbeat, temperature, body chemistry, breathing rate, mood, and so on, and processes that data to predict what food the user wants to eat. We named the watch Watch α.
You can use the watch to check the time, and its battery life is wonderful: it can work for a whole year without charging! Of course, if you want it to function at its best, you should keep wearing it so it can monitor your condition all the time. Its AI module enables it to learn the user's appetite and prepare food in advance, exactly at the moment you want to eat something: it can think what you think, so the food is already done when you want it. In our presentation, we chose three scenarios: homesickness, sickness, and a date night. In the first case, the device picked up voice input and some other data, found that Ruben was homesick, and made hamburgers and chips, the typical food of Ruben's hometown. In the second case, the device mainly picked up the user's body temperature and knew that the user was sick, so it searched for food suitable for a patient and chose chicken soup. For the third case, it took in the tone of the user's voice and the date that had been typed in beforehand.
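To make the listen-think-speak structure of Watch α concrete, here is a minimal, purely hypothetical sketch in Python. The sensor fields, thresholds, and the suggest_meal logic are assumptions invented for illustration; they are not part of any real implementation of the concept.

```python
# Hypothetical sketch of Watch alpha's listen-think-speak loop.
# All sensor readings, thresholds, and dishes below are invented for illustration.

from dataclasses import dataclass


@dataclass
class Readings:
    body_temp_c: float      # listen: body temperature
    heart_rate_bpm: float   # listen: heartbeat
    voice_mood: str         # listen: mood inferred from voice tone


def suggest_meal(r: Readings) -> str:
    """Think: map the monitored condition to a meal to prepare in advance."""
    if r.body_temp_c >= 38.0:          # user seems sick
        return "chicken soup"
    if r.voice_mood == "homesick":     # e.g. Ruben missing home
        return "hamburgers and chips"
    if r.voice_mood == "romantic":     # date night typed in beforehand
        return "candlelight dinner"
    return "usual favorite"


def main() -> None:
    # Speak: the watch tells the kitchen what to cook before the user asks.
    readings = Readings(body_temp_c=38.4, heart_rate_bpm=92, voice_mood="tired")
    print(f"Start cooking: {suggest_meal(readings)}")


if __name__ == "__main__":
    main()
```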
During our discussion, we changed our idea many times. Initially, I came up with the idea of a flexible device inspired by the MI Alpha (a concept phone by Xiaomi). In my mind, it is a device whose screen can be flexed. Its largest form is a really thin laptop or tablet (like the iPad Pro 3 or Surface) that can substitute for laptops and desktop computers; you can use it to work. It can then be folded into a phone form that fits into your pocket. The smallest version is a watch that can "read the user's mind." However, my other group members thought the concept was too complex and could not be easily expressed in the limited performance time, so we simplified the idea into a watch that monitors the user's condition and makes adjustments based on that data.
After our performance, we got feedback suggesting that our emphasis seemed to be on a kitchen that can cook anything at any time. We realized we had not made it clear to the audience that the watch is, in fact, collecting data and making decisions all the time, which is why the meals are cooked in advance. We need to show this to the audience.
Works Referenced
- Crawford, "What Exactly is Interactivity," The Art of Interactive Design, pp. 1-5.
- Filip Visnjic, "The Light Barrier, Third Edition – Drawing Volumes in the Air with Light," https://www.creativeapplications.net/vvvv/the-light-barrier-third-edition-drawings-volumes-in-the-air-with-light/
- Antwaun Sargent, "Massive Murals Are Popping Up Around Detroit," https://www.vice.com/en_us/article/qvv3qp/massive-murals-detroit-library-street-collective