Conception and Design:
I wanted to explore how interaction comes about when users encounter an object in the middle of a room, drawing inspiration from works that explore interaction in public space, such as ‘Marbles’ and ‘Dune’ by Daan Roosegaarde: installations that light up in different ways depending on how passersby interact with them. ‘Dune’, for example, lights up wherever a passerby walks through it, and ‘Marbles’ lights up when passersby touch or come close to one of its devices. I chose to base my project on ‘Marbles’ and create a similar, scaled-down version. Instead of an irregular shape, I used a hollow dome, since its circular base allowed me to place capacitive touch sensors equidistantly; their readings became the hue of an LED strip.

Once the device was built, but before user testing, I planned to produce only two audio outputs from Processing: the sound of a crowd and white noise, symbolizing chaos and tranquility respectively. The crowd sound would play while the lights changed colors chaotically, since the capacitive touch sensors detected wildly different, unstable values. Those unstable values somehow fixed themselves before the presentation, which gave me the result I wanted: colors transitioning smoothly. With this color output, Rudi helped me code a multiple-value serial communication in which two sensors’ values affected the frequency of a sine wave and the other two affected its amplitude, so the device produced a sine wave that actually looked and sounded like it was calming down.

The dome was 3D-printed in white PLA and glued to a white laser-cut circle. An Arduino Uno housed the program, with four capacitive touch sensors and an LED strip connected to it. Each capacitive touch sensor consists of a resistor connected to a wire wrapped in copper tape, which detects changes in the electric field a few inches away from it. Another option for the dome was a set of white acrylic bowls, which I tested the device with but ended up not using because they did not diffuse the light as well as the 3D-printed dome. The first video shows the first iteration of the device, before sound was added; the second shows the final device, after the sine wave from Processing was added.
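A simplified sketch of the Arduino side looks roughly like this. It assumes the CapacitiveSensor and FastLED libraries, and the pin numbers, strip length, estimated sensor ceiling, and the way the four readings combine into a single hue are placeholders rather than the values from the actual build:

```cpp
#include <CapacitiveSensor.h>
#include <FastLED.h>

#define NUM_LEDS  30            // assumed strip length
#define DATA_PIN  6             // assumed LED data pin
const long SENSOR_MAX = 3000;   // estimated ceiling for the raw readings

// each sensor: a send pin and a receive pin joined by a resistor,
// with the copper-tape-wrapped wire on the receive side
CapacitiveSensor sensorA = CapacitiveSensor(4, 2);
CapacitiveSensor sensorB = CapacitiveSensor(4, 3);
CapacitiveSensor sensorC = CapacitiveSensor(4, 5);
CapacitiveSensor sensorD = CapacitiveSensor(4, 7);

CRGB leds[NUM_LEDS];

// constrain a raw reading and map it to 0-255 so it can feed a hue
long readMapped(CapacitiveSensor &s) {
  long raw = s.capacitiveSensor(30);        // 30 samples per reading
  raw = constrain(raw, 0, SENSOR_MAX);
  return map(raw, 0, SENSOR_MAX, 0, 255);
}

void setup() {
  Serial.begin(9600);
  FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS);
}

void loop() {
  long a = readMapped(sensorA);
  long b = readMapped(sensorB);
  long c = readMapped(sensorC);
  long d = readMapped(sensorD);

  // the averaged readings become the hue of the whole strip (HSB)
  fill_solid(leds, NUM_LEDS, CHSV((a + b + c + d) / 4, 255, 255));
  FastLED.show();

  // send the four values to Processing as one comma-separated line
  Serial.print(a); Serial.print(",");
  Serial.print(b); Serial.print(",");
  Serial.print(c); Serial.print(",");
  Serial.println(d);
  delay(20);
}
```

Constraining the reading before mapping it is what keeps an occasional runaway value from pushing the hue outside the 0–255 range.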
Fabrication and Production:
While 3D-printing the dome, the filament broke twice, so I had to switch printers and filament. The dome finished printing just in time to be incorporated into the circuit for the user test.

Regarding the circuit, the sensors I chose gave an unpredictable range of unstable readings, which made it difficult to find a maximum value to map. I ended up constraining the values between 0 and a ceiling I estimated was high enough, then mapping that range to 0–255 so the readings could ‘feed’ a color to the LED strip. The LED strip was configured to work in HSB so that the hues flowed smoothly.

During the first part of the project, the values coming from the sensors were so unpredictable that the dome changed colors almost randomly. Because of that, I programmed the sound output to be the crowd noise and the white noise. During user testing, however, the main feedback I received was that in doing so I had turned the device into an on/off switch between two sounds and killed the sensation of the light calming itself down.

After the user test I found that the sensor values had become stable, which let me constrain and map them more accurately, so the colors actually flowed smoothly. Rudi then helped me code a sine wave whose frequency and amplitude were driven by the values from the different sensors, both decreasing as the user’s hand got closer to the device. Users then said they felt like they were actually calming the device down, which prompted me to record a video of myself interacting with it (shown below) that I later showed during my presentation, since it was the only time the device worked the way I intended. After that, the device started changing colors chaotically again.

During the tests I conducted with this second version of the device, I tried to tell users as little as possible about what to do with it, but I eventually had to give them some information. I think this may be because the tests took place in a classroom rather than in the street or a hallway, where users could have interacted with the device more freely. Even so, I believe I was able to observe how users discovered what the device did, albeit after a little guidance.
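The Processing side of that serial communication looks roughly like the sketch below. It reads the four comma-separated values sent by the Arduino and maps two of them onto the frequency of a SinOsc and the other two onto its amplitude; the serial port index, baud rate, which sensors drive which parameter, and the frequency and amplitude ranges are placeholders rather than the exact values from the final sketch:

```java
import processing.serial.*;
import processing.sound.*;

Serial port;
SinOsc wave;
float[] sensorValues = new float[4];

void setup() {
  size(200, 200);
  printArray(Serial.list());                    // pick the right port; index 0 is a guess
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');                       // fire serialEvent once per line
  wave = new SinOsc(this);
  wave.play();
}

void draw() {
  // two sensors drive pitch, the other two drive loudness (assumed pairing);
  // each value arrives already mapped to 0-255, so a pair spans 0-510
  float freq = map(sensorValues[0] + sensorValues[1], 0, 510, 880, 110);
  float amp  = map(sensorValues[2] + sensorValues[3], 0, 510, 1.0, 0.05);
  wave.freq(freq);
  wave.amp(constrain(amp, 0.0, 1.0));
  background(0);
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  String[] parts = split(trim(line), ',');
  if (parts.length == 4) {
    for (int i = 0; i < 4; i++) {
      sensorValues[i] = float(parts[i]);
    }
  }
}
```

With the output ranges reversed like this, higher readings (a hand near the copper tape) push the tone lower and quieter, which is what makes the sine wave sound like it is calming down.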
Conclusion:
My project aimed to find out how interaction comes about between a user and a device placed in the middle of a public space. Because the 3D printer could only produce small objects, I scaled the setting down from ‘public space’ to ‘tabletop’, which is essentially how the project ended up being laid out. The way users interacted with it makes me think I could investigate my question further with an installation in the middle of the street, which is what I would do if I had the resources and time. I believe the results of my project align with my definition of interaction because it is based on processing an input into an output, between a human and a device.

If I had more time, I would improve the design of the dome that houses the device and give it an irregular shape like those found in nature, such as a rock; use more accurate sensors instead of makeshift ones made out of stripped wire; and produce a wider variety of sounds to make the ‘rock’ seem alive, prompting the user to interact with it and discover more. In conclusion, I believe this project was successful in letting me observe how interaction comes about, but there is still a lot of room for improvement that could let me answer the question better, especially when it comes to users discovering the functions of the device.