Interaction is a process in which two or more actors use a shared language (or method of input) to recognize and react to each other. This definition is based on the one given on page four of Crawford's "What Exactly Is Interactivity?", though I have modified it slightly to include the idea of a shared language. That idea comes from page 19 of Igoe and O'Sullivan's "Introduction to Physical Computing," where the authors include an image of how a computer "sees" us. I believe this concept is important because how a computer perceives and responds to our bodies is a key part of successful interaction. One additional idea worth raising is that interactivity lies on a spectrum, as "What Exactly Is Interactivity?" also proposes: the more recognition that occurs between the user and the device, the more interactive the device feels.
One project that I believe does not fully satisfy this definition of interaction is the "Terra Mars" project by SHI Weili. This project uses AI to map imagery of Earth onto the terrain of Mars. The AI is sophisticated, but it does not truly communicate with humans; rather, it simply translates information into a finished image of a planet. Although humans can toggle the map's features, the two do not operate on a shared scale or "language": the user and the project do not interact through the human's senses, but only through the device's controls.
One project that I believe comes closer to this definition of interactivity is Hayeon Hwang's "Expressive Tactile Controls." This project stages communication between a user and a series of switches, and each switch responds to the user's behavior in its own way (for example, the "Impatient" button needs only the slightest touch to be triggered). The variety of reactions the user and the switches can produce together fits the recognize-and-react loop that characterizes interactivity.
https://www.creativeapplications.net/member-submissions/expressive-tactile-controls/
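As a thought experiment (and not Hwang's actual code), a switch with an "impatient" personality could be approximated with a pressure sensor and an extremely low trigger threshold. The pin assignments, threshold value, and LED response below are assumptions made purely for illustration:

```cpp
// Hypothetical sketch of an "Impatient"-style switch.
// Pins, threshold, and feedback are illustrative assumptions, not the artist's design.
const int sensorPin = A0;          // force-sensitive resistor (assumed wiring)
const int ledPin = 9;              // feedback LED (assumed wiring)
const int impatientThreshold = 5;  // very low: the slightest touch triggers a reaction

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int pressure = analogRead(sensorPin);  // 0-1023 reading from the sensor
  if (pressure > impatientThreshold) {
    digitalWrite(ledPin, HIGH);          // react immediately to the faintest input
    Serial.println("Triggered!");
  } else {
    digitalWrite(ledPin, LOW);
  }
  delay(10);
}
```

The point of the low threshold is that the switch "recognizes" even a hesitant user and reacts, which is what makes its personality legible.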
Our project, the Dreamcatcher, is built on this idea of interaction. On the Dreamcatcher's end, the device recognizes images coming from the user and knits that information together into a complete dream that the user can then peruse. The element that I believe best exemplifies its interactive capability is the Dreamcatcher's ability to expand dreams into a full narrative based on the user's likes and dislikes: the more the Dreamcatcher learns about its user, the more it can tailor its output to that user's individual desires. Under this definition of interactivity, the more mutual recognition (a "shared" language) there is between a device and its user, the stronger the interaction.
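The Dreamcatcher is a speculative device, but the feedback loop it depends on can be sketched in a few lines: every like or dislike nudges a weight, and the highest-weighted themes are favored when the next dream narrative is assembled. The theme names, weights, and update rule here are all made up for illustration:

```cpp
// Illustrative sketch of the Dreamcatcher's like/dislike feedback loop.
// All theme names, weights, and update values are hypothetical.
#include <iostream>
#include <map>
#include <string>

int main() {
  // Each dream theme starts with a neutral weight.
  std::map<std::string, double> themeWeights = {
      {"flying", 1.0}, {"ocean", 1.0}, {"forest", 1.0}};

  // Pretend the user liked "flying" twice and disliked "ocean" once.
  auto like = [&](const std::string& t) { themeWeights[t] += 0.5; };
  auto dislike = [&](const std::string& t) { themeWeights[t] -= 0.5; };
  like("flying");
  like("flying");
  dislike("ocean");

  // The next dream narrative leans on the highest-weighted theme.
  std::string favorite;
  double best = -1e9;
  for (const auto& [theme, weight] : themeWeights) {
    if (weight > best) {
      best = weight;
      favorite = theme;
    }
  }
  std::cout << "Next dream expands on: " << favorite << "\n";
  return 0;
}
```

Even in this toy form, the loop shows why more recognition means more interactivity: each round of feedback gives the device a better model of the user, and each new dream gives the user something more worth responding to.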