I define interaction as a cyclic process of action and reaction between two forces. This is based on the definition of interactivity in “The Art of Interactive Design” (Crawford 3). One example of an interactive project is “Expressive Tactile Controls”; an example of a project that is not interactive is “Terra Mars – ANN’s topography of Mars in the visual style of Earth.”
The Expressive Tactile Controls project fits my definition of interaction because it involves a cyclic process between two forces: the button and the user. The user initiates an action, pressing the button or attempting to press it, and the button responds with a reaction that depends on the emotion it is trying to exhibit. Some buttons shy away to show timidity, while others jump out to show excitement. It is this cycle between the user’s action and the button’s reaction that makes the project interactive.
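To make that cyclic process concrete, here is a minimal sketch of how one of these buttons might behave. The class name, the sensing logic, and the “timid”/“excited” behaviors are my own assumptions for illustration, not the project’s actual code.

```python
import random
import time

class ExpressiveButton:
    """A hypothetical expressive button that reacts when a user reaches for it."""

    def __init__(self, emotion):
        self.emotion = emotion  # e.g. "timid" or "excited"

    def sense_user(self):
        # Stand-in for a real proximity or pressure sensor reading.
        return random.random() < 0.5

    def react(self):
        # The reaction depends on the emotion the button is exhibiting.
        if self.emotion == "timid":
            print("Button retracts and shies away from the finger.")
        elif self.emotion == "excited":
            print("Button pops up eagerly toward the finger.")
        else:
            print("Button simply clicks.")

# The cyclic process: the user acts, the button reacts, and the loop repeats.
button = ExpressiveButton("timid")
for _ in range(3):
    if button.sense_user():   # the user's action (reaching for the button)
        button.react()        # the button's reaction
    time.sleep(0.1)
```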
The Terra Mars project doesn’t fit my definition of interaction because it doesn’t involve a cyclic process of action and reaction. It simply generates a topographical view of Mars modeled after Earth, and the human viewer just looks at the images it creates. The viewer doesn’t interact with it in any way: no action is performed on it, and no reaction is produced. Together, this project and the Expressive Tactile Controls project show the difference between an interactive project and one that is not; one satisfies the part of the definition requiring an action and a reaction, and the other doesn’t.
The definition of interactivity in “The Art of Interactive Design” also requires that the interaction between the forces involve thinking, speaking, and listening (Crawford 5), but I don’t think that is necessary. Expressive Tactile Controls is an example of why: when the user presses a button, the mechanism responds automatically, and no thinking is involved. One might argue that when the button senses the user it is “listening,” and when it reacts it is “speaking,” but that depends on how broadly one defines “listening” and “speaking.”
The project my group designed, “iMirror,” is an interactive design because it contains an AI that interacts with the user through speech. This is a cyclic process of action and reaction: what one force says prompts an answer from the other. The AI can hold a normal conversation with the user, give skin care advice, and answer any questions the user may have. As such, iMirror is an interactive project.
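As a rough illustration of that speak-and-answer cycle, the loop below sketches how iMirror’s conversation could be structured. The function names and the canned replies are placeholders I made up for this example, not our actual implementation.

```python
def listen():
    # Placeholder for speech-to-text; here we just read typed input.
    return input("You: ")

def generate_reply(utterance):
    # Placeholder for the AI that answers questions and gives skin care advice.
    if "skin" in utterance.lower():
        return "Try a gentle cleanser and a moisturizer with SPF."
    return "Tell me more about that."

def speak(reply):
    # Placeholder for text-to-speech output.
    print("iMirror:", reply)

# The cyclic process: the user speaks, iMirror answers, and the cycle continues.
while True:
    utterance = listen()
    if utterance.lower() in ("quit", "exit"):
        break
    speak(generate_reply(utterance))
```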