To me, interaction is simply the process of performing an action and getting some form of feedback in response. This holds for all sorts of interaction, but for machine interaction specifically, the quality of the interaction depends on how comfortable the user is with the machine. The feedback has to be relevant to the user for the product to be considered good and interactive. Getting different feedback depending on the input you give, and getting an accurate response matching what you expected the machine to do, is how I personally judge how interactive a machine or program is.
The two interactive projects that I looked into were the Pom Pom Mirror and Short++. Both projects are cool and hold value, but their interpretations of interaction are very different.

The Pom Pom Mirror aligns closely with my interpretation of interaction: the user stands in front of the mirror and makes any gesture or pose, the mirror picks it up, and the user gets visual feedback that changes according to their movement and stays responsive to new changes. The user lifts their hands, expecting to see the mirror react accordingly, and the mirror does exactly what was expected.

Short++, on the other hand, is a mechanical shoe that can boost the height of the user based on commands given from their phone. I agree that it can be useful for shorter people to perform tasks that require height, but despite how good the idea is, this project's interaction does not align with mine. The shoe is controlled by a phone, which acts as a remote control here; however, a project that truly wants an immersive interaction between technology and human needs a more human-like interactivity. The only reason people choose a remote is that it is easy to make and easy to program so that it accurately passes the user's input to the machine: you simply click a button to give a command, simple and accurate. But if I were to improve this project so that it interacts with the user in a more natural way, I would put sensors on the mechanical shoe; when the sensor data suggests the user wants to lift themselves up, the shoe would perform the lift without phones and buttons getting involved (I sketch this idea in code below).

This is how I came to the conclusion that interaction is supposed to feel natural to the human body and mind. As technologies get better and better, buttons and touch screens should be replaced by sensors that are more human-friendly, able to recognize gestures and the other ways humans communicated and interacted before machines existed.
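To make the sensor idea for the shoe concrete, here is a rough Arduino-style sketch of what I mean. Everything in it is my own assumption, not part of the actual Short++ project: I imagine a force-sensitive resistor under the ball of the foot, so that rising onto your toes (a natural "I want to be taller" gesture) triggers the lift.

```cpp
// Illustrative sketch only -- the FSR, the relay, the pins, and the
// threshold are all my assumptions, not the real Short++ hardware.
// Idea: rising onto the toes creates a pressure spike, which the shoe
// reads as "lift me up"; relaxing lowers it again. No phone needed.

const int FSR_PIN = A0;           // hypothetical pressure sensor under the toes
const int LIFT_PIN = 7;           // hypothetical relay driving the lift motor
const int PRESS_THRESHOLD = 600;  // tune by experiment (analogRead is 0-1023)

void setup() {
  pinMode(LIFT_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int pressure = analogRead(FSR_PIN);
  Serial.println(pressure);        // watch the readings while tuning

  // A pressure spike under the toes suggests the user wants to go up.
  if (pressure > PRESS_THRESHOLD) {
    digitalWrite(LIFT_PIN, HIGH);  // extend the shoe
  } else {
    digitalWrite(LIFT_PIN, LOW);   // retract / stay down
  }
  delay(50);                       // crude debounce so it doesn't flicker
}
```

The point of the sketch is not the specific sensor but the pattern: the machine reads a natural body movement directly, instead of making the user translate their intention into a button press.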
With this concept in mind, my teammates and I came up with the design of a robot that can deliver water to you. It is given human-like features so that the user can expect to interact with it the way they interact with other humans: you can speak to it or communicate with it through body language and hand gestures, and it can also pick up sounds such as clapping or whistling, since these are common ways humans grab each other's attention. These features came directly from what I learned by looking into the two interaction projects: let the user interact with the machine in a natural way, just as they would with another human, by giving the machine the ability to comprehend human action and respond accordingly.
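As a rough illustration of the clap/whistle part, here is a minimal Arduino sketch of how the robot could notice a sharp sound. The sound sensor, the pins, and the threshold are placeholders I made up for the sketch, not our actual build; an LED stands in for whatever response the robot would give.

```cpp
// Illustrative sketch only -- the analog sound sensor, pins, and
// threshold are my assumptions, not our real prototype. An LED on
// pin 13 stands in for the robot's "I heard you" response.

const int MIC_PIN = A0;
const int RESPONSE_PIN = 13;
const int CLAP_THRESHOLD = 700;   // loud, sharp sound; tune by testing

void setup() {
  pinMode(RESPONSE_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int level = analogRead(MIC_PIN);

  // A sharp spike above the threshold is treated as a clap or whistle,
  // much like a person turning their head toward a sudden sound.
  if (level > CLAP_THRESHOLD) {
    digitalWrite(RESPONSE_PIN, HIGH);  // robot acknowledges the user
    delay(1000);                       // hold the response briefly
    digitalWrite(RESPONSE_PIN, LOW);
  }
}
```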
Pictures of our cardboard prototype: