The Idea
Our group met twice before the performance, the first time online via Zoom, where we discussed which story and fictional scenario we wanted to work with. Among the three fictions, we chose the third one, The Plague by Yan Leisheng. In this fiction, a highly contagious virus spreads through the city, and infected people gradually turn to stone within a short period of time. Even after they are completely petrified they are still alive; they simply become a new, slower form of life with its own sense of time. About a month passes between the moment the infection is first noticed and the moment the person is fully petrified. Given the virus, or pandemic, we reasoned that the city needed to be cleaned immediately. So at first we wanted to make a cleaning machine that looked like a tank and could automatically spray cleaning fluid. But then we decided it was not interactive enough, because it was basically just a watering machine, which we could easily build today. And since we were not allowed to use any electricity, it was impossible to make an automated robot. After a little more discussion, we came up with the idea of having a person play the robot, so we only needed to design the robot's functions and appearance. Then, thinking about how to take better care of the people themselves, we changed our plan to a kind of hospice robot: it would help people who were infected and gradually being petrified by identifying the instructions they gave it.
Robot Making Day!
The second meeting of our group was on October 4, the day we built the robot out of cardboard and discussed the details of its functions. Here’s the sketch of the robot our fantastic group member Andy drew. We first found some cardboard in the corner of room 826, and then we divided the work. Sarah and Smile offered to make the glasses; they measured the width of Smile’s face so the glasses would fit her perfectly. Chaoyue and I made the arm and leg parts. It wasn’t very hard, but we needed several pieces to cover her arms and legs well. Andy and Jason worked on the helmet, and they thought seriously about its construction… During the robot-making process, problems inevitably appeared. First, what was each part for? At that point we only had a rough idea and a sketch: we had designed many parts but had no idea what their detailed functions were. And how could we make the robot truly interactive? That was also important. So we listed the crucial parts of the robot and discussed what each one could do specifically. After the discussion, we came up with three main functions for the robot.
The helmet: A detector installed on the helmet lets the robot sense the virus, so as it patrols the streets of the city it can easily find infected people and go help them.
The glasses: The robot wears a pair of glasses that identify people’s instructions. For example, a 30% petrified person will use their arms to give instructions, a 70% petrified person will move their head to show the robot a command, and a 98% petrified person’s eye movements will be read by the glasses.
The scanner: The scanner is attached to the robot’s arm. After detecting an infected person, the robot uses the scanner to determine the level and percentage of petrification so it can provide specific assistance.

Along the way we also ran into problems such as hot glue that wouldn’t hold, pieces cut to the wrong size, limited materials, etc. But since these were only simple technical problems, they were quickly solved 🙂
The robot was finally done! Let me explain its setting and functions in detail. In the scenario of the third fiction, everyone in the city is extremely susceptible to a virus and can turn to stone within a month. So we designed an interactive robot that patrols the city and senses infected people through the sensor on its helmet. It then scans the infected person to determine their level of petrification. Next, it becomes a hospice robot for that person: it asks what help they need, and the infected person gives it instructions through gestures, head turns, eye blinks, and so on. After receiving an instruction, the robot confirms it and then carries it out. Once the person is completely petrified, the robot scans the signal of complete petrification and transfers the person to the crematorium.
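To make this detect–scan–instruct–confirm–execute cycle a bit more concrete, here is a minimal sketch of the logic in Python. The function names, thresholds, and messages are all assumptions invented for illustration; in our performance the "robot" was, of course, Smile in cardboard following this logic by hand.

```python
# A minimal sketch (not actual project code) of the robot's interaction cycle:
# detect -> scan -> receive instruction -> confirm -> execute.
# All names and thresholds below are assumptions for illustration only.

def input_channel(petrified_pct: int) -> str:
    """Pick how the person can still give instructions, based on the scan."""
    if petrified_pct < 10:
        return "none"             # still moves freely, no help needed
    if petrified_pct < 50:
        return "gesture"          # arms still work (e.g. pointing)
    if petrified_pct < 90:
        return "head"             # only the head moves (nod / turn)
    if petrified_pct < 100:
        return "eyes"             # only eye movement and blinking
    return "fully_petrified"

def assist(petrified_pct: int, instruction: str) -> str:
    """One encounter between the robot and an infected person."""
    channel = input_channel(petrified_pct)
    if channel == "none":
        return "No service needed."
    if channel == "fully_petrified":
        return "Complete petrification detected. Transferring."
    # Read the instruction through the chosen channel, confirm, then execute.
    return f"Received '{instruction}' via {channel}. Confirming... executing."

if __name__ == "__main__":
    print(assist(5, ""))                    # Sarah: no help needed
    print(assist(30, "bring the tissue"))   # my scene: gesture instruction
    print(assist(98, "move me left"))       # Andy: eye-movement instruction
    print(assist(100, ""))                  # Jason: fully petrified
```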
So how does this robot relate to my previous research, and why does it fit my definition of interaction?
As I wrote in my previous blog, I view “interaction” as a circular process of “communicate, receive, react,” just like the process of “listen, think, speak”: one side conveys a message, the other side receives it and reacts, and this cycle is the process of interaction. Our robot is very interactive. It scans the infected person and determines the infection level based on the person’s infection signal. More importantly, it responds to the person’s instructions and engages with their needs and commands.
Time to perform!
After finishing the cardboard robot, we considered how to perform and create scenes the audience could easily understand. To involve every group member in the performance, and to better show how the robot identifies the infection level and provides specific help by recognizing different instructions, we decided to create five scenes. Sarah, Chaoyue, Andy, Jason, and I played infected people ranging from mild to severe infection, and Smile played the robot, scanning each of us and providing help. Sarah was the least infected and hadn’t started petrifying yet, so the robot identified that no help was needed. My legs were petrified so I couldn’t move, and the robot judged me as 30% petrified. Chaoyue was more petrified and could only move her head, so she turned it toward the window. Andy could only move his eyes, so the robot read his eye movements and blinks as his instructions. Jason was identified as completely petrified and was transported directly to the crematorium.
Here’s our performance video!
And of course, this robot has its own advantages as well as limitations. Our robot has a complete process of finding the infected, offering help, and ending help; it is also very interactive and fits the fictional scenario well. But it has its shortcomings too: as Professor Minsky commented after our performance, the robot can provide only very limited help to the infected. We could probably think of something more functional or more interactive.
Critical analysis and assessment of “Group Team”’s project! This group chose the first fiction, whose scenario is a highly intelligent society where everything in the “happy house” is interactive. So they designed a sensory interactive device: wearing it, a person immediately enters another world where they can hear, see, smell, and touch anything, giving them a very realistic virtual-world experience. Their group’s performance was also very vivid. The student who put on the device acted out a scene of arriving at the beach: he felt the sea breeze of the virtual world, saw the beautiful sea, touched the seawater, and climbed a coconut tree. Their performance vividly showed the device’s multiple functions. I think their device fits the setting of the fiction because it really is intelligent, and it fits the requirements of this assignment because the person and the device form a connection and interact. But I think it also has shortcomings. For one, the device seems very similar to existing VR devices. Also, the person doesn’t give the device much input; it is mostly the device delivering experiences to the person. These are my brief comments, but I still really like their creativity and performance!
Here's our full script (simplified).

Roles:
Robot: Smile
1st stage (day 1, free to move): Sarah
2nd stage (only upper body can move): Jessie
3rd stage (only head can move): Chaoyue
4th stage (only eyes can move): Andy
5th stage (completely solidified): Jason

Script:

Stage 1
Robot: Diii… Virus infection detected. Target identified. Bibi… Silicon percentage: 5%. Body status: free to move. No service needed.
Sarah: *moves around freely*

Stage 2
Robot: Diii… Virus infection detected. Target identified. Bibi… Silicon percentage: 30%. Body status: legs paralyzed. What can I do for you?
Jessie: *shouts and commands verbally, then points to a tissue*
Robot: Bi… Instruction received. You want the tissue. Option 1: I bring you the object. Option 2: I bring you to the object.
Jessie: *shows “1” with a finger*
Robot: Roger. Mission completed.

Stage 3
Robot: Diii… Virus infection detected. Target identified. Bibi… Silicon percentage: 70%. Body status: paralyzed from the shoulders down. What can I do for you?
Chaoyue: *commands verbally, then turns her head and looks at the window*
Robot: Action identified. Would you like to be transported to the window? Yes, nod your head. No, shake your head.
Chaoyue: *nods yes*
Robot: Roger. Mission completed.

Stage 4
Robot: Diii… Virus infection detected. Target identified. Bibi… Silicon percentage: 98%. Body status: only eyes can move. What can I do for you?
Andy: *moves his eyes to the left for 10 seconds*
Robot: Pupil movement detected. Eyes looking toward the left for more than 10 seconds. Would you like to be transported to the left? Yes, blink once. No, blink twice.
Andy: *blinks once*
Robot: Roger. Mission completed.

Stage 5
Robot: Diii… Virus infection detected. Target identified. Bibibibibibi… Body status: completely solidified. Transferring to petrifaction community. Mission completed. See you.

THE END.