In project 2, my partners Yiting, Sarah, and I built a needy robot together. The concept and the result have evolved a lot since we first came up with the idea. Compared to the first project, we focused much more on playtesting and user feedback, which really pushed our project's development forward.
Concept and prototype
Initially, we had the idea of showing a machine that recognizes, and then forgets, the humans who approach it.
So for the first prototype, we used a laptop connected to a distance sensor and a webcam: it takes a picture once someone approaches and blurs the image out as they walk away.
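To make that behavior concrete, below is a minimal p5.js sketch of the prototype's logic. It is a sketch under assumptions, not our exact code: the 50 cm threshold is made up, and the distance value, which in the real prototype came from the sensor, is stood in for by the mouse position.

```javascript
// Sketch of the first prototype's behavior. Names and the 50 cm threshold are
// assumptions; the real distance reading came from the sensor and is faked
// here with the mouse position.
let cam;          // live webcam feed
let snapshot;     // photo taken when someone comes close
let blurAmount = 0;

function setup() {
  createCanvas(640, 480);
  cam = createCapture(VIDEO);
  cam.size(640, 480);
  cam.hide();
}

function draw() {
  // Stand-in for the distance sensor reading, in cm (smaller = closer).
  const distance = map(mouseY, 0, height, 10, 200);

  if (distance < 50 && !snapshot) {
    snapshot = cam.get();                    // "recognize": take the photo
    blurAmount = 0;
  } else if (distance >= 50 && snapshot) {
    blurAmount = min(blurAmount + 0.2, 12);  // "forget": blur as they walk away
  }

  if (snapshot) {
    image(snapshot, 0, 0);
    filter(BLUR, blurAmount);
  } else {
    image(cam, 0, 0);                        // nobody captured yet: show the live feed
  }
}
```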
From our observations and the feedback during the playtest, we discovered:
- How visibly we place the distance sensor determines how people react to it; in this case, we do not want users to focus on, or even notice, the sensor.
- Building a storytelling context might help users understand the robot and know what to expect from it.
- We still needed to decide what happens after the photo is taken, which was our main focus for the playtest: seeing how people react and observing what they anticipate.
Besides these notes, the users' movements also inspired us. They all tried to wave at the camera, which prompted us to create a robot with one hand raised, suggesting a welcoming invitation to interact.
Process and fabrication
In the following weeks, we fabricated the robot, revised the code in p5.js, connected a GoPro camera to the computer, and built out our storytelling.
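Because the GoPro appears to the computer as just another webcam, one way to select it as the video source in p5.js is to look it up by device label and pass its deviceId to createCapture. The sketch below is a hedged illustration of that step rather than our exact code; the 'GoPro' label match is an assumption, and device labels may be empty until camera permission has been granted once.

```javascript
// Hypothetical way to select the GoPro (seen by the OS as a webcam) in p5.js.
let cam;

function setup() {
  createCanvas(640, 480);
  navigator.mediaDevices.enumerateDevices().then((devices) => {
    // Look for a video input whose label mentions "GoPro" (assumed label).
    const goPro = devices.find(
      (d) => d.kind === 'videoinput' && d.label.includes('GoPro')
    );
    const constraints = goPro
      ? { video: { deviceId: { exact: goPro.deviceId } } }
      : VIDEO; // fall back to the default camera if the GoPro isn't found
    cam = createCapture(constraints);
    cam.size(640, 480);
    cam.hide();
  });
}

function draw() {
  background(0);
  if (cam) image(cam, 0, 0, width, height);
}
```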




Presentation


The form of a robot seemed a lot more relatable to people, and it was interesting to see how quickly users felt compassion for it. We also decided to show a red heart emoji on the screen as the starting state, which turns into a broken heart if the user leaves, to give users a sense of a beginning and an end.
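Under the hood, that beginning-and-end framing is a tiny presence state machine. Here is a minimal p5.js sketch of the idea, again with the distance sensor stood in for by the mouse position; the threshold and text size are assumptions.

```javascript
// Heart / broken-heart framing as a small presence state machine.
let userWasHere = false;   // becomes true once someone has interacted

function setup() {
  createCanvas(640, 480);
  textAlign(CENTER, CENTER);
  textSize(160);
}

function draw() {
  background(255);
  const distance = map(mouseY, 0, height, 10, 200); // stand-in sensor reading (cm)
  const userHere = distance < 50;                   // assumed presence threshold

  if (userHere) {
    userWasHere = true;
    text('❤️', width / 2, height / 2);  // interaction in progress
  } else if (userWasHere) {
    text('💔', width / 2, height / 2);  // the user left: broken heart ends the cycle
  } else {
    text('❤️', width / 2, height / 2);  // starting state: red heart
  }
}
```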


Big Thanks
Big thanks to my partners Sarah and Yiting! I'm so grateful to have had the chance to conduct solid user testing together. I learned a lot from both of them. And thank you as well to everyone who participated in our testing!