I am Not a Robot // BIRS final project documentation by Konrad Krawczyk

idea:

For my BIRS final, I wanted to explore the area of computer vision in robotics. After having worked with robots using the micro:bit and Arduino, I decided it was time to take on something a bit more advanced, namely running machine learning algorithms on the robots themselves so that they can perform advanced analytical tasks. From the beginning, I knew this would require me to learn the Raspberry Pi, as it is the only full-fledged computer among the small boards commonly used in robotics.
Ideally, I wanted my robot to be able to recognise itself in the mirror.

background:


The importance of visual self-recognition is outlined by Jacques Lacan in his theory of the mirror stage. The concept is based on the belief that infants can recognise themselves in the mirror, and that this moment creates a powerful shift in self-perception. According to Lacan, the moment of recognising oneself as an object adds a crucial bit of the self-awareness needed to develop socially, and plays a role in ego formation. Aside from humans, mirror self-recognition has been observed in certain species of great apes.

I would like to look at the symbolic meaning of this moment by applying a seemingly similar mechanism to a robot. This kind of self-recognition would be very different from how an infant's mind works, in the sense that the robot would already have a pre-existing body of object-recognition intelligence, while infants develop it as they go. However, this could be a critical, and maybe somewhat ironic, take on the importance of the "mirror stage".

process:

I started by getting the Raspberry Pi 2. I had very little prior experience working with it, so I tried to install the NOOBS operating system. This, however, failed after several attempts due to a mysterious formatting issue with the SD card.

I then picked up an RPi 1 that had OpenCV preinstalled. It quickly turned out that there was not enough disk space, so I deleted an unrelated tutorial file in PDF format. Unfortunately, this caused the RPi to freeze. I will need to bring it back to usability as soon as possible.

However, this is how I found an easier way to start working on the RPi 2: Raspbian Stretch Lite, a headless, command-line-only operating system. I burned its image onto an SD card and set out to get the Internet working, learning as I went. After half a day of struggle, I finally connected to the NYU Wi-Fi. I installed a list of packages, had the package database get corrupted after a reboot, reinstalled the system, installed the packages again, and finally tried to capture camera footage. I never quite got to that point: I had to reconnect through my own hotspot, after which the database got corrupted yet again. I then had to reinstall the system and the packages once more and try to get the camera footage working over the shared Wi-Fi. Even though I followed the tutorial step by step, in the end there was always a problem with the camera module.
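For reference, headless Wi-Fi on Raspbian Stretch Lite can be configured before first boot by dropping a wpa_supplicant.conf file onto the SD card's boot partition, which Raspbian copies into place on startup. The SSID and password below are placeholders, not my actual credentials, and a WPA2-Enterprise network like NYU's would additionally need eap settings:

```
# /boot/wpa_supplicant.conf -- moved to /etc/wpa_supplicant/ on first boot
country=CN
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    # placeholder credentials -- replace with your own network
    ssid="MyHotspot"
    psk="my-password"
}
```

Creating an empty file named ssh in the same boot partition enables SSH, which is how a headless setup like this can be reached at all.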

However, I did get to run some basic AlphaBot sketches on the robot setup.

joystick control:
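The joystick sketch drives the two wheels from the stick's x/y position. Waveshare's example code handles the GPIO and PWM details; a minimal sketch of just the mixing logic (the function name is my own, not from the AlphaBot library) might look like:

```python
def arcade_mix(x, y):
    """Map a joystick position (x, y), each in [-1, 1], to
    (left, right) wheel speeds, also in [-1, 1].

    y is the forward/backward throttle; x steers by speeding up
    one wheel and slowing down the other.
    """
    left = y + x
    right = y - x
    # Normalise so neither wheel exceeds full speed
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale
```

On the real robot, the two returned values would feed the PWM duty cycles of the left and right motor channels; pushing the stick straight forward, `arcade_mix(0.0, 1.0)`, drives both wheels at full speed.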

obstacle avoidance:
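The AlphaBot's two front IR sensors each report whether something is in range on their side, and the avoidance sketch boils down to a small decision rule. The pin reading and motor calls are omitted here; this is just the logic, under naming of my own choosing:

```python
def avoid_step(left_blocked, right_blocked):
    """Decide the next move from the two IR obstacle sensors."""
    if left_blocked and right_blocked:
        return "reverse"     # boxed in: back away first
    if left_blocked:
        return "turn_right"  # obstacle on the left, steer away
    if right_blocked:
        return "turn_left"   # obstacle on the right, steer away
    return "forward"         # path is clear
```

In the main loop, the chosen action would be translated into wheel commands and re-evaluated on every sensor poll.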

I truly, deeply wish I could have accomplished this task within the allotted time. However, I am determined to continue working on this experiment, as I have become very interested in making web-embedded physical robots. I will keep working on it for as long as I am in Shanghai, or next semester.
