PART A:
I was especially intrigued by the range of emotions the Braitenberg Vehicles could display despite being constructed from just a few basic sensors and motors. Vehicles like “Fear” and “Love” were the starting point for my inspiration; I enjoyed how they used simple processes such as steering away from a stimulus to depict fearfulness, or orienting themselves toward a source they find attractive. Because of this, I wanted to use the bot’s ultrasound sensor, as I thought it would be the most versatile sensor for detecting the surrounding environment.
In terms of emotions, I wanted to imbue my robot with characteristics of curiosity and cautiousness. These two emotions complement each other quite well, and are distinct markers of an active, ‘intelligent’ organism. Curiosity will drive the bot to approach certain objects, while cautiousness will inhibit curiosity, making sure the bot keeps a certain distance from said object. To begin, I quickly created some pseudocode for these behaviors:
#while True:
#    variable = ultrasound sensor distance
#    display the distance value
#    if variable >= desired dist:
#        display happy face
#        move motor 1 forward
#        move motor 2 forward
#        delay
#    if variable <= desired dist:
#        display sad face
#        move motor 1 backward
#        move motor 2 backward
#        delay
Despite its simple makeup, I feel the code captures ‘curiosity’ and ‘cautiousness’ quite well. Both behaviors live inside the while loop, and the distance detected by the ultrasound sensor is stored in a variable. If the robot is far enough from an object, it displays a happy face to show that it is content and moves toward the object, hence the ‘curiosity’. Once it nears the object and crosses the distance threshold, it displays a sad face and moves backward, enacting the ‘cautious’ approach. So if it approaches a human, it will keep backing up as long as the human moves closer, and move forward again if the human backs away; this way, there will always be a certain distance between the bot and the human, whichever direction the human moves. If faced with an immovable object, the bot will be stuck in a loop of moving forward and backward along the same axis.
The actual code:
from microbit import *
import robotbit
import time

while True:
    dist = robotbit.sonar(pin0)      # distance to the nearest object, in cm
    # display.scroll(dist)           # uncomment to check the raw readings
    if dist >= 30:
        # Far enough away: curious, so drive toward the object.
        display.show(Image.HAPPY)
        # robotbit.servo(90, 0)      # head servo, left disabled (see Part B)
        robotbit.motor(1, -105, 0)   # the motors are mounted mirrored,
        robotbit.motor(3, 100, 0)    # so one runs negative to go forward
        # sleep(2)
    if dist <= 25:
        # Too close: cautious, so back away.
        display.show(Image.SAD)
        robotbit.motor(1, 140, 0)
        robotbit.motor(3, -180, 0)
        # sleep(2)
    # Between 25 and 30 cm neither branch fires and the last motor command
    # keeps running, which stops the face from flickering at one threshold.
I first wanted to test whether the ultrasound sensor was working. It was a struggle to find the perfect distance threshold, but I finally managed to get it working and had the LED facial expressions display accordingly.
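If the raw readings ever prove too jumpy to tune a threshold against, one option would be to average a few pings before comparing. This is only a sketch built from the same robotbit.sonar call used above; the sample count of 5 and the 10 ms gap between pings are placeholders that would need tuning:

from microbit import *
import robotbit

def smooth_distance(samples=5):
    # Average several sonar pings to damp out one-off spikes.
    total = 0
    for _ in range(samples):
        total += robotbit.sonar(pin0)
        sleep(10)  # brief pause between pings (ms); placeholder value
    return total / samples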
Then, the next step was to have the motors move either forward or backwards according to the distance:
After completing the LED display and motor steps, I tested it on an immovable object to see if it reacted the way I expected it to:
As you can see, when faced with objects that do not move, the bot gets stuck in a perpetual loop of moving back and forth, which is exactly what I had expected. Finally, I tested it myself by moving my foot closer to and farther from the bot:
From the video, you can see the bot stutter when I quickly move my foot in front of the sensor. I initially wanted to get rid of this, but after further interaction I actually really liked it; the quirk made the bot’s behavior more ‘realistic’, as if it were indecisive about whether or not to approach me. Testing it with my hand was even more fun: as I reached out to it from far away, it happily approached me like a puppy, but once it neared me, it backed away hesitantly, as if unsure about the situation.
I am quite happy with the results.
PART B:
One of my absolute favorite artists, Simon Stalenhag, happens to create work based on human-technology interaction. Simon uses ‘digital painting’ to create his pieces, and he draws inspiration from the Swedish countryside, where he is originally from.
Some select pieces:
What I really enjoy about Simon’s work is that he captures the biological, almost human-like qualities of machines. The way the technology is set against the natural backdrop of the countryside is very intriguing: on one hand, it seems unnatural and unnerving to see giant machines wander across the mountains, but on the other, there is something truly captivating about humans (flesh) integrating technology (artificial) into their bodies and lives. In a way, Simon pushes viewers to question whether sentient machines are truly ‘alive’, or whether technology will always require human life to exist.
With that in mind, I thought about instances where the emotions I chose for my bot appear in real life. I feel that most complex creatures are ‘pre-programmed’ to be both curious and cautious, as these traits were necessary for organisms to grow and evolve over time. Animals need to be cautious to avoid potential dangers and stay alive long enough to pass on their genes to the next generation. However, they also need a sense of curiosity, which enables them to act on their surrounding environment and discover new survival techniques, allowing for adaptation and accommodation in ways that can benefit the entire species.
From this video, we can see that the raccoon exhibits both cautiousness (it is wary of the stranger) and curiosity (it wants to grab the ration and move closer to the stranger). There are countless instances of animals engaging in this kind of behavior, such as squirrels darting away from a passerby who wanders too close, or monkeys gathering around onlookers to see if fruit is available.
In terms of changing my code, I initially had the kitten head servo sweep side to side continuously, as I wanted to add more interactive behavior. However, I realized that since the head essentially is the ultrasound sensor, this would complicate how the bot reacted to incoming stimuli. For example, if an object appeared in front of the bot while the head was facing forward, it would detect it and back away; but if the head were turned to the side, the bot would not see the object and would keep moving forward, colliding with it. As a result, I cut the servo movement and left the ultrasound sensor in a default forward-facing position, so the bot is always reacting to whatever it is facing. In the future, I would like to add multiple ultrasound sensors around the perimeter of the body, so that the head can move freely to express different emotions/states while the bot still senses its surroundings accurately; a single-sensor compromise is sketched below.
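In the meantime, one single-sensor compromise might be to park the wheels, sweep the head through a few angles, and drive based on the closest reading, so the bot never drives blind. This is only a sketch: the sweep angles and servo port are placeholders, and I am assuming robotbit.servo takes a servo port and an angle, which may not match the argument order in the line I commented out earlier.

def scan_closest():
    # Park the wheels so the bot cannot drive while the head is turned away.
    robotbit.motor(1, 0, 0)
    robotbit.motor(3, 0, 0)
    closest = 999
    for angle in (45, 90, 135):   # placeholder sweep angles; 90 = straight ahead
        robotbit.servo(0, angle)  # placeholder servo port
        sleep(300)                # let the head settle before pinging
        dist = robotbit.sonar(pin0)
        if dist < closest:
            closest = dist
    robotbit.servo(0, 90)         # re-center the head before driving again
    return closest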
Ultimately, I felt that this lab went better than the previous ones, partly because the Python editor was much faster to work with than the ‘block code’ approach we had to use before. I really enjoyed using the ultrasound sensor, and I hope to experiment with more sensors (photocell, sound, etc.). One interesting aspect I hope to expand upon is imbuing the bot with more behaviors, such as moving the servo head while still detecting oncoming stimuli, and using the light sensor to create different interactions at varying light levels.
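As a first pass at that light-level idea, the micro:bit’s LED grid can double as a rough light sensor through display.read_light_level(). Here is a minimal sketch under that assumption (the cutoff of 100 is a placeholder) that parks the bot in the dark and runs the same curious/cautious loop in brighter light:

from microbit import *
import robotbit

while True:
    light = display.read_light_level()   # 0 (dark) to 255 (bright)
    if light < 100:                      # placeholder cutoff; needs tuning
        # Too dark to explore: doze off with the motors parked.
        display.show(Image.ASLEEP)
        robotbit.motor(1, 0, 0)
        robotbit.motor(3, 0, 0)
    else:
        # Bright enough: run the same curious/cautious check as before.
        dist = robotbit.sonar(pin0)
        if dist >= 30:
            display.show(Image.HAPPY)
            robotbit.motor(1, -105, 0)
            robotbit.motor(3, 100, 0)
        if dist <= 25:
            display.show(Image.SAD)
            robotbit.motor(1, 140, 0)
            robotbit.motor(3, -180, 0)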