Final Project: Documentation

By Gabrielle Branche

Synopsis:

My final project was to create a fully functioning robotic moth. My report explains the process and includes the code for my final; see the report here. Photos and videos of the moth at different stages of its development can be found here.

Reflection:

This final project served as a way to tie together all the aspects of bio-inspired robotics that we learned in this class. First, I wanted to perfect the behavior of my moth, since we had discussed that intelligence is a large factor in making a machine a robot. Although my robot did not have artificial intelligence, I hoped it could respond to its environment using hard-coded rules.

Second, I looked at locomotion. While my robot still had wheels, I aimed to explore its true movement by looking at the flapping of its wings. I also tried to tie in what I learned in my other class by laser cutting and manually building the wings. This way my robot could be more authentic and go beyond behavior alone.

Finally, had time allowed, I would have used the webcam for object recognition, but this is now an extension and improvement should I decide to develop this bot further.

This has been a great learning experience for me, from questioning what makes a robot a robot to details such as using functions to polish and increase the robustness of my code. Most importantly, I learned that to truly get into bio-inspired robotics, observation, research, and prototyping are the key steps to building something that has the potential to be great.

BIRS Midterm Documentation

Assignment Overview

The goal of the midterm project was for us to create robotic implementations of a living organism. We had to choose an animal behavior and program our Kittenbots to mimic that behavior.

Animal Behavior

In my research, I looked at a variety of different insects because I was fascinated by their movement patterns, from ant farms to synchronous fireflies. I finally decided to model my project after termite behavior, specifically the termite ink experiment. I found it fascinating that termites communicate solely by smelling and secreting pheromones, which can be manipulated by drawing a line of ink on a surface.

Programming Process

My plan of action was to make my Kittenbot wiggle along a line, which is a seemingly simple task, but I did experience challenges along the way. I chose a simpler behavioral concept because my goal was to make sure that I had a working project, and also to become comfortable with programming the micro:bit and Robotbit.

During my first trials, my robot would not recognize the line correctly and would spin around in circles instead; it also moved backwards. This confused me because I had downloaded the code online.

https://youtu.be/W_mZL39oZ88

However, I then took the time to tinker with the code in the MakeCode editor and test it after each change I made. This really helped me become more comfortable with programming and understand how the components worked together. I also had to consider the track I was creating for the robot. I built my line track out of electrical tape. My first iterations had problems because I did not build a proper track: I had to make sure that the track was wide, with obtuse angles at the turning points, and that the entire track was on one surface with adequate contrast between the tape and the surface.
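While the midterm itself was programmed in the MakeCode block editor, the logic reduces to a short loop. Below is a minimal MicroPython sketch of the same line-following idea; the two line sensors, their pin assignments, and the motor-to-wheel mapping are assumptions for illustration, not my exact wiring:

from microbit import *
import robotbit

# Hypothetical wiring: two IR line sensors on pin1 (left) and pin2 (right),
# with motor 1 driving the left wheel and motor 3 the right.
# Assume a digital reading of 0 means the sensor sees the black tape.
while True:
    left_on_line = pin1.read_digital() == 0
    right_on_line = pin2.read_digital() == 0

    if left_on_line and right_on_line:   # centred on the tape: drive straight
        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
    elif left_on_line:                   # drifting right: slow the left side
        robotbit.motor(1, 40, 0)
        robotbit.motor(3, 100, 0)
    elif right_on_line:                  # drifting left: slow the right side
        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 40, 0)
    else:                                # off the line: spin slowly to search
        robotbit.motor(1, 60, 0)
        robotbit.motor(3, -60, 0)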

Final Product

Below is the finished project:

https://youtu.be/u5Bpz235uis

Midterm Project: You Got Any Light

By: Gabrielle Branche

Synopsis:

For my midterm project I decided to explore the behaviour of a moth. After doing research I realised that the greater wax moth has very distinct behaviour: after mating, the females stay in the hive and oviposit there. And as with most moths, while they do not seek out light, they are drawn to it once exposed to a bright enough source. As such, I tried to simulate these behaviours using a light sensor, an IR sensor, an ultrasound sensor, and Neopixel lights.

Inspiration:

I initially looked at Braitenberg’s creatures and was interested in the Paranoid vehicle. I liked the way it got trapped in a loop of wanting to explore but then getting scared. I was fascinated by the patterns it would make if it were placed in a spotlight and had to find its way back to the light from the darkness. However, once I actually executed this, I noticed that it simply circled around the light. When discussing with classmates, we thought about how its being drawn to the light was similar to a moth. From there the process of making a moth developed.

The Process:

The first piece of code I worked on was for the light sensor. I programmed the Kittenbot to circle a bright light source by making it turn whenever the reading fell below a certain threshold (see #bright). It is not drawn to the light but rather gets trapped in the bright light source, since the sensor is placed directly in the centre of the bot. Once the light source is removed, it continues on its way.
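Stripped down, this module is just a threshold check that switches the motors into a tight turn whenever the photosensor reading drops below 50 (a low analog reading meaning bright light), as in the combined code at the end:

from microbit import *
import robotbit

while True:
    light = pin2.read_analog()       # low reading = bright light
    if light < 50:                   # bright: circle within the beam
        robotbit.motor(1, -105, 0)   # wheels in opposite directions...
        robotbit.motor(3, 10, 0)     # ...at unequal speeds, so the bot arcs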

Next I worked with the ultrasound sensor to make the bot responsive to objects. At first I wanted the objects to represent food sources; however, according to Nielsen, the greater wax moth mates at least once a day (Nielsen, 1977). As such, I decided to have the objects represent mating locations (sites of male moths). Similar to the photosensor, this code runs only below a certain threshold, when the ultrasound detects an object within close range.

The ultrasound was at first difficult to work with because the range was hard to manipulate. I realised that part of the problem was that when nothing is in range the ultrasound reads 0. After changing the code to accommodate this (see #ultrasound), it worked better. Nevertheless, it is still prone to stopping randomly and carrying out the ultrasound module of the code; the code would need to be made more robust to avoid these outliers.
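The fix was to treat a reading of 0 as "nothing in range" rather than as an object at distance zero, by adding a compound condition:

from microbit import *
import robotbit

while True:
    dist = robotbit.sonar(pin0)      # returns 0 when no echo comes back
    if dist < 50 and dist != 0:      # only a real object within range counts
        robotbit.motor(1, 0, 0)      # stop at the mating site
        robotbit.motor(3, 0, 0)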

I used the Neopixel strip to represent fertilisation. After mating, the bot produces eggs, shown by the pink lights at the back of the bot. Once the moth is fully fertilised, the lights turn green, signifying that it can now move about the hive. While moving about, it loses eggs due to ovipositing, so the lights go out at a random rate as the bot moves around. Only once it has finished ovipositing will it detect other mating areas.
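In code, fertilisation and ovipositing amount to the four egg pixels being filled green and then switched off one at a time with a random delay between drops. Here is that idea condensed into loops; the combined code at the end spells each pixel out:

from microbit import *
import robotbit
import random
import neopixel

np = neopixel.NeoPixel(pin16, 4)

# Fully fertilised: four green egg pixels.
for i in range(4):
    np[i] = (0, 255, 0)
np.show()

# Wander and oviposit: drop the eggs one by one at a random rate.
robotbit.motor(1, -95, 0)
robotbit.motor(3, -90, 0)
for i in (3, 2, 1, 0):                   # back of the strip first
    sleep(random.randint(0, 20) * 100)   # random delay before each drop
    np[i] = (0, 0, 0)                    # egg dropped
    np.show()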

To develop this project further, I could distinguish between objects so that the bot avoids all obstacles unless ovipositing is complete, in which case it would stop at mating stations. However, that would require more detailed code and potentially machine learning to execute.

Finally, the code for the barrier: since fertilised moths remain in the hive while male and virgin moths go out in the early morning and at night, my intention was to have the bot be constrained within a box (the hive). The code works by itself (see video 3). However, once put into the full code it no longer runs. This may be because the Neopixel module has sleep calls (see #ovipositing) that prevent the IR sensor from being read. Still, this should not be the case, since the ultrasound and light modules are not affected and I programmed ovipositing to occur very quickly to limit sleep time. I hope to debug this in the future.
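On its own, the barrier module is a short sketch like the following (same thresholds as the combined code below): the idea is that the bot drives off the tape marking the hive wall and then turns to face a new direction.

from microbit import *
import robotbit
import random

while True:
    border = pin1.read_analog()       # IR sensor facing the floor
    if border < 200:                  # black tape: the hive wall
        robotbit.motor(1, 100, 0)     # drive off the tape for 2 s
        robotbit.motor(3, 100, 0)
        sleep(2000)
        robotbit.motor(1, -105, 0)    # then turn to face a new direction
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 5) * 1000)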

Reflection:

This project was enjoyable to work on and I believe I have learnt a great deal. There were many moments when I was very frustrated, but on the whole it helped me grow as a coder and better understand animal behaviour. I believe it has the potential to become a robust robot that can actually simulate a moth in its entirety, although flight may be a challenge. The next steps after debugging would be increasing its awareness of its environment and, artistically speaking, giving the robot a more moth-like appearance.

Videos:

The videos are too large to embed; see them here.

Code:

Below is the final code combined:

from microbit import *
import robotbit
import random
import neopixel

# Set up the Neopixel strip on pin16 with a length of 4 pixels
np = neopixel.NeoPixel(pin16, 4)

# Start fully fertilised: all four egg pixels green
np[0] = (0, 255, 0)
np[1] = (0, 255, 0)
np[2] = (0, 255, 0)
np[3] = (0, 255, 0)
np.show()
sleep(random.randint(0, 3) * 1000)


while True:
    border = pin1.read_analog()  # IR sensor: detect the hive border
    dist = robotbit.sonar(pin0)  # ultrasound: determine object distance
    light = pin2.read_analog()   # determine light intensity

    # light sensor
    if light < 50:  # bright
        # display.show(Image.HAPPY)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)

    # IR sensor
    elif border < 200:  # black
        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(2000)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 5) * 1000)

    # ultrasound sensor
    elif dist < 50 and dist != 0:
        robotbit.motor(1, 0, 0)  # stop at the mating site
        robotbit.motor(3, 0, 0)

        # clear the strip before mating
        np[0] = (0, 0, 0)
        np[1] = (0, 0, 0)
        np[2] = (0, 0, 0)
        np[3] = (0, 0, 0)
        np.show()
        sleep(500)

        # fill the strip pink, pixel by pixel, as the eggs are produced
        np[0] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[1] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[2] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[3] = (255, 0, 128)
        np.show()
        sleep(1000)

        # fully fertilised: all pixels turn green
        np[0] = (0, 255, 0)
        np[1] = (0, 255, 0)
        np[2] = (0, 255, 0)
        np[3] = (0, 255, 0)
        np.show()

        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(500)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 3) * 1000)

    else:  # ovipositing
        robotbit.motor(1, -95, 0)
        robotbit.motor(3, -90, 0)

        sleep(random.randint(0, 10) * 100)

        # drop the eggs one at a time, at a random rate
        np[3] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20) * 100)

        np[2] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20) * 100)

        np[1] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20) * 100)

        np[0] = (0, 0, 0)
        np.show()
        # sleep(random.randint(0, 10) * 100)

References:

Lilienthal, A. and Duckett, T. (2004). Experimental analysis of gas-sensitive Braitenberg vehicles. Advanced Robotics, 18(8), pp.817-834.

Nielsen, R. and Brister, D. (1977). The Greater Wax Moth: Adult Behavior. Annals of the Entomological Society of America, 70(1), pp.101-103.

Midterm Proposal

By: Gabrielle Branche

For my midterm I plan to explore the behavior of a moth using Braitenberg’s Paranoid creature. My moth should be able to follow light and then circle around the light source. Once I perfect this, I will try to expand my project so that my moth can find food. This will be simulated by the moth detecting ‘flowers’, staying by them, and then turning away from them, unless there is a light source, in which case the original code will take priority.