Midterm Project: You Got Any Light

By: Gabrielle Branche

Synopsis:

For my midterm project I decided to explore the behaviour of a moth. After doing research I realised that the Greater Wax Moth has very distinct behaviour: after mating, the females stay in the hive and oviposit there. And, as with most moths, while they are not actively drawn towards light, they become attracted once exposed to a bright light source. I tried to simulate these behaviours using a light sensor, an IR sensor, an ultrasound sensor and NeoPixel lights.

Inspiration:

I initially looked at Braitenberg’s creatures and was interested in Paranoid. I liked the way it got trapped in a loop of wanting to explore but then getting scared. I was fascinated by what patterns it would make if it were placed in a spotlight and had to make its way back to the light from the darkness. However, once I actually executed this, I noticed that it simply circled around the light. When discussing this with classmates, we thought the way it was drawn to the light resembled a moth. From there the process of making a moth developed.

The Process:

The first piece of code I worked on was for the light sensor. I programmed the Kittenbot to circle a bright light source by making it circle whenever the reading falls below a certain light threshold (see #bright). It is not drawn to the light but rather gets trapped in the bright light source, since the sensor is placed directly in the centre of the bot. Once the light source is removed, it continues on its way.
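The trap behaviour boils down to a single threshold check each pass through the main loop. Pulled out as a standalone function for clarity (the function name and action strings are mine, for illustration; the threshold of 50 matches the final code below):

```python
BRIGHT_THRESHOLD = 50  # the photoresistor reads LOW analog values in bright light

def light_action(light_reading):
    """Map a photoresistor reading to a drive action.

    A reading below the threshold means the bot is in the spotlight,
    so it circles in place (one wheel forward, one nearly stopped);
    otherwise it carries on with its other behaviours.
    """
    if light_reading < BRIGHT_THRESHOLD:
        return "circle"
    return "wander"
```

Because this check comes first in the if/elif chain, being in the spotlight takes priority over every other behaviour.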

Next I worked with the ultrasound sensor to make the bot responsive to objects. At first I wanted the objects to represent food sources; however, the greater wax moth mates at least once a day (Nielsen, 1977), so I decided to have the objects represent mating locations (sites of male moths) instead. Similar to the photo sensor, this branch of the code runs only below a certain threshold, i.e. when the ultrasound detects an object within close range.

The ultrasound was at first difficult to work with because the range was hard to tune. I realised that part of the problem was that the ultrasound reads 0 when nothing is in range. After changing the code to accommodate this (see #ultrasound) it worked better. Nevertheless, it is still prone to stopping randomly and running the ultrasound branch of the code; the code would need to be made more robust to filter out these outliers.
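The zero-reading fix is just an extra guard on the range test. As a standalone sketch (the function name is mine; the threshold of 50 matches the final code):

```python
def mate_site_detected(dist, threshold=50):
    """True if the sonar reports an object within range.

    robotbit.sonar() returns 0 when no echo comes back, so a plain
    `dist < threshold` would wrongly treat "nothing there" as "very
    close". Excluding 0 fixes that.
    """
    return dist != 0 and dist < threshold
```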

I used the NeoPixels to represent fertilisation. After mating, the bot produces eggs, shown by the pink lights at the back of the bot. Once the moth is fully fertilised, the lights turn green, signifying that it can now move about the hive. While moving about, it loses eggs due to ovipositing, so the lights drop off at a random rate as the bot moves around. Only when it has finished ovipositing will it detect other mating areas.
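The egg bookkeeping described here can be sketched as a tiny state object (the class and its names are mine; the final code below tracks the same state implicitly through which NeoPixels are lit):

```python
import random

class MothState:
    """One egg per NeoPixel: four after mating, zero once ovipositing is done."""

    def __init__(self):
        self.eggs = 0

    def mate(self):
        self.eggs = 4  # fully fertilised: all four lights lit

    def oviposit(self):
        # While wandering, eggs drop off at a random rate
        if self.eggs > 0 and random.random() < 0.5:
            self.eggs -= 1

    def seeking_mate(self):
        # Only after all eggs are laid does the bot respond to mating sites
        return self.eggs == 0
```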

To develop this project further I could have the bot distinguish between objects, so that it avoids all obstacles unless ovipositing is complete, in which case it stops at the mating stations. However, that would require more detailed code and potentially machine learning to execute.

Finally, the code for the barrier. Since fertilised moths remain in the hive while male and virgin moths go out in the early morning and at night, my intention was to have the bot constrained within a box (the hive). The code works by itself (see video 3); however, once put into the full code it no longer runs. This may be because the NeoPixel routines contain sleep calls (see #ovipositing), which could be preventing the IR sensor from being read. Still, this should not be the case, since the ultrasound and light sensors are not affected and I programmed ovipositing to occur very quickly to limit the sleep time. I hope to debug this in the future.
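If the blocking sleep() calls do turn out to be the culprit, one common fix is to replace them with deadline checks, so the main loop keeps polling every sensor while the lights count down. A minimal sketch of that pattern, written in plain Python with the current time passed in (on the micro:bit, running_time() would supply it):

```python
class EggTimer:
    """Fires once per interval without ever blocking the caller."""

    def __init__(self, now_ms, interval_ms=1000):
        self.interval_ms = interval_ms
        self.deadline = now_ms + interval_ms

    def due(self, now_ms):
        """Return True once the interval has elapsed, then re-arm.

        The main loop calls this on every pass; on True it drops one
        egg light, and in between it is free to read the IR sensor.
        """
        if now_ms >= self.deadline:
            self.deadline = now_ms + self.interval_ms
            return True
        return False
```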

Reflection:

This project was enjoyable to work on and I believe I have learnt a great deal. There were many moments when I was very frustrated, but on the whole it helped me grow as a coder and better understand animal behaviour. I believe it has the potential to become a robust robot that can simulate a moth in its entirety, although flight may be a challenge. The next step after debugging would be increasing its awareness of its environment and, artistically speaking, giving the robot a more moth-like appearance.

Videos:

The videos are too large to embed; see them here.

Code:

Below is the final code combined:

from microbit import *
import robotbit
import time
import random
import neopixel

# Set up the NeoPixel strip on pin16 with a length of 4 pixels
np = neopixel.NeoPixel(pin16, 4)

# Start fully fertilised: all four egg lights green
np[0] = (0, 255, 0)
np[1] = (0, 255, 0)
np[2] = (0, 255, 0)
np[3] = (0, 255, 0)
np.show()
sleep(random.randint(0, 3)*1000)


while True:
    border = pin1.read_analog()  # IR sensor reading (low over the black border)
    dist = robotbit.sonar(pin0)  # object distance from the ultrasound
    light = pin2.read_analog()  # light intensity (low = bright)

    # light sensor
    if light < 50:  # bright: circle in place
        # display.show(Image.HAPPY)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)

    # IR sensor: turn back from the hive border
    elif border < 200:  # black
        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(2000)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 5)*1000)

    # ultrasound sensor: mating site in range (0 means nothing detected)
    elif dist < 50 and dist != 0:
        robotbit.motor(1, 0, 0)
        robotbit.motor(3, 0, 0)

        # clear the lights, then lay four pink eggs one by one
        np[0] = (0, 0, 0)
        np[1] = (0, 0, 0)
        np[2] = (0, 0, 0)
        np[3] = (0, 0, 0)
        np.show()
        sleep(500)

        np[0] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[1] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[2] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[3] = (255, 0, 128)
        np.show()
        sleep(1000)

        # fully fertilised: all lights green, free to move about the hive
        np[0] = (0, 255, 0)
        np[1] = (0, 255, 0)
        np[2] = (0, 255, 0)
        np[3] = (0, 255, 0)
        np.show()

        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(500)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 3)*1000)

    else:  # ovipositing: drop the egg lights one by one at a random rate
        robotbit.motor(1, -95, 0)
        robotbit.motor(3, -90, 0)

        sleep(random.randint(0, 10)*100)

        np[3] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[2] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[1] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[0] = (0, 0, 0)
        np.show()
        # sleep(random.randint(0, 10)*100)

References:

Lilienthal, A. and Duckett, T. (2004). Experimental analysis of gas-sensitive Braitenberg vehicles. Advanced Robotics, 18(8), pp.817-834.

Nielsen, R. and Brister, D. (1977). The Greater Wax Moth: Adult Behavior. Annals of the Entomological Society of America, 70(1), pp.101-103.
