Final Project: Documentation

By Gabrielle Branche

Synopsis:

My final project was to create a fully functioning robotic moth that behaves as a coherent whole. My report explains the process and contains the code for my final. See report here. Photos and videos of the moth at different stages of its development can be found here.

Reflection:

This final project served as a way to tie together all the aspects of bio-inspired robotics that we learned in this class. Firstly, I wanted to perfect the behavior of my moth, since we discussed that intelligence is a large factor in making a machine a robot. Although my robot does not have artificial intelligence, I hoped that it could respond to its environment through hard-coded behaviors.

Secondly, I looked at locomotion. While my robot still has wheels, I aimed to explore its true movement by looking at the flapping of its wings. I also tried to tie in what I learned in my other class by laser cutting and manually building the wings. This way my robot could be more authentic and go beyond behavior.

Finally, had time allowed, I would have used the webcam for object recognition, but this is now an extension and improvement should I decide to develop this bot further.

This has been a great learning experience for me, starting with questioning what makes a robot a robot, down to details such as using functions to polish and increase the robustness of my code. Most importantly, I learned that to truly get into bio-inspired robotics, observation, research, and prototyping are the key steps to building something that has the potential to be great.

Final Project Proposal

By: Gabrielle Branche

Introduction:

For my midterm project I looked at the behavioral characteristics of a moth. The final result was a robot that could move around, deposit ‘eggs’ (represented by neopixel lights), and be fertilized at specific mating sites. In the second half of the semester we spent a great deal of time working on locomotion and looking more closely at the mechanics of movement. This got me thinking about the design of my moth. As such, I want to use my final project to perfect my moth, both from a behavioral point of view and by implementing locomotive characteristics.

The moth was supposed to stay within the confines of its ‘hive’ (see midterm documentation), but that code did not work alongside the rest. This is the first aspect of my moth that I would like to fix, so that, in alignment with the giant [insert name of moth here], the moth behaves accurately as a whole.

After that I would like to design wings that flap similarly to those of a moth and that can be attached to the KittenBot. Of course, determining the aerodynamics to actually make the robot fly would be beyond the scope of this course. However, I believe it is within my capacity to make wings that flap while the bot is moving and pause when it stops.

Observation:

My first step was to find videos that could show me exactly how moth wings work. The two best videos, which show their movement in slow motion, are shown below:

From observing these videos, I realized that moths have a very even up-and-down movement, almost as if both wings are controlled by a single axis. This would be important to take into consideration when creating my own wings.

Research:

After researching the aerodynamics of moths, I learnt that the wings need some level of flexibility to allow for a fluid flapping motion (Smith, 1996). Additionally, a study of hawkmoth flight indicated that the angle of rotation of the wings is not constant and changes with speed and flight distance (Willmott and Ellington, 1997). Finally, my suspicion about the consistency of the two sides of the wings was also confirmed in this study, which showed that both sides were coupled very evenly (Willmott and Ellington, 1997).

Modelling:

To model these wings, I will use the prototyping skills we learnt to create different versions. My first version will most likely use cardboard, to fully understand the flapping mechanism. From there I hope to move to a 3D-printed version, which will allow for a more flexible material that satisfies the aerodynamic needs.

I found a video that I can use to model the wings; it uses simple servo mechanics on a single axis to simulate flapping.
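The single-axis idea can be sketched as a simple triangular wave that drives both wings from one shared angle, mirroring the even, coupled motion seen in the observation videos. This is only a sketch: `flap_angle` is a name of my own choosing, and on the KittenBot the returned angle would be sent to a wing servo through the robotbit library.

```python
# Minimal sketch of single-axis flapping: one angle drives both
# wings, so their motion stays evenly coupled.

def flap_angle(t_ms, period_ms=400, amplitude=60, centre=90):
    """Triangular wave: wing servo angle in degrees at time t_ms."""
    phase = (t_ms % period_ms) / period_ms   # 0.0 .. 1.0 through one flap
    if phase < 0.5:
        frac = phase * 2.0                   # upstroke
    else:
        frac = (1.0 - phase) * 2.0           # downstroke
    return centre - amplitude / 2.0 + amplitude * frac

# Both wings read the same value, so they never drift apart:
assert flap_angle(100) == flap_angle(100)
```

Pausing the flap when the bot stops would then just mean holding the last angle instead of advancing `t_ms`.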

Timeline and challenges:

I have currently done the research needed to implement this project. The next step is to start the design. My hope is to have my first prototype by Tuesday’s class, which I can then perfect for Thursday’s class, giving me the weekend to produce the final result and focus on the paper.

The biggest challenge for this project will be the coding, as I am still not the strongest coder. However, I believe that with the help of the video showing how the wings were made, and with time budgeted for improvement, I can get this project fully put together.

Benefit and Importance:

This project will improve my understanding of code and mechanics. Moreover, I believe that the beauty of robots is that, like biological systems, they exist with a level of robustness that leaves room for adaptability. My hope is that by choosing to continue with the moth I can finish the semester with a complete robot that stands on its own and that I am proud of.

References:

Gopalakrishnan, Pradeep, and Danesh K. Tafti. “Effect Of Wing Flexibility On Lift And Thrust Production In Flapping Flight”. AIAA Journal, vol 48, no. 5, 2010, pp. 865-877. American Institute Of Aeronautics And Astronautics (AIAA), doi:10.2514/1.39957.

Smith, Michael J. C. “Simulating Moth Wing Aerodynamics – Towards The Development Of Flapping-Wing Technology”. AIAA Journal, vol 34, no. 7, 1996, pp. 1348-1355. American Institute Of Aeronautics And Astronautics (AIAA), doi:10.2514/3.13239. Accessed 5 May 2019.

Willmott, Alexander P., and Charles P. Ellington. “The Mechanics of Flight in the Hawkmoth Manduca Sexta”. The Journal of Experimental Biology, vol 200, 1997, pp. 2705–2722. Accessed 5 May 2019.

Midterm Project: You Got Any Light

By: Gabrielle Branche

Synopsis:

For my midterm project I decided to explore the behaviour of a moth. After doing research I realised that the Greater Wax Moth has very distinct behavior: after mating, the females stay in the hive and oviposit there. And, as with most moths, while they are not actively drawn to light, they become attracted once exposed to a bright light source. I tried to simulate these behaviours using a light sensor, an IR sensor, an ultrasound sensor, and neopixel lights.

Inspiration:

I initially looked at Braitenberg’s creatures and was interested in Paranoid. I liked the way it got trapped in a loop of wanting to explore but then getting scared. I was fascinated by the idea of seeing what patterns it would make if it were placed in a spotlight and had to make its way back to the light from the darkness. However, once actually executing this I noticed that it simply circled around the light. When discussing this with classmates, we thought about how its attraction to the light was similar to that of a moth. From there the idea of making a moth developed.

The Process:

The first code I worked with was for the light sensor. I programmed the KittenBot to circle a bright light source by making it circle below a certain light threshold (see #bright). It is not drawn to the light but rather gets trapped in the bright light source, since the sensor is placed directly in the centre of the bot. Once the light source is removed, it continues on its way.

Next I worked with the ultrasound sensor to make the bot responsive to objects. At first I wanted the objects to represent food sources; however, according to Nielsen, the Greater Wax Moth mates at least once a day (Nielsen, 1977). I therefore decided to have the objects represent mating locations (sites of male moths). Similar to the photo sensor, this code runs only below a certain threshold, when the ultrasound detects an object within close range.

The ultrasound was difficult to work with at first because the range was hard to manipulate. I realised that part of the reason is that when nothing is in range the ultrasound reads 0. After changing the code to accommodate this (see #ultrasound) it worked better. Nevertheless, it is still prone to stopping randomly and running the ultrasound module of the code. The code would need to be made more robust to avoid these outliers.
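The zero-reading fix described above can be isolated into a small predicate, which makes the intent of the `dist < 50 and dist != 0` check explicit. This is a sketch; `at_mating_site` is a hypothetical name of my own choosing.

```python
# The sonar reads 0 when nothing is in range, so a bare `dist < 50`
# check would fire even with nothing ahead. Guard against the zero
# reading before comparing with the threshold.

def at_mating_site(dist_cm, threshold_cm=50):
    """True only for a real echo inside the threshold distance."""
    return dist_cm != 0 and dist_cm < threshold_cm

assert not at_mating_site(0)    # no echo: nothing in range
assert at_mating_site(30)       # object within close range
assert not at_mating_site(120)  # object too far away
```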

I used the neopixels to represent fertilization. After mating, the bot produces eggs, shown by the pink lights at the back of the bot. Once the moth is fully fertilised, the lights turn green, signifying that it can now move about the hive. While moving about, it loses eggs due to ovipositing; as the bot moves around, the lights go out at a random rate. Only when it has finished ovipositing will it detect other mating areas.
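The egg-dropping order and random timing can be sketched in plain Python as a schedule of (pixel, delay) pairs, matching the back-to-front `np[3]` to `np[0]` order in the full code below. `oviposit_schedule` is a hypothetical helper name; on the bot, each pair would become one "turn pixel off, then sleep" step.

```python
import random

# Sketch of the ovipositing logic: pixels go out back to front,
# each after its own random delay, so eggs drop at a random rate.

def oviposit_schedule(n_eggs=4, max_delay_ms=2000, rng=random):
    """Return (pixel_index, delay_ms) pairs, last pixel first."""
    return [(pixel, rng.randint(0, max_delay_ms))
            for pixel in reversed(range(n_eggs))]

drops = oviposit_schedule()
assert [pixel for pixel, _ in drops] == [3, 2, 1, 0]
```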

To develop this project further I could distinguish between objects, such that the bot avoids all objects unless ovipositing is completed, in which case it stops at stations. However, that would require more detailed code and potentially machine learning to execute.

Finally, there is the code for the barrier. Since fertilized moths remain in the hive while male and virgin moths go out in the early morning and at night, my intention was to have the bot constrained within a box (the hive). The code works by itself (see video 3); however, once put into the full code it no longer runs. This may be because the neopixel routines contain sleep calls (see #ovipositing) that prevent the IR sensor from being read. Still, this should not be the case, since the ultrasound and light sensors are not affected, and I programmed ovipositing to occur very quickly to limit sleep time. I hope to debug this in the future.
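One way to debug this would be to replace the blocking sleeps with timestamp checks, so the main loop keeps polling the IR sensor between animation steps. Below is a minimal sketch of that pattern in plain Python, under my own naming (`EggAnimation` is hypothetical); on the micro:bit, `time.monotonic()` would be replaced by `running_time()`.

```python
import time

# Sketch of a non-blocking delay: instead of sleeping through the
# egg animation (and missing IR readings), remember when the next
# step is due and advance only once that moment has passed.

class EggAnimation:
    def __init__(self, step_s=1.0, n_steps=4):
        self.step_s = step_s
        self.steps_left = n_steps
        self.next_due = time.monotonic()

    def update(self):
        """Advance one step when due; return True while still running."""
        if self.steps_left == 0:
            return False
        if time.monotonic() >= self.next_due:
            self.steps_left -= 1          # light/clear the next pixel here
            self.next_due += self.step_s
        return True

# The main loop keeps polling sensors between animation steps:
anim = EggAnimation(step_s=0.0, n_steps=2)
while anim.update():
    pass  # border = pin1.read_analog() would go here on the bot
assert anim.steps_left == 0
```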

Reflection:

This project was enjoyable to work on and I believe I have learnt a great deal. There were many moments when I was very frustrated, but overall it helped me grow as a coder and better understand animal behavior. I believe it has the potential to become a robust robot that can actually simulate a moth in its entirety, although flight may be a challenge. The next step after debugging would be increasing its awareness of its environment and, artistically speaking, giving the robot a more moth-like appearance.

Videos:

The videos are too large to embed; see videos here.

Code:

Below is the final code combined:

from microbit import *
import robotbit
import random
import neopixel

# Set up the neopixel strip on pin16 with a length of 4 pixels
np = neopixel.NeoPixel(pin16, 4)

# Start fully fertilised: all four pixels green
np[0] = (0, 255, 0)
np[1] = (0, 255, 0)
np[2] = (0, 255, 0)
np[3] = (0, 255, 0)
np.show()
sleep(random.randint(0, 3) * 1000)


while True:
    border = pin1.read_analog()  # IR sensor reading (hive border)
    dist = robotbit.sonar(pin0)  # object distance from ultrasound
    light = pin2.read_analog()   # light intensity

    # light sensor
    if light < 50:  # bright
        # display.show(Image.HAPPY)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)

    # IR sensor
    elif border < 200:  # black
        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(2000)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 5) * 1000)

    # ultrasound sensor: stop at a mating site and lay eggs
    elif dist < 50 and dist != 0:
        robotbit.motor(1, 0, 0)
        robotbit.motor(3, 0, 0)

        # clear the strip
        np[0] = (0, 0, 0)
        np[1] = (0, 0, 0)
        np[2] = (0, 0, 0)
        np[3] = (0, 0, 0)
        np.show()
        sleep(500)

        # light the pink 'eggs' one by one
        np[0] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[1] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[2] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[3] = (255, 0, 128)
        np.show()
        sleep(1000)

        # fully fertilised: all pixels green again
        np[0] = (0, 255, 0)
        np[1] = (0, 255, 0)
        np[2] = (0, 255, 0)
        np[3] = (0, 255, 0)
        np.show()

        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(500)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 3) * 1000)


    else:  # ovipositing: eggs go out one by one at a random rate
        robotbit.motor(1, -95, 0)
        robotbit.motor(3, -90, 0)

        sleep(random.randint(0, 10) * 100)

        np[3] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20) * 100)

        np[2] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20) * 100)

        np[1] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20) * 100)

        np[0] = (0, 0, 0)
        np.show()
        # sleep(random.randint(0, 10) * 100)

References:

Lilienthal, A. and Duckett, T. (2004). Experimental analysis of gas-sensitive Braitenberg vehicles. Advanced Robotics, 18(8), pp.817-834.

Nielsen, R. and Brister, D. (1977). The Greater Wax Moth: Adult Behavior. Annals of the Entomological Society of America, 70(1), pp.101-103.

Midterm Proposal

By: Gabrielle Branche

For my midterm I plan to explore the behavior of a moth using Braitenberg’s creature Paranoid. My moth should be able to follow light and then circle around the light source. Once I perfect this, I will try to expand my project so that my moth can find food. This will be simulated by my moth detecting ‘flowers’, staying by them, and then turning away from them, unless there is a light source, in which case the original code will take priority.