Lab 8 & 9: Locomotion and Prototyping

By: Gabrielle Branche

Introduction:

For the first half of the semester we looked at behavioral characteristics of organisms and how they could be applied to machines. The next step was to look at how these robots could mimic not only the behavioral traits but also the physical traits of living organisms. Thus, this lab aimed to explore locomotion by creating a prototype of animal movement.

Our prototype looked at the movement of a caterpillar/inchworm. By using servos we were able to create a caterpillar that moved slowly across the floor, dragging its body behind it.

Idea Development:

Our first choice of movement was that of an armadillo. We wanted to use motors and cylinders to simulate the rolling of the armadillo and, if successful, further develop it to move its head in and out of its shell. However, after some discussion and some preliminary testing of the mechanics, we realized that in the time given we would not be able to carry this through. Moreover, the mechanics were very similar to that of a wheel, which defeated the purpose of the lab.

Next, we looked at the movement of a monkey swinging from vines but quickly came across problems with hooking, which made us switch again. Finally, we settled on a caterpillar. First we created a purely physical setup using cardboard, string and pipe cleaner. We then moved to a single servo with cardboard and string. The final project used two servos, cardboard and pipe cleaner.

Prototype Development:

Prototype 1: String mechanic.

The string mechanic relied on the springiness of the pipe cleaner. A string and a pipe-cleaner coil were inserted between two pieces of cardboard; pulling the string brought the back panel closer to the front panel, and the tension in the coil then pushed the front panel forward, causing forward movement.

Prototype 1

The challenge of this mechanism was that the string needed to be pulled manually. Additionally, due to the lack of weight in the cardboard panels, instead of the intended push-and-pull movement, the whole structure simply slid when the string was pulled.

Prototype 2: The servo mechanic

This idea was adopted after finding a YouTube video that achieved this movement.

As shown in the video, a servo is mounted on a piece of bent cardboard. Just as with the coil, the tension created when the two sides are compressed results in an opposite reaction that propels the structure forward. Instead of a paper clip we used string, which was later changed to pipe cleaner to improve strength during movement.
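For reference, the servo side of this mechanic is just a repeated sweep between a folded and an open angle. Below is a minimal sketch of that loop; it assumes the robotbit library drives a servo with a call of the form robotbit.servo(port, angle), and the angles and timings are placeholders rather than the values we actually used.

from microbit import *
import robotbit

# Sweep one servo between a folded and an open angle so the bent cardboard
# repeatedly compresses and then springs forward.
# Assumption: robotbit.servo(port, angle) drives a servo on the given port.
FOLDED = 30    # pulls the two ends of the cardboard together (placeholder)
OPEN = 150     # lets the cardboard spring back open (placeholder)

while True:
    robotbit.servo(0, FOLDED)   # compress: drag the back edge forward
    sleep(600)                  # give the servo time to reach the angle
    robotbit.servo(0, OPEN)     # release: the stored tension pushes the front edge ahead
    sleep(600)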

This structure worked well for our purposes but the movement was very slow. We tried different methods to increase friction on the cardboard in an attempt to reduce the slipping and thus increase the distance moved with each rotation. First, we put hot glue on the edge of the cardboard. When that didn’t work we tried gluing sand onto a straw attached to the bottom. However, that too proved futile: both trials resulted in the structure not moving forward at all. We realized that the cardboard worked best just as it was and stuck with it. However, we thought that to truly show the movement of a caterpillar we should put more than one of the structures together. This led to the third and final prototype.

Prototype 3: Final Product

The final product was a combination of two of the secondary prototype structures. The biggest challenge when putting these two structures together was balancing the weights and correctly timing the movement of each part so that the caterpillar would move forward and not be overwhelmed by the force of any one part of the hybrid structure. We used batteries to balance the weight of each part of the structure, and through trial and error we found the best locations to place them so that the caterpillar moved. Additionally, by offsetting the second structure by 0.5 s, the caterpillar had a more authentic motion, with each part moving to catch up with the one before it.
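To make the 0.5 s offset concrete, here is a rough sketch of how two servo segments could be driven out of phase; as above, robotbit.servo(port, angle) is an assumed call and the angles and cycle times are placeholders.

from microbit import *
import robotbit

# Two servo segments driven half a second out of phase, so the rear segment
# appears to catch up with the front one.
# Assumption: robotbit.servo(port, angle) drives a servo on the given port.
FOLDED, OPEN = 30, 150   # placeholder angles
HALF_CYCLE = 1000        # ms each segment spends folded or open
OFFSET = 500             # 0.5 s lag between the two segments

robotbit.servo(0, FOLDED)    # front segment folds first
sleep(OFFSET)
robotbit.servo(1, FOLDED)    # rear segment follows 0.5 s later

while True:
    sleep(HALF_CYCLE - OFFSET)
    robotbit.servo(0, OPEN)      # front segment releases
    sleep(OFFSET)
    robotbit.servo(1, OPEN)      # rear segment releases 0.5 s later
    sleep(HALF_CYCLE - OFFSET)
    robotbit.servo(0, FOLDED)    # front segment folds again
    sleep(OFFSET)
    robotbit.servo(1, FOLDED)

The same pattern would extend to more segments by giving each additional segment a slightly larger offset.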

Finally, after developing the final prototype we looked up different caterpillars and used fabric to make our prototype resemble an actual caterpillar. Since we were limited by the fabrics available in the lab we tried to find caterpillars that were red/black to ensure as much accuracy as possible. We settled on our final design inspired by the type of caterpillar shown below.

Reflection:

This lab was very beneficial as it improved my understanding of the prototyping process. With each step we were able to build on the success of the version that came before. Prototyping allows the creator to concretely identify the challenges of their invention and leads to a more robust product. It also allows for a more stepwise process, as the preliminary tests are mainly for understanding and developing the mechanics of the robot, which can then be made aesthetic and showcase-worthy.

At first I thought that trying to simulate movement without wheels would be difficult, particularly as I was limiting my thinking to organisms that walk or fly. However, after finishing our project and observing my peers' presentations, I realized that there is actually a plethora of ways in which organisms move. This truly put into perspective the importance of locomotion.

Looking ahead:

If this project were to be developed further, the next steps would be to create more links in the worm and to find a more rigorous method of balancing the weights on each substructure than trial and error. After that, making the structure smaller so that it is closer in size to an actual caterpillar would help the overall design of the robot. Finally, improving the aesthetics of the caterpillar would give the final touches to the overall structure.

Reflection 9: Swarm Intelligence

By: Gabrielle Branche

Synopsis: 

This week’s readings introduce swarm intelligence by commenting on swarm activity in nature. The first article, ‘When Ants Get Together to Make a Decision’, discusses how ants choose new locations to move the colony. The second article, an extract from the book Swarm Intelligence, provides an introduction to swarm intelligence and how it can be used in the field of architecture.

The whole is greater than the sum of its parts 

Definitions:

I didn’t fully understand the concept of multi-agent models, so I looked up a definition.

Wikipedia definition: A multi-agent system (MAS or “self-organized system”) is a computerized system composed of multiple interacting intelligent agents. Multi-agent systems can solve problems that are difficult or impossible for an individual agent or a monolithic system to solve. Intelligence may include methodic, functional, procedural approaches, algorithmic search or reinforcement learning.

Thoughts:

I found it interesting how the articles discuss the way species such as birds and ants distribute a task among colony members, so that each individual organism has only simple decisions to make. This point was brought up in both articles and seems to be the key to understanding swarm intelligence.

As with humans, these organisms can succumb to cognitive overload. However, these animals have solved this issue by leaning more toward collectivism. Just as there is no lead ant, there is no lead bird. They all depend on one another and are in tune with only a few other individuals, but looked at holistically they are all interconnected, allowing for collective behavior.
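To make the “simple local rules, collective outcome” idea concrete for myself, I wrote the toy sketch below (my own illustration, not something from the readings). Each scout follows one simple quorum-style rule, loosely inspired by the ant article, yet the colony as a whole tends to settle on the better site.

import random

# Toy quorum model: two candidate nest sites of different quality.
# Each scout follows one local rule: visit a random site and commit if it
# happens to like the site or finds enough committed nestmates already there.
# No scout ever compares the two sites directly.
QUALITY = {"A": 0.2, "B": 0.7}   # chance an individual scout likes each site
QUORUM = 5                       # committed nestmates needed to trigger commitment
SCOUTS = 100

committed = {"A": 0, "B": 0}
for _ in range(SCOUTS):
    site = random.choice(["A", "B"])
    if random.random() < QUALITY[site] or committed[site] >= QUORUM:
        committed[site] += 1

print(committed)   # site B usually ends up with most of the colony

Even though no single agent holds a global view, the positive feedback from the quorum rule is enough for the group to reach a decision, which is essentially the point both articles make.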

Midterm Project: You Got Any Light

By: Gabrielle Branche

Synopsis:

For my midterm project I decided to explore the behavior of a moth. After doing research I realized that the Greater Wax Moth has very distinct behavior. The females stay in the hive after mating and oviposit there. And, as with most moths, while they are not innately drawn to light, they are attracted once exposed to a bright light source. I tried to simulate these behaviors using a light sensor, an IR sensor, an ultrasound sensor and neopixel lights.

Inspiration:

I initially looked at Braitenberg’s creatures and was interested in Paranoid. I liked the way it got trapped in a loop of wanting to explore but then getting scared. I was fascinated by the idea of seeing what patterns it would make if it were placed in a spotlight and had to make its way back to the light from the darkness. However, once actually executing this I noticed that it just circled around the light. When discussing it with classmates, we thought about how being drawn to the light was similar to a moth. From there the process of making a moth developed.

The Process:

The first piece of code I worked on was the light sensor. I programmed the kittenbot to circle a bright light source by making it circle whenever the light reading is below a certain threshold (see #bright). It is not drawn to the light but rather gets trapped in the bright light source, since the sensor is placed directly in the centre of the bot. Once the light source is removed, it continues on its way.

Next I worked with the ultrasound sensor to have the bot respond to objects. At first I wanted the objects to represent food sources; however, according to Nielsen, the greater wax moth mates at least once a day (Nielsen, 1977). I therefore decided to have the objects represent mating locations (sites of male moths). Similar to the photo sensor, the code runs only below a certain threshold, when the ultrasound detects an object within close range.

The ultrasound was at first difficult to work with because the range was very difficult to manipulate. I realized that part of the reason was that when nothing is in range the ultrasound reads 0. After changing the code to accommodate this (see #ultrasound) it worked better. Nevertheless, it is still prone to stopping randomly and carrying out the ultrasound module of the code; the code would need to be made more robust to avoid these outliers.

I used the neopixel to represent fertilization. After mating, the bot produces eggs, shown by the pink lights at the back of the bot. Once the moth is fully fertilized, the lights turn green, signifying that it can now move about the hive. While moving about it loses eggs due to ovipositing, so as the bot moves around the lights go out at a random rate. Only when it has finished ovipositing will it detect other mating areas.

To develop this project further, the bot could distinguish between objects so that it avoids all objects unless ovipositing is completed, in which case it would stop at mating stations. However, that would require more detailed code and potentially machine learning to execute.

Finally, the code for the barrier. Since fertilized moths remain in the hive while male and virgin moths go out in the early morning and at night, my intention was to have the bot be constrained within a box (the hive). The code works by itself (see video 3). However, once put into the full code it no longer runs. This may be because the neopixel section uses sleep functions (see #ovipositing), which could prevent the IR sensor from being read. Still, this should not be the whole explanation, since the ultrasound and light are not affected and I programmed ovipositing to occur very quickly to limit sleep time. I hope to debug this in the future.
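One way I could test that hypothesis is to rewrite the ovipositing section without sleep(), using the micro:bit’s running_time() clock so the IR sensor keeps being read on every pass through the loop. The sketch below shows the pattern only; the thresholds are placeholders and the real motor calls are left out.

from microbit import *
import neopixel
import random

np = neopixel.NeoPixel(pin16, 4)     # same 4-pixel strip as in the full code
for i in range(4):
    np[i] = (0, 255, 0)              # start fully fertilised (green)
np.show()

egg_index = 3                                        # next egg light to drop
next_drop = running_time() + random.randint(0, 20) * 100

while True:
    border = pin1.read_analog()      # IR sensor is read on every pass, never blocked

    if border < 200:                 # border seen: react immediately
        display.show(Image.SAD)      # placeholder for the turn-around motor calls

    # ovipositing without sleep(): only act once the timer has expired
    if egg_index >= 0 and running_time() >= next_drop:
        np[egg_index] = (0, 0, 0)    # drop one egg
        np.show()
        egg_index -= 1
        next_drop = running_time() + random.randint(0, 20) * 100

If the barrier then works inside the combined program, that would support the idea that the blocking sleeps were the problem.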

Reflection:

This project was enjoyable to work on and I believe I have learnt a great deal. There were many moments when I was very frustrated, but on the whole it helped me grow as a coder and better understand animal behavior. I believe it has the potential to become a robust robot that can actually simulate a moth in its entirety, although flight may be a challenge. The next step after debugging would be increasing its awareness of its environment and, artistically speaking, giving the robot a more moth-like appearance.

Videos:

The videos are too large to embed; see the videos here

Code:

Below is the final code combined:

from microbit import *
import robotbit
import time
import random
import neopixel

# Set up the Neopixel strip on pin16 with a length of 4 pixels
np = neopixel.NeoPixel(pin16, 4)

np[0] = (0, 255, 0)
np[1] = (0, 255, 0)
np[2] = (0, 255, 0)
np[3] = (0, 255, 0)
np.show()
sleep(random.randint(0, 3)*1000)


while True:
    border = pin1.read_analog()    #IR sensor reading (border of the hive)
    dist = robotbit.sonar(pin0)    #determine object distance
    light = pin2.read_analog()     #determine light intensity

    #light sensor
    if light < 50: #bright - circle the light source
        #display.show(Image.HAPPY)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)

    #IR sensor
    elif border < 200: #black - border of the hive, turn back inside
        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(2000)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 5)*1000)

    #ultrasound sensor - mating site within range, stop and mate
    elif dist < 50 and dist != 0:
        robotbit.motor(1, 0, 0)
        robotbit.motor(3, 0, 0)

        np[0] = (0, 0, 0)
        np[1] = (0, 0, 0)
        np[2] = (0, 0, 0)
        np[3] = (0, 0, 0)
        np.show()
        sleep(500)

        np[0] = (255, 0, 128)    #eggs appear one by one in pink
        np.show()
        sleep(1000)

        np[1] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[2] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[3] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[0] = (0, 255, 0)      #fully fertilised: lights turn green
        np[1] = (0, 255, 0)
        np[2] = (0, 255, 0)
        np[3] = (0, 255, 0)
        np.show()

        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(500)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 3)*1000)

    else: #ovipositing - wander and drop eggs at a random rate
        robotbit.motor(1, -95, 0)
        robotbit.motor(3, -90, 0)

        sleep(random.randint(0, 10)*100)

        np[3] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[2] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[1] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[0] = (0, 0, 0)
        np.show()
        #sleep(random.randint(0, 10)*100)

References:

Lilienthal, A. and Duckett, T. (2004). Experimental analysis of gas-sensitive Braitenberg vehicles. Advanced Robotics, 18(8), pp.817-834.

Nielsen, R. and Brister, D. (1977). The Greater Wax Moth: Adult Behavior. Annals of the Entomological Society of America, 70(1), pp. 101-103.

Midterm Proposal

By: Gabrielle Branche

For my midterm I plan to explore the behavior of a moth using Braitenberg’s creature Paranoid. My moth should be able to follow light and then circle around the light source. Once I perfect this, I will try to expand my project so that my moth can find food. This will be simulated by my moth detecting ‘flowers’, staying by them, and then turning away from them, unless there is a light source, in which case the original code will take priority.

Lab Report 4: Vehicles and Creatures

By Gabrielle Branche

Plan:

I chose to explore Paranoid. Paranoid describes a creature that is afraid of the dark but still wanders into it, and only once it is in the dark does it decide to turn back into the light.

I decided to make a version of this creature because I believe fear of the dark is an intrinsic characteristic of living organisms: babies, dogs, and even some grown adults can be afraid of the dark.

To simulate this creature, my kittenbot would need to move constantly in one direction but begin to turn once a certain level of darkness is reached. This requires a light sensor that can measure the light intensity, which can then be used to control the motors, thus simulating a paranoid creature.

Program:

The following link shows the kittenbot responding to light:

https://drive.google.com/file/d/1TyZviqV7ZG6tz72GvqKDQLn-VxB0DOUv/view?usp=sharing

Here is the code for that video:

from microbit import *
import robotbit
import time

#if bright and no obstacle

#control motor by light
while True:
    #original paranoid -run away from darkness
    #to get the light intensity
    dist = robotbit.sonar(pin0)     #determine obstacle
    light = pin2.read_analog()      #determine light intensity

    #controlling motor using light intensity

    if light > 400: #dark
        display.show(Image.SAD)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, -50, 0)

    else: #bright and no obstacle
        #display.show(Image.HAPPY)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, -100, 0)

At first I wanted to make the creature respond to both ultrasound and light, but the two pieces of code would not work together. I then decided to use just light and perfect it so that the creature could move smoothly in a lit environment.

I realized that this was still quite beneficial, because shadows are cast by objects, and by avoiding the shadows the creature inadvertently avoids objects. As a result, my creature navigated the lounge quite well using only light.
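For a future attempt at combining the two sensors, one straightforward pattern would be to read both in the same loop and give one of them priority in a single if/elif chain, so the two behaviors never fight over the motors. A minimal sketch of that idea, reusing the same robotbit calls as above (the distance threshold and motor speeds are placeholders):

from microbit import *
import robotbit

while True:
    dist = robotbit.sonar(pin0)      # obstacle distance (reads 0 when nothing is in range)
    light = pin2.read_analog()       # light intensity

    if dist != 0 and dist < 20:      # obstacle takes priority: turn away from it
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, -45, 0)
    elif light > 400:                # dark: turn back toward the light
        display.show(Image.SAD)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, -50, 0)
    else:                            # bright and clear: keep driving forward
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, -100, 0)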

Draw:

The following photos depict my interpretation of the vehicles readings and show my own portrayal of my creature. 

Photo 1: A depiction of my darkness paranoid vehicles

Photo 2: A potential path for my creature if placed in a single spot of light

Photo 3: A diagram of the motor/sensor relationship for my creature

Analyse:

The following video is of a dog that is afraid of the dark. By jumping in and out of the shadows this dog simulates paranoia.

https://www.youtube.com/watch?v=VJilLGMcfkk

Remix:

The following video is of my remixed code:

https://drive.google.com/file/d/1-71MQlIOS7lggvVpKFFrtfhdjegClNVh/view?usp=sharing

After watching the video of the dog, I noticed that the dog’s paranoia went beyond just going into the dark: it would bark at the darkness before running in and out of the shadows. I therefore changed the code to simulate this by having my kittenbot stop periodically before going into the dark and then turn more rapidly out of the dark. In this way I hope to portray the creature's hesitation more authentically.

Below is the code for this:

from microbit import *
import robotbit
import time

#if bright and no obstacle

count = 0

#control motor by light
while True:
    #original paranoid -run away from darkness
    #to get the light intensity
    dist = robotbit.sonar(pin0)     #determine obstacle
    light = pin2.read_analog()      #determine light intensity

    #controlling motor using light intensity

    if light < 400 and light > 350 and count == 0: #see the dark
        robotbit.motor(1, 0, 0)
        robotbit.motor(3, 0, 0)
        sleep(2000)
        count = 1

    elif light > 400: #dark
        display.show(Image.SAD)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, -45, 0)

    else: #bright and no obstacle
        #display.show(Image.HAPPY)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, -100, 0)
        count = 0

Reflection:

This lab was very beneficial because it enabled me to better understand the aim of bio-inspired robotics and how vehicles and creatures can be used to simulate life-like behavior. I hope to improve my original idea in the future and refine my creature.

Extra Resources:

http://www.diva-portal.org/smash/get/diva2:138304/FULLTEXT01.pdf

http://users.sussex.ac.uk/~christ/crs/kr-ist/lecx1a.html