MLNI Midterm Project (Shenshen Lei)

Shenshen Lei (sl6899)

For the midterm project, I intended to create a museum viewer that imitates a 3D effect. After researching current online museums, I found that there are mainly two types of online museum viewing interaction: the first is clicking the mouse, which is discontinuous, and the second is using the gravity sensor on a mobile phone, which requires the user to turn around constantly.

In my project, I want the user to have an immersive and continuous experience of viewing an online museum without busily clicking and dragging the screen. To mimic an immersive experience, I used PoseNet (PoseNet reacts faster than BodyPix) to track the positions of the user's eyes and nose. These three points form a triangle that moves when the user faces different directions, and the coordinates of the background picture move following the position of the triangle. One thing that bothered me in this process is that the user's movement captured by the camera is mirrored, so I had to flip the x-coordinate by subtracting the detected x from the canvas width. I also calculated the coordinates of the centroid of the face triangle so I could use one pair of variables rather than six.
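As a rough illustration of that coordinate math (the project itself is written in p5.js with ml5's PoseNet; this Python sketch and its keypoint values are only illustrative):

CANVAS_WIDTH = 640  # assumed canvas size

def mirror_x(x, canvas_width=CANVAS_WIDTH):
    # The webcam image is mirrored, so flip the detected x-coordinate.
    return canvas_width - x

def face_centroid(left_eye, right_eye, nose):
    # Each keypoint is an (x, y) pair whose x has already been flipped.
    cx = (left_eye[0] + right_eye[0] + nose[0]) / 3
    cy = (left_eye[1] + right_eye[1] + nose[1]) / 3
    return cx, cy

# One pair of variables (cx, cy) then drives the background offset.
cx, cy = face_centroid((mirror_x(400), 200), (mirror_x(340), 205), (mirror_x(370), 260))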

To show the position of the detected face, I drew a circle without fill color. I also added a function so that when the gaze circle moves onto an exhibit, the name of that item is displayed inside the circle. (I initially believed the circle would cue users about the viewing process, but it sometimes seems confusing.)

My current project works as shown in the following GIF:

After presenting my project to the class and guests, I received many valuable suggestions, and I will make some changes for future improvement.


Firstly, following the professor's suggestion, I will use the ratio of the sides of the triangle rather than the centroid, because using the vector as the variable reduces how far the user has to move and will also smooth the movement of the background picture. It may also improve the immersive experience by changing the viewing angle instead of shifting coordinates, as my classmates suggested. I will also add a zoom-in function: when the user moves closer to the camera, the picture becomes bigger. This can be done by measuring the distance between the user's two eyes as detected by the camera (a rough sketch of this idea follows below). Finally, for the instructions, I will add a signal before the item introductions appear; for example, when the camera detects the user's hand, the introduction will show up a few seconds later. I was also inspired by Brandon's project to employ voice commands, but the accuracy of voice commands would need to be tested. There are more ideas for improving the user experience of my project, and I am considering continuing to work on it and showing a more complete version as my final project.
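A minimal sketch of the planned zoom-in, assuming the background scale is driven by the ratio between the current eye distance and an arbitrary reference distance (all names and values here are illustrative, not part of the current code):

import math

REFERENCE_EYE_DISTANCE = 60.0  # pixels at a "normal" viewing distance (assumed)

def eye_distance(left_eye, right_eye):
    # Apparent distance between the two detected eyes, in pixels.
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.hypot(dx, dy)

def zoom_factor(left_eye, right_eye, reference=REFERENCE_EYE_DISTANCE):
    # Greater than 1 when the user leans closer than the reference distance.
    return eye_distance(left_eye, right_eye) / reference

# Example: scale the background image by this factor each frame.
print(zoom_factor((300, 240), (372, 238)))  # about 1.2, so zoom in roughly 20%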


Thanks to all the teachers, assistants, and classmates who helped me or gave advice on my project.

Midterm Project “Plantalks” - Lifan Yu - Inmi Lee

Midterm Project Individual Reflection

“Plantalks”

By Lifan Yu (ly1164)

Instructor: Inmi Lee

     

1. CONTEXT AND SIGNIFICANCE:

What is our “Plantalks”

We got our idea from my partner Xinran's plants. She sometimes forgets to water the plants in her dorm room, so we thought of creating a device that reminds her to water them.

Our "Plantalks" is a device that facilitates communication between humans and plants. It reminds people when to water their plants and indicates how much water those plants need. It detects the wetness of the soil and makes movements and sounds to help plants "express" themselves (when they need watering, when the amount of water is just right, and when the soil is too wet).

When people stand within a certain distance of the device, it starts working. When the soil is too dry, the device carries the plant up and down as if it were anxiously waiting for someone to water it; meanwhile, it plays an audio clip saying "I'm so dry, I need help, I need water, help!" and blinks a red light. When the wetness is just fine, the device plays a song saying "It's unbelievable, this is as good as it gets" and blinks a green light. When the soil is too wet, it plays an audio clip saying "I'm drowning, no more water please!", blinks a yellow light, and a piece of hand-shaped colored paper starts waving as if saying "No".

This way, the device helps plants communicate effectively with people. People know when to water the plant and when to stop watering. This device makes long-term interaction possible.

How my group project inspired me

My previous group project was Heal-o-matic 5000, a medical device that can diagnose people's illnesses. People put their hands on a specially designed sensor screen and their faces are scanned for their ID. The device then diagnoses their illnesses by analyzing the data collected by the sensors, the diagnosis appears on the screen, and the relevant medicine is dispensed.

       This device mainly responds to the environment. The interaction level is relatively low. The person isn’t continuously interacting with the device. The person only sends out information for a small interval of time before the machine responds and this round of communication ends.
I wondered if I could build a device that constantly communicates with the other "actor": that is, one that constantly detects information about the other "actor," while that actor can also respond to the device's movements, sounds, and so on. I also hope that this type of communication can last for a long time.

The project I have researched & what inspired us

Reactive Sparks by Markus Lerner is an installation of seven double-sided vertical screens that currently stands in front of the OSRAM main office in Munich, Germany. When cars pass on the road in front of the screens, light-colored lines appear on them, and the orange waves on the screens rise as the number of passing cars increases.

This device displays the movements, speed, and number of the passing cars. Through the changing lights, people can see the traffic conditions on the road from a distance. It collects massive amounts of analog information and converts it into a simplified, more visual and artistic form.

When I saw this project, I wondered if I could create something like it, but I didn't immediately draw inspiration from it when designing the midterm project. Xinran Fan and I initially thought of creating a device that automatically waters plants according to the wetness of their soil. After consulting with our instructors (Inmi Lee and Marcela), we learned that since plants don't talk, we could instead create a device that makes movements and sounds to help plants "express" themselves and remind people to water them.

The project we researched is actually a reactive device. However, this type of reaction fits well with our new idea: showing something that is not easily noticed or given importance, in an exaggerated or artistic style, is exactly what we need to help plants "talk".

What is unique and/or significant about our project

Our project involves not only humans but also plants. It facilitates communication between humans and plants and presents the conditions of plants in an artistic and attractive way. People are not only interacting with the fun device itself; more importantly, the device makes possible the interaction between people and their plants. Combining plants with an interactive device is the most unique aspect of our project.

Special value to its targeted audience

While it helps people remember to water their plants in order to keep them alive, it also enriches people's experience of watering their plants, making it a fun thing to do.

2. FABRICATION AND PRODUCTION:

What our first circuit looked like: 

Significant steps

3D printing

Servo motors can only turn from 0° to 180°, but we needed the part of our device that carries the plant to move up and down. To convert the form of motion, we needed a gear wheel attached to the servo motor and a part that moves up and down when attached to that gear.

The parts seem simple, but we failed several times when printing them. The first time, the printer stopped extruding the heated plastic material:

The second time, the printing process went wrong when printing the support parts. We finally succeeded on the third try.

The original flower pot was too heavy for our device to carry, so my partner Xinran Fan made a 3D model of a smaller flower pot. The first flower pot we printed broke soon after we took it off the printer; it seems the layers of material didn't stick to each other very well.

We printed again with the infill set to 99%. However, soon after the printing began, the model couldn't stick to the platform and started to move. We used another printer instead, and this time I added a raft. Our model was finally printed successfully.

Coding

We used if, then, else, and else-if statements to blink the lights. However, the red light indicating "too dry" didn't stop blinking when the green light turned on. After testing repeatedly, we finally figured out that the code should also tell the other lights to stop blinking when one light is shining.

     

When adding the audio pieces, we found that a longer audio piece couldn't be played to the end; it always stopped after a few seconds. It took us a lot of effort to recall the delay function and change the value of the delay.

We thought of nesting the if/else statements for the lighting and motion inside another if/else structure based on the readings from the distance sensor, but the instructors advised us to use && to combine the two conditions, such as if (distance < 30 && sensorValue > 800).

It was hard to choose the intervals for "too dry", "just fine", and "too wet". We tested countless times to determine suitable numbers. When the interval for "just fine" was too small, the device would jump from shining the red light straight to the yellow light as we gradually poured water into the soil. After an arduous process we finally found relatively suitable intervals.
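Putting those pieces together, the control logic ends up looking roughly like the sketch below. Our project runs as an Arduino sketch, so this Python-style version is only an illustration of the structure: the set_leds() helper and the threshold values are assumptions (apart from the quoted distance < 30 && sensorValue > 800 condition), and whether a higher moisture reading means drier or wetter soil depends on the sensor wiring.

# Illustrative thresholds only; the real values took many rounds of testing.
NEAR_CM = 30    # only respond when someone is within this distance
TOO_DRY = 800   # assumed: readings above this mean the soil is too dry
TOO_WET = 400   # assumed: readings below this mean the soil is too wet

def set_leds(red, green, yellow):
    # Placeholder for the Arduino digitalWrite() calls; explicitly switching
    # the other two LEDs off here is what fixed the blinking bug.
    print("red:", red, "green:", green, "yellow:", yellow)

def update(distance_cm, moisture):
    if not (distance_cm < NEAR_CM):
        # Nobody nearby: stay quiet so the plant isn't calling "in vain".
        set_leds(red=False, green=False, yellow=False)
    elif moisture > TOO_DRY:
        set_leds(red=True, green=False, yellow=False)    # too dry: bounce + "I need water" clip
    elif moisture < TOO_WET:
        set_leds(red=False, green=False, yellow=True)    # too wet: waving hand + "I'm drowning" clip
    else:
        set_leds(red=False, green=True, yellow=False)    # just fine: green light + happy clip

update(distance_cm=20, moisture=850)   # someone nearby, soil too dry: red light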

Adding audio files

We added three recorded audio files to our project; they are played while the LEDs blink. We learned to convert .mp3 format to .wav format (in the end, however, we still used .mp3). We tried several types of speakers before finding ones that were easy to work with, and we figured out the code with the help of instructors.

      

User testing

At user testing we hadn't yet added the infrared distance sensor. An instructor advised us to add one so that when people are far away from the plant, the device does not run; that is, the plant wouldn't be calling for people "in vain".

We hadn't put descriptions near our LED lights to explain what the blinking of each light means. Several people suggested that we add attractive descriptions, so we added a sad face, a smiley face, and a surprised face with the descriptions "too dry", "just fine", and "too much water".

These adaptations were effective because they helped users make sense of exactly what the plant is "conveying".

What our device looked like without description:

3. CONCLUSIONS

My goal was to create an interactive device that facilitates communication between humans and plants. This device interacts with plants as well as with people.

My definition of interaction: interaction is the process in which two people, two devices, or a person and a device communicate with one another. This communication should include receiving information, processing the received information, and giving feedback, a response, or a certain kind of action according to the results of that processing. This idea is shaped by Crawford's words: "interaction is a cyclic process in which two actors alternately listen, think, and speak".

Also, to distinguish interaction from reaction, both interacting units should go through the three processes mentioned above. They are both actively engaged in a rather continuous communication, and two-way feedback is indispensable.

In my project, the device, combined with the plant, can constantly send out the plant's information to people. People see that information and react by watering the plant. This kind of long-term, back-and-forth information exchange aligns with my definition of interaction.

During the presentation, my audience interacted with our device by stepping toward it, watching its reactions, and slowly pouring water into the plant it carries. The different states of the plant are shown visually and audibly. My audience also put the moisture sensor into other plants provided on our table and saw the different information that different plants "wish to convey".

Improvements we could make if we had more time

An instructor advised us to change the simple, removable moisture sensor into something more complex that stays in the soil permanently. Also, if the plant is to be kept inside a room, the infrared distance sensor could be placed somewhere people are sure to walk by, because people may not always walk up to the plants.

What we’ve learned

We explored the possibilities of facilitating interaction among plants, a device, and people, which is quite an innovative thing.

We kept facing difficulties in coding and 3D printing, and we learned to figure out solutions step by step. When things don't work out, we just patiently restart. Reflecting on our failures and continuing to try is the right approach to success.

BIRS Midterm Documentation

Assignment Overview

The goal of the midterm project was for us to create a robotic implementation of a living organism. We had to choose an animal behavior and program our Kittenbots to mimic that behavior.

Animal Behavior

In my research, I looked at a variety of different insects because I was fascinated by their movement patterns, from ant farms to synchronous fireflies. I finally decided to model my project after termite behavior, specifically the termite ink experiment. I found it fascinating that termites communicate solely by smelling and secreting pheromones, which can be manipulated by drawing a line of ink on a surface.

Programming Process

My plan of action was to make my Kittenbot wiggle along a line, which is a seemingly simple task, but I did experience challenges along the way. I chose a simpler behavioral concept because my goal was to make sure I had a working project and to become comfortable with programming the Microbit and Robotbit.

During my first trials, my robot would not recognize the line correctly and would spin around in circles instead; it also moved backwards. This confused me because I had downloaded the code online.

https://youtu.be/W_mZL39oZ88

However, I then took the time to tinker with the code in the MakeCode editor and test it out with each change I made. This really helped me become more comfortable with programming and understand how each component worked together. I also had to consider the track I was creating for the robot. I built my line track out of electrical tape, and my first iterations had problems because I had not built a proper track: I had to make sure that the track was wide, with obtuse angles at the turning points, and that the entire track was on one surface with adequate contrast between the tape and the surface.
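For reference, here is a minimal sketch of the wiggle-along-a-line idea in MicroPython. My project was built in the MakeCode editor, so the two line-sensor pins, the motor numbers, and the "0 means on the tape" convention below are assumptions rather than my actual configuration.

from microbit import *
import robotbit

LEFT_SENSOR = pin1    # assumed pins for the two IR line sensors
RIGHT_SENSOR = pin2

while True:
    left_on_line = LEFT_SENSOR.read_digital() == 0    # assumed: 0 means the sensor sees the dark tape
    right_on_line = RIGHT_SENSOR.read_digital() == 0

    if left_on_line and right_on_line:
        robotbit.motor(1, 100, 0)    # both sensors on the tape: drive straight
        robotbit.motor(3, 100, 0)
    elif left_on_line:
        robotbit.motor(1, 40, 0)     # drifting right: slow one wheel to turn back
        robotbit.motor(3, 100, 0)
    elif right_on_line:
        robotbit.motor(1, 100, 0)    # drifting left: slow the other wheel
        robotbit.motor(3, 40, 0)
    else:
        robotbit.motor(1, -60, 0)    # lost the line: pivot and search
        robotbit.motor(3, 60, 0)
    sleep(50)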

Final Product

Below is the finished project:

https://youtu.be/u5Bpz235uis

Midterm Project: You Got Any Light

By: Gabrielle Branche

Synopsis:

For my midterm project I decided to explore the behaviour of a moth. After doing research I realised that the Greater Wax Moth has very distinct behaviour: after mating, the females stay in the hive and oviposit there. And, as with most moths, while they are not actively drawn toward light, they become attracted once exposed to bright light. As such, I tried to simulate these behaviours using a light sensor, an IR sensor, an ultrasound sensor, and NeoPixel lights.

Inspiration:

I initially looked at Braitenberg's creatures and was interested in Paranoid. I liked the way it got trapped in a loop of wanting to explore but then getting scared. I was fascinated by what patterns it would make if it were placed in a spotlight and had to make its way back to the light from the darkness. However, once I actually executed this, I noticed that it just circled around the light. When discussing with classmates, we thought about how being drawn to the light was similar to a moth's behaviour, and from there the process of making a moth developed.

The Process:

The first code I worked with was for the light sensor. I programmed the Kittenbot to circle a bright light source by making it circle below a certain light threshold (see #bright). It is not drawn to the light but rather gets trapped in the bright light source, since the sensor is placed directly in the centre of the bot. Once the light source is removed, it continues on its way.

Next I worked with the ultrasound sensor to make the bot responsive to objects. At first I wanted the objects to represent food sources; however, according to Nielsen, the greater wax moth mates at least once a day (Nielsen, 1977). As such, I decided to have the objects represent mating locations (sites of male moths). Similar to the photo sensor, the code runs only below a certain threshold, when the ultrasound detects an object within close range.

The ultrasound was at first difficult to work with because the range was hard to manipulate. I realised that part of the reason was that when nothing is in range the ultrasound reads 0. After changing the code to accommodate this (see #ultrasound), it worked better. Nevertheless, it is still prone to stopping randomly and carrying out the ultrasound module of the code; the code would need to be made more robust to avoid these outliers.

I used the NeoPixels to represent fertilization. After mating, the bot produces eggs, shown by the pink lights at the back of the bot. Once the moth is fully fertilised, the lights turn green, signifying that it can now move about the hive. While moving about, it loses eggs due to ovipositing, so as the bot moves around the lights drop out at a random rate. Only when it has finished ovipositing will it detect other mating areas; otherwise it ignores them.

To develop this project further, I could distinguish between objects so that the bot avoids all objects unless ovipositing is complete, in which case it stops at the stations. However, that would require more detailed code and potentially machine learning to execute.

Finally, the code for the barrier: since fertilised moths remain in the hive while male and virgin moths go out in the early morning and at night, my intention was to have the bot constrained within a box (the hive). The code works by itself (see video 3), but once put into the full code it no longer runs. This may be because the NeoPixel routines contain sleep calls (see #ovipositing) that prevent the IR sensor from being read. Still, this should not be the case, since the ultrasound and light are not affected and I programmed ovipositing to occur very quickly to limit sleep time. I hope to debug this in the future; one possible direction is sketched below.
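An untested sketch of that possible fix (not the project's actual code): replace the blocking sleep() calls in the ovipositing sequence with a check against running_time(), so the loop keeps reading the IR sensor between pixel updates.

from microbit import *
import neopixel
import random

np = neopixel.NeoPixel(pin16, 4)
for i in range(4):
    np[i] = (0, 255, 0)   # start fully fertilised (green), as in the main code
np.show()

egg_index = 3
next_egg_at = running_time() + random.randint(0, 20) * 100   # next egg drop, in ms

while True:
    border = pin1.read_analog()   # the IR border reading is polled on every pass
    # ...the light / border / ultrasound branches from the main code would go here...
    if egg_index >= 0 and running_time() >= next_egg_at:
        np[egg_index] = (0, 0, 0)   # "drop" one egg without blocking the loop
        np.show()
        egg_index -= 1
        next_egg_at = running_time() + random.randint(0, 20) * 100
    sleep(10)   # short pause keeps the loop responsive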

Reflection:

This project was enjoyable to work on and I believe I have learnt a great deal. There were many moments when I was very frustrated, but holistically it helped me grow as a coder and better understand animal behaviour. I believe it has the potential to become a robust robot that can actually simulate a moth in its entirety, although flight may be a challenge. The next step after debugging would be increasing its awareness of its environment and, artistically speaking, giving the robot a more moth-like appearance.

Videos:

The videos are too large; see them here

Code:

Below is the final code combined:

from microbit import *
import robotbit
import time
import random
import neopixel

# Set up the NeoPixel strip on pin16 with a length of 4 pixels
np = neopixel.NeoPixel(pin16, 4)

np[0] = (0, 255, 0)
np[1] = (0, 255, 0)
np[2] = (0, 255, 0)
np[3] = (0, 255, 0)
np.show()
sleep(random.randint(0, 3)*1000)


while True:
    border = pin1.read_analog() #determine light frequency
    dist = robotbit.sonar(pin0) #determine object distance
    light = pin2.read_analog() #determine light intensity

    #light sensor
    if light < 50: #bright
        #display.show(Image.HAPPY)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)

    #IR Sensor
    elif border < 200: #black
        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(2000)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 5)*1000)

    #ultrasound sensor
    elif dist < 50 and dist != 0:

        robotbit.motor(1, 0, 0)
        robotbit.motor(3, 0, 0)

        np[0] = (0, 0, 0)
        np[1] = (0, 0, 0)
        np[2] = (0, 0, 0)
        np[3] = (0, 0, 0)
        np.show()
        sleep(500)

        np[0] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[1] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[2] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[3] = (255, 0, 128)
        np.show()
        sleep(1000)

        np[0] = (0, 255, 0)
        np[1] = (0, 255, 0)
        np[2] = (0, 255, 0)
        np[3] = (0, 255, 0)
        np.show()

        robotbit.motor(1, 100, 0)
        robotbit.motor(3, 100, 0)
        sleep(500)
        robotbit.motor(1, -105, 0)
        robotbit.motor(3, 10, 0)
        sleep(random.randint(0, 3)*1000)

    else: #ovipositing
        robotbit.motor(1, -95, 0)
        robotbit.motor(3, -90, 0)

        sleep(random.randint(0, 10)*100)

        np[3] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[2] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[1] = (0, 0, 0)
        np.show()
        sleep(random.randint(0, 20)*100)

        np[0] = (0, 0, 0)
        np.show()
        # sleep(random.randint(0, 10)*100)

References:

Lilienthal, A. and Duckett, T. (2004). Experimental analysis of gas-sensitive Braitenberg vehicles. Advanced Robotics, 18(8), pp.817-834.

Nielsen, R. and Brister, D. (1977). The Greater Wax Moth: Adult Behavior. Annals of the Entomological Society of America, 70(1), pp. 101-103.

Midterm Proposal

By: Gabrielle Branche

For my midterm I plan to explore the behavior of a moth using Braitenberg's creature Paranoid. My moth should be able to follow light and then circle around the light source. Once I perfect this, I will try to expand my project so that my moth can find food. This will be simulated by my moth detecting 'flowers', staying by them, and then turning away from them, unless there is a light source, in which case the original code will take priority.