“Briend” by Sharon and Jessica – Reflection by Jessica

  • CONTEXT AND SIGNIFICANCE

My previous group project, “Dreamie Beanie,” taught me the importance of prototypes. Even though we did not have to make that prototype actually work, managing our expectations despite having no technical limitations taught me to be more realistic going into the midterm project. We had to brainstorm with four other people how the project would technically work, even though we could use our imagination to recreate the functionality. For example, despite only having to make a beanie and a cardboard TV, we still had to figure out how, hypothetically, the beanie would connect to the monitor and to the user’s brain. Working with four other group members also helped me learn how to compromise on my expectations alongside my partner so we could make a working project.

The last group project taught me that interaction requires repetitive feedback, so our project “Briend,” or “Bird Friend,” focused a lot on that: the bird was supposed to respond to user interaction not only through a simple command but through repeated actions from the user. Briend was created as a spin on the “emotional support animals” people have today, except it was electronic. Our project was mainly intended for kids, or for anyone who would get anxious without a companion and could pet it to receive some kind of validation or comfort. I feel the responses the bird could give simulate the type of comfort you would get from petting a real bird or a real dog: we wanted to make it as lifelike as possible, which is why I felt our audience would appreciate our product.

  • CONCEPTION AND DESIGN:

In what ways did your understanding of how your users were going to interact with your project informed certain design decisions you made? Account for several such decisions. In addition, what form, material or elements did you use? What criteria did you use to select those materials? Why was it/were they best suited to your project purpose, more than any other one/s? What might have been another option? But you rejected it—why ?  

If we are being honest, our first design conception was a flower. We were going to attach a light sensor to it so that it would open and close based on how bright the light was. At first we were going to 3D print the flowers and run wires from each petal to a main stem, so the main mechanism could push the flower open and closed. This design was based on one we found on the internet, and we decided it would be easier if we replaced the metal hinges with straws.

Here are some of the initial drawings we made in designing the flower: 

Initial designs for the flower petals
Initial designs for the wiring and stem inside the petals that would make it move

We wanted to create a design that was simple, polygonal, and easy to 3D print, yet sturdy enough to withstand the swinging motion needed to make the flower move. Here is the mechanism we wanted to recreate, a design we got from the original website.

However, when our plan had to change (as I will detail in the Fabrication and Production section of this blog post), we decided to create a bird design that was simple and resembled a music box (we initially wanted the bird to play music when a person came near it). We came up with the music box design firstly because it would be easy to laser cut, and secondly because music boxes gave us a lot of comfort when we were younger, and we wanted to recreate that feeling through a singing bird.

We also wanted to keep the same up-and-down mechanism we had planned for the flower, but this time it would be much simpler because it would only control two “petals,” or wings, instead of six interlocking petals that had to open and close smoothly at the same time. Other than the box shape, we wanted the bird to be as lifelike as possible, and we wanted it to light up to bring a visual sense of happiness when the user interacts with it. We wanted to use feathers because they would make it seem more like a real bird. Here are some initial drawings and designs for both the laser-cut box and the bird itself.

Design for the size of the box
Design for how the wings would fit into the box
  • FABRICATION AND PRODUCTION:

In this section, describe and assess the most significant steps in your production process, both in terms of failures and successes. What happened during the User Testing Session? How did your user testing process influence some of your following production decisions? What kind of adaptations did you make? Were they effective? In short, keeping in mind your project goals, how do you account for and justify the various production choices you made for your project? Include sketches and drawings.

Our biggest failure, for sure, as mentioned above, was having to completely scrap the flower idea. For two days of the week before the midterm, we were 3D designing the flower petals and trying to make the stem out of wire. The first flower petal turned out way too big, and the 3D printer could not print it. The shape of the second one was not right for our hinges to attach to, as shown below.

The 3D printed petal (notice how it looks nothing like the initial sketch)

When we realized we still didn’t fully understand how the mechanism worked, we tried to recreate it using cardboard instead. It did not work, and we could not even get the wires attached to the motor.

Once the flower wouldn’t attach to the motor we fully gave up and asked Rudi for help. In helping us recreate the mechanism of the original flower, Rudi showed us a much simpler way to make the flower move up and down: by taping straws to cardboard and having the motor push a wire through each straw.

Here is the video of our first prototype: 

The motors moving two wires through straws gave us the idea for a bird box, in which the two moving wires would be the wings. Once we finally had the idea for the main mechanism, we went on to design the actual box it would live in.

We decided to make the bird box smaller, firstly so the wings would move more fluidly and secondly because it would be more convenient to carry. The box ended up being 13 x 10 x 10 cm, and we laser cut the bird design pictured below:

We wanted the wings to be light and small, yet big enough that the up-and-down motion would be visible. We still used the wires, creating two big loops around 5 cm in length, and glued feathers onto them. We attached the wings to the motors, and at first the wires kept getting caught on each other when the bird moved. We realized we had to move the wires closer to the middle so they wouldn’t tangle, and had to make the motor move faster; while we lost some of the range of motion we had before, the speed made up for it. Ultimately, we got the bird to make an up-and-down motion triggered by the touch sensor.
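As a rough idea of how the touch-triggered flapping could look in code, here is a hedged sketch. It assumes a digital touch sensor on pin 2, a servo on pin 9 doing the flapping, and the LEDs on pin 13; our actual motor and pin choices may have differed.

```cpp
// Hypothetical sketch of the wing-flapping stage: while the touch sensor reads
// HIGH, the servo sweeps the wing wires up and down and the LEDs stay on.
// Pins and sweep angles are assumptions, not the exact build.
#include <Servo.h>

const int touchPin = 2;  // digital touch sensor
const int ledPin = 13;   // wing LEDs
Servo wingServo;

void setup() {
  pinMode(touchPin, INPUT);
  pinMode(ledPin, OUTPUT);
  wingServo.attach(9);
}

void loop() {
  if (digitalRead(touchPin) == HIGH) {
    digitalWrite(ledPin, HIGH);
    wingServo.write(60);   // wings up
    delay(200);
    wingServo.write(120);  // wings down
    delay(200);
  } else {
    digitalWrite(ledPin, LOW);
    wingServo.write(90);   // rest position
  }
}
```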

The bird box shown was what we brought to the initial user testing. Making the bird flap its wings in time with the LED lights took us all week, so we did not have enough time to figure out how to make the bird sing or chirp. During user testing, much of the feedback had to do with interactivity: the bird was too robotic, the touch sensor was visible, and all the bird would do was flap its wings in response to the user’s button press. Despite cutting holes in the box for the LED lights, the lights did not show through clearly at all, so it really just looked like the bird flapped its wings when touched. One piece of feedback we really liked was using the feathers to cover the wings and adding an arrow or a sign that said “Pet me!”, which would make it more toy-like. Another was that the bird should react in some way when you came near it, like chirping or flapping its wings faster. We implemented those two comments into the design after user testing.

Firstly, we used a smaller breadboard and attached the LED lights to the top of the bird instead of putting them inside. Secondly, we used the toneMelody example in combination with the distance sensor to create a bird-chirping sound: we wanted to play a full song but did not have enough notes or memory for it, so we decided to do a two-note chirp instead. We also added a face to the bird so it would look a lot cuter: if you put a face on anything, a human being will attribute emotions and feelings to it, so we felt it made the bird look a lot more personable.
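To give a sense of how the chirp worked, here is a minimal sketch of the two-note idea. It assumes an ultrasonic distance sensor on pins 7 and 8 and a small speaker on pin 9; the pin numbers, notes, and the 30 cm threshold are illustrative rather than our exact values.

```cpp
// Hypothetical sketch: play a two-note "chirp" when someone comes near.
const int trigPin = 7;     // ultrasonic trigger
const int echoPin = 8;     // ultrasonic echo
const int speakerPin = 9;  // small speaker or piezo

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

long readDistanceCm() {
  // Send a 10 microsecond pulse and time the echo.
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000);  // give up after 30 ms
  return duration / 58;                           // rough conversion to cm
}

void loop() {
  long distance = readDistanceCm();
  if (distance > 0 && distance < 30) {  // someone is close to the bird
    tone(speakerPin, 988, 120);   // first, shorter note (B5)
    delay(150);
    tone(speakerPin, 1319, 180);  // second, slightly longer note (E6)
    delay(400);
  }
  delay(100);
}
```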

The final video product is shown below:

  • CONCLUSIONS:

Begin your conclusion by restating the goals of your project. How do your project results align with your definition of interaction? How do your project results not align with your definition of interaction? Based on both your definition of interaction and your expectation of your audience’s response, how, ultimately, did your audience interact with your project? How would you improve your project if you had more time? What of value have you learned from any setbacks and/or failures? What do you take away from your accomplishments?

The wrap-up of your conclusion delivers your final say on the issues you have raised previously, demonstrates the importance of your ideas, and insightfully synthesizes your thoughts about key elements of the project experience. As such, your conclusion should close by reflecting on the crucial questions “So what?” and “Why should anybody care?”

Ultimately, our goal was to create continuous interaction with the user: we didn’t want a simple computer that just takes and executes commands, and we felt like we achieved that. The bird responded in a lifelike manner, and the interaction didn’t stop with a simple command such as “on” or “off.” However, we still felt we could have done more with the motion of the wings and the sound of the chirp: if we wanted to make the project more lifelike and more aligned with our definition of interaction as continuous communication, we would have made the wings move faster when the bird was “happier” or petted more often. We also would have changed the tone so it sounded less robotic and varied with its “mood”: if you approached the bird from behind it would make a scared noise, but if you petted it, it would make a happy sound.

At first, during user testing, the audience’s interactions with the project were very robotic: they simply touched the touch sensor and the bird responded. There were no sound or visual cues tied to their interaction, so they felt the interaction was a little cold. However, once we added the feathers, the lights on top, and the sound, interaction with the project felt much more fluid, and users felt it became more lifelike with the changes. This entire project really taught me how to manage my expectations: at first my partner and I wanted to make a fully functioning robotic flower, and it wasn’t until Tuesday night, when we had to change our entire project, that we realized we simply couldn’t do it with the knowledge we had. But I feel we adapted really well to that failure, kept the core aspects of our original project, and made it simpler and more unique based on our experience; this project really taught me how to adapt from failure to make a cohesive product.

Overall, this project taught me a lot: it taught me how to engineer a product not only on the software side but also on the hardware side, ensuring each piece could function off of the other. It also taught me that it is important to adapt from failure and use that failure to expand creatively, and that there is never an end to any project. Even when I was so relieved that we finally made the bird flap its wings up and down, at user testing people were still dissatisfied and had a lot of good feedback for how we could continue to improve the bird. My experience with the midterm project taught me how to continuously improve, and for that I find this project very valuable.

Mood Mind Match – Kyle Brueggemann – Marcela Godoy

CONTEXT AND SIGNIFICANCE

My first experience working with groups in this class was the skit presentation, where we had the idea of a beanie that could record dreams. That idea partly inspired me to develop a device that could somehow record our emotions and reveal something about our internal world through technological outputs. Through the previous developments in this class, I have learned that interaction is a process of communication and response between two participants that continuously goes back and forth. My partner, Eva, and I wanted to combine this definition of interaction with the idea of expressing emotions to develop a project inspired by mood rings. We used the idea that the temperature of your skin is linked to your current mood: by using the Arduino to measure that temperature and output it as a comprehensible value, we could allow a level of interactivity that prompted the interactor to engage with the Arduino and then gain a deeper level of comprehension from its direct output. We built on the mood ring idea by creating a relationship compatibility tester, in which two different people have their temperatures read; these values drive two RGB LEDs, and from the colors of the LEDs the users can determine their compatibility based on how similar the colors are. We decided that by including two users in the interactive process, we encourage interactivity not only between human and machine but also between human and human. This project is intended for anyone who is interested in the psychology behind body temperature, as well as anyone who seeks the entertainment of finding their possible compatibility with another person.
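For illustration, here is a minimal sketch of how the temperature-to-color mapping could be wired up in Arduino code. It assumes TMP36-style analog sensors on A0 and A1 and the red and blue legs of the two RGB LEDs on PWM pins; the pins and the 18 to 32 degree range are assumptions, not our exact circuit.

```cpp
// Hypothetical sketch of the mood-ring idea: each person's skin temperature is
// mapped to a color on "their" RGB LED, and similar colors suggest compatibility.
const int sensorA = A0, sensorB = A1;  // analog temperature sensors
const int redA = 3, blueA = 5;         // PWM pins for person A's LED
const int redB = 6, blueB = 9;         // PWM pins for person B's LED

float readCelsius(int pin) {
  float volts = analogRead(pin) * 5.0 / 1023.0;
  return (volts - 0.5) * 100.0;  // TMP36-style voltage-to-temperature conversion
}

void showColor(float tempC, int redPin, int bluePin) {
  // Warmer skin pushes the color toward red, cooler skin toward blue.
  int warmth = constrain(map((long)(tempC * 10), 180, 320, 0, 255), 0, 255);
  analogWrite(redPin, warmth);
  analogWrite(bluePin, 255 - warmth);
}

void setup() {
  // analogWrite pins do not need pinMode on most Arduino boards
}

void loop() {
  showColor(readCelsius(sensorA), redA, blueA);
  showColor(readCelsius(sensorB), redB, blueB);
  delay(500);  // update the colors twice a second
}
```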

CONCEPTION AND DESIGN

 

Our project was a laser-cut box containing all of the electrical components. There is one hole in the side of the box that allows the USB cord to reach the laptop. There is also a laser-etched title on the front of the box with our project name, “Mood Mind Match.” On the top cover of the box, we included two holes on opposite sides for the temperature sensors, with a picture of a fingerprint on top of each sensor. We also included three holes in the center: two for the LEDs and one for our 3D printed heart reset button. On top of each LED, covered in hot glue to give a unique appearance, is half of a ping pong ball. In designing all of these elements, we placed the temperature sensors on opposite ends of the box to encourage the participants to face each other while finding out their compatibility. This not only facilitates communication during the very exciting process of discovering compatibility, it also informs the placement of our LEDs. We placed each LED parallel to a temperature sensor so that each person could tell which LED lined up with which sensor. Most of our design elements were placed not only to make the project as easy to understand as possible but also to follow a simple, clean aesthetic that attracts people to it. The half ping pong ball on top of each LED is the clearest example of this push toward a clean aesthetic. We could have left the LEDs out in the open, but covering them conceals more of the mechanical components and gives the project a more cohesive appearance. We had a couple of earlier ideas for the box with slightly different placements of the temperature sensors and LEDs, but we decided the best layout was to have the sensors face each other, with the LEDs parallel to them in the middle of the box.

FABRICATION AND PRODUCTION

 

It was directly after the User Testing Session that we decided to start the 3D fabrication process. We had already designed our laser-cut box, but it was a team decision to leave the actual fabrication until after user testing so we wouldn’t waste any materials remaking the project in case we ran into major problems during the session. One change after user testing was that we decided to include instructional text on the box telling the user that the closer the colors of the LEDs, the greater the compatibility. During the testing session we realized there was no direct indication of what the LED colors meant unless we told the users, but we wanted our project to function without explanation, so this text gave users the information they needed to understand the LEDs’ output. Another post-testing alteration was adding a picture of a fingerprint on top of each sensor, for the similar reason of making the directions for interacting with the project more obvious. With these pictures on top of the sensors, there is no question as to what the participants should do. I believe both of these adaptations were highly effective, because they helped the audience understand both the project’s meaning and the functionality of its interactive components.

To incorporate the fabrication processes in our project, we decided to laser cut a box to contain all of the electrical components. We ran into an issue here, however: we waited until the day before the project was due to laser cut the box, and right before we attempted to cut it, the machine stopped working. Although the laser cutter was soon functioning again, the experience made both my partner and me extremely nervous and definitely taught us a lesson about procrastination. Especially when working with technology, which can break down or be finicky, it is always helpful to be ahead of schedule to avoid situations like this. We had another issue in the fabrication process: we wanted to 3D print a heart on a stick. The heart would sit atop the box while the stick went through a hole and connected to a reset button for our circuit. However, even after scheduling a 3D printing appointment, the heart just would not print correctly. I am still not sure why this failed after multiple attempts, but I have definitely learned, when working with fabrication processes, to do them very early on, as these machines are not as reliable as I had thought. Because the heart would not print, we could no longer use the reset button and had to manually reset the project every time a user wanted to interact with it, which was not ideal, but we had to make do with the resources we had.

CONCLUSIONS

To recap, the goal of this project was to create an electrical circuit with the Arduino that encourages interaction according to my own interpretation of what interaction truly is. Another goal was to use some form of digital modeling in the design. I believe our project aligned with many aspects of my personal definition of interaction. It involves multiple lines of communication and the transfer of information between the two compatibility-test participants. It also involves the transfer of information through the temperature sensors, and then from the temperature sensors to the LEDs. The colors of the LEDs then prompt a reaction from the participants, as well as between them, based on their level of compatibility. This entire process makes the audience feel inclined to put their fingers on the temperature sensors and then react excitedly when they receive the news of their compatibility.

In reality, all of the audience figured out how to interact with the project; however, their reactions varied intensely. Some were extremely excited, and some barely reacted when the LEDs turned on. I’m not sure if this was just their personality or whether there are ways to invite more interactivity. Even though I believe there is a healthy level of interactivity, given fewer time constraints we could have incorporated a few more components, such as a buzzer that beeps if the temperature values are close enough. This would draw a clearer line on the level of compatibility between the testers and remove any confusion about what the LED colors mean. I was also thinking we could use the same buzzer and have it play a certain tune based on the degree of compatibility, such as three beeps for lovers, two beeps for friends, and one beep for strangers. I believe the combination of a visual and an audio output would invite even more of an emotional response from our audience, so if I revisited the project in the future, I would love to incorporate it.

Going back to the setbacks in the laser cutting and 3D printing process: I am a bit disappointed we couldn’t 3D print the heart reset button, but I’m also glad that my partner and I experienced those technical difficulties, because they will push us to approach technologically heavy processes with more leeway in the future. From our various issues with technology, I definitely feel more prepared for dealing with these machines. From our accomplishments, I have greater confidence in my ability to code and work through different programs. I also feel much more comfortable in a group project setting, as my partner and I worked together very well. In all, I’m glad our project turned out the way it did. We were able to take a simple idea and execute it neatly to construct an effortless symbol of interaction. There are always improvements that could be made with more time, but for all of our effort in coding, modeling, and circuit building, I am content with the result.

Schrödinger’s Cat – Theresa Lin – Marcela

Project Name: Schrödinger’s Cat (Decision Maker)

Name: Theresa Lin

Partner: Frances Yuan

Instructor: Marcela

Motivation

Both Frances and I are really indecisive and always have a hard time deciding on things such as what to eat. We also noticed that our friends are all really indecisive when it comes to planning what to do together. We wanted to make a decision maker that was more fun and engaging than your typical decision-maker website.

Inspiration

We stumbled upon this video of a Schrödinger’s cat decision maker and thought it was really intriguing and had the level of engagement we were looking for, so we wanted to re-create it: https://www.youtube.com/watch?v=B7sKqlM4LZg&t=1s  In their project they reveal the cat by shining a light on it; we decided to put the cats on a surface mounted on a servo and make it rotate randomly.

This video provides a clear explanation of Schrödinger’s Cat. https://www.youtube.com/watch?v=OkVpMAbNOAo

Concept Story 

Project Sketch

Tools Used

Illustrator
makercase

Materials Used

2 Breadboards
1 Arduino
1 Servo
Gears
Straws
1 Motor
Wires
1 H-bridge
1 10k resistor

Prototype

Servo for the two cats to rotate on 

User Testing Feedback

During user testing, one thing we noticed was that people were a little confused about what to do first. They didn’t know they had to ask it a yes-or-no question; I think it might be because we didn’t specify that it was a decision maker. People said the window for the cat wasn’t the first thing they looked at, so they didn’t know where to look after pressing the button. Some users also said they weren’t sure when to open the door to look at the cat. A lot of people liked how the box had cat ears, so the entire box resembled a cat head. One thing I thought was interesting was that some people would ask questions like “Will I pass my midterm?” instead of questions that required decision making. Before user testing we hadn’t thought it could also function as a sort of ‘future predictor.’

Final Product Process

Based on the user testing feedback we received, we thought it would be best to have a door that opens automatically when the user presses the button. For the automatic door, we had to laser cut gears. At first we tried to use Gear Generator for our gears, but we would have had to pay to download them, so we looked for other alternatives. We ended up downloading a picture of gears and then tracing the image in Illustrator to turn it into vectors.
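As a rough illustration of the press-the-button, open-the-door, reveal-a-random-cat flow, here is a hedged Arduino sketch. The pin numbers, servo angles, and door timing are placeholders rather than our actual values, and it assumes the door motor is driven through the H-bridge listed in the materials.

```cpp
#include <Servo.h>

// Hypothetical sketch: press the button, the door motor (via the H-bridge)
// slides the door open, and the servo randomly lands on the "Yes" (alive)
// or "No" (dead) cat.
const int buttonPin = 2;     // red button with a 10k pull-down resistor
const int in1 = 7, in2 = 8;  // H-bridge inputs driving the door motor
Servo catServo;

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(in1, OUTPUT);
  pinMode(in2, OUTPUT);
  catServo.attach(9);
  randomSeed(analogRead(A0));  // unconnected analog pin as a rough noise source
}

void runDoor(bool opening) {
  digitalWrite(in1, opening ? HIGH : LOW);
  digitalWrite(in2, opening ? LOW : HIGH);
  delay(800);  // run long enough for the gears to slide the door fully
  digitalWrite(in1, LOW);
  digitalWrite(in2, LOW);
}

void loop() {
  if (digitalRead(buttonPin) == HIGH) {
    catServo.write(random(2) == 0 ? 0 : 180);  // randomly pick which cat faces the window
    runDoor(true);   // open the door so the answer is visible
    delay(3000);     // give the user time to read the answer
    runDoor(false);  // close the door again for the next question
  }
}
```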

We tried to think of the best way to design the box so that the user’s attention would be on the door and the button at the same time. We also included some instructions, such as a step one and step two, so it would be clearer what to do. We decided to put the red button right beside the door so that when users press it, their attention is already near the door and they immediately anticipate it opening. We made the box taller than our prototype and placed the door toward the top so it is easier to see. Although many people liked the cat ears on our prototype, we left them out of the final version: people kind of expected the box to have a cat face and suggested we change it to a cat face with expressions for ‘yes’ or ‘no,’ but we wanted to stick to the Schrödinger concept, so we left the ears out.

We also decided to put the words “Yes” and “No” next to the corresponding cats, because some people couldn’t tell whether the cat they were looking at was dead or alive. We used this picture of the cats from Shutterstock and added the words in Illustrator.

To make the automatic door slide, we used a straw and wires at first. However, the wires were too thin compared to the straw, so when the door slid up and down it would sway from side to side and the gears wouldn’t line up.

We ended up going to Family Mart to see if they had a narrower straw or something sturdier and thicker than the wires. Fortunately, they had both a smaller straw and some wooden picks. These fit perfectly and kept the door from swaying.

Challenges/Obstacles

One of the most challenging things was making sure both gears lined up. At first the position of the motor was a bit off, so the gears wouldn’t mesh after they turned. We had already glued the motor onto the box, so we had to cut off the hot glue and re-glue it. We also noticed that the holes we made for the buttons were too big, even though we were sure we had measured correctly. We ended up putting the buttons through a piece of cardboard to hold them up.

After we had finished building the circuit and gluing most of the box, we noticed the box was potentially too small for all the circuits. We ended up replacing a lot of the wires with shorter ones so the breadboards and Arduino would all fit into the box.

Conclusion: Reflection 

I think the automatic door was definitely a positive addition that we made in response to the feedback from user testing. Making the box taller and placing the button right next to the door were also positive changes. Our users definitely were not confused about where to look, because their attention is drawn to the red button first and then immediately to the door opening right after they press it. I learned that it’s best not to assume that everything will work smoothly and then start working on the project at the last minute.

Demo

We also had our friends try it.

Grass – Liyu Chen – Eric Parren

Context and significance

The previous group project made me think more about making the output of an interaction perceivable in more dimensions, such as looking, hearing, touching, even sniffing. Before starting the project, we searched through tons of interactive designs. A very interesting one was a dog that turns its head to face the user when the user looks at it; when the user tries to pat the dog, it shakes its head to avoid being touched. This design made me reconsider the emotional effect an interactive device should bring: after seeing the interaction with the dog, I felt very amused, because it is funny to see the dog react so lively to my physical actions. What also left a deep impression on me, yet which I do not view as an interactive design, was the LED cube. Though the device could make very cool light effects with LEDs, the only thing the user could do was look at them; in other words, there was no user input in this design at all. That alerted me that an interactive design should be interactable before other cool features or effects are added.

My understanding of interaction therefore has a strict and clear boundary. Just as I wrote in the last blog post for the group project, I was expecting a design where users could provide input in an unconventional way. After the computation, the design would present its output in different ways for users to perceive, and the output users perceive should make them feel emotionally different.

What makes my project unique is that the representation of tones is not limited to sound, but also includes the LEDs turning on and off and the tactile feeling of touching the optical fibers shaped like grass. In general, this makes the instrument more approachable, and users get a more natural feeling when playing music. What I would like to redesign is the entrance door of the library. It is an automatic door during library hours; after hours, however, the door won’t open without swiping an NYU ID. It is common for people to forget their ID cards when trying to get in or out of the library, forcing them either to go back for their card or to wait until someone else swipes theirs. If I were to redesign the door, I would replace the card-swiping machine with a facial-recognition machine: students would only need to “swipe their faces” to get in or out of the library, which would make their stressful lives a little easier. The target audience of that redesign is certainly the whole student body of NYU Shanghai, as the majority of them suffer from the entrance door.

Conception and design

As I mentioned above, I expected users to perceive the tones not only through hearing but also through touching and watching, thus bringing about a more immersive experience of playing music. It should also be a more approachable musical instrument compared to common instruments.

In the actual design, we used optical fiber to replace the keys or strings of common musical instruments. The optical fibers are arranged in the shape of grass, and users only need to push or shake them to make tones. We chose optical fibers for two main reasons: their shape resembles real grass, and they can transmit light from the LEDs at their base. Copper wires could also have been an option, but we rejected them because they cannot transmit light from the LEDs and because they don’t fit the aesthetic scene we set.

Raw Material for Optical Fibers

To indicate when a tone is on or off, we used 30 white LEDs to shoot light through the optical fibers, creating shiny white dots at their tips. The LEDs turn on slowly after the user pushes the grass and fade away when the user lets go. In terms of other options, during the user testing session many people asked whether there could be other colors for the LEDs. We rejected this idea because it would, again, ruin the aesthetic feeling the optical fiber creates. Some people also suggested using an LED matrix or LED strip to replace the individual LEDs on the breadboard. However, the optical fibers are arranged in a special way that requires the position of each LED to be flexible, and a matrix cannot meet this criterion because the position of each LED is fixed.

Expected Effect of the Grass

In terms of sound, we attempted to play the sound of a harp through speakers, with pitches ranging from C to G. This time no other options arose for the sound of the instrument, as the tones of a harp are so harmonic that they perfectly fit the scene we set.

Speakers we used

Fabrication and production

The first important step we made was building two sets of LEDs. Each set has a button controlling the on and off of six LEDs with a fading effect: the LEDs stay bright as long as the button is pressed, and when it is released, the corresponding LEDs fade away. It was a bit hard to achieve this in code, because the Arduino keeps running its loop after it is powered up, so I had to introduce variables that remember the state of the lights between passes through the loop.
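A minimal sketch of that fading logic for one set, assuming the button is on pin 2 with a pull-down resistor and the set’s six LEDs are driven together from PWM pin 9 (the pins and step sizes are illustrative):

```cpp
const int buttonPin = 2;
const int ledPin = 9;
int brightness = 0;  // state variable that survives each pass through loop()

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (digitalRead(buttonPin) == HIGH) {
    brightness = min(brightness + 5, 255);  // fade up while the grass is held down
  } else {
    brightness = max(brightness - 5, 0);    // fade away once it is released
  }
  analogWrite(ledPin, brightness);
  delay(15);  // controls how slow the fade feels
}
```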

Getting the Arduino to play the sound of a harp was also an important yet frustrating step. To play such a sound, we would have to make the Arduino read external sound files. There are two options for this: store the file in the Arduino’s internal memory, or use an SD card as storage. We first tried the internal storage, but it was too small to hold even two sound files, while we needed five. As for the SD card plan, we tried two kinds of SD modules, and neither worked. With the DF Robot module, the files were read successfully, but the volume was so low that you had to hold the speaker to your ear to hear anything, and it didn’t improve even with an amplifier circuit. We then tried an MP3 SD card module, which uses a headphone jack to connect to the speaker instead of a pin connection. Unfortunately, this option also failed, because the module can only read one file from the directory at a time, while we needed to play multiple tones at once. We ended up going back to the Stereo Enclosed Speaker Set and playing tones with the tone() function.
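For reference, here is roughly what the fallback looked like in code: each grass switch triggers one pitch from C to G through tone(). The pin assignments and the choice of octave are assumptions for illustration.

```cpp
const int numNotes = 5;
const int switchPins[numNotes] = {2, 3, 4, 5, 6};             // one switch per grass set
const int frequencies[numNotes] = {262, 294, 330, 349, 392};  // C4 D4 E4 F4 G4
const int speakerPin = 9;

void setup() {
  for (int i = 0; i < numNotes; i++) {
    pinMode(switchPins[i], INPUT);
  }
}

void loop() {
  for (int i = 0; i < numNotes; i++) {
    if (digitalRead(switchPins[i]) == HIGH) {
      tone(speakerPin, frequencies[i], 200);  // play this note for 200 ms
    }
  }
  delay(20);
}
```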

SD Module from DF Robot

Moving the LEDs onto the amplified circuit was also an important step. Because we needed to power thirty LEDs, the 5V supply from the Arduino alone was far from enough, so an amplified circuit was needed. After the circuit was changed, the LEDs wouldn’t light up when the buttons were pressed. Nick helped me out by reconnecting everything in the circuit; it turned out I had made a wrong connection because of the messy wiring.

Amplified Circuit for Speakers and LEDs

After the struggles above, we finally reached some success. Before user testing, we assembled two sets of LEDs that used a physical button to trigger each unit on and off. When the user pushes or shakes the grass, the pillar under the platform holding the optical fibers presses the button beneath it, and a series of things happens subsequently.

Another failed yet interesting step came after user testing, when Nick suggested we use conductive tape as switches, turning a set on when the breadboard touched the shield. Strangely, the LEDs would light up even when there was no physical contact between the pieces of tape. After testing and thinking for a morning, it turned out this was caused by electromagnetic induction: one part of the tape was connected to 5V and circled the breadboard, and the other part circled the shield. When the breadboard inside the shield moved, the changing magnetic flux induced enough voltage for the digital pin to receive a false signal that the switch had been turned on, and all the light and sound would follow.

On top of all this, we kept encountering 3D printing failures throughout the process. Even after the first and second layers printed successfully, there was still a high chance the print would fail.

Failed Printing

During the user testing session, some users had no idea how to interact with the grass. The common scene was the grass being very insensitive to users’ pushing and shaking; in fact, users had to push the grass all the way down to trigger the button sets. Some asked what the optical fibers were for, and a lot of users asked whether there would be other colors for the LEDs.

As for the insensitive switches, we consulted Nick for an alternative. He offered the method of using conductive tape as switches: when the user pushes the grass, the whole breadboard moves, and the conductive tape on the breadboard makes contact with the conductive tape on the surrounding areas of the shield. After running into the difficulties mentioned above, we cut the tape down to one small piece stuck to one side of the breadboard and the inner part of the shield. This switch was definitely more sensitive than the physical button, but because we were moving the whole breadboard with the optical fibers and everything on it, the set always had trouble repositioning itself correctly. In other words, it became too sensitive.

As for people asking what the grass meant, we thought it was because the light was not traveling through the optical fibers to their tips. We first removed the glue covering the bottom of the fibers. Then, to direct the LED light precisely into the bottom of the optical fibers, we 3D printed sockets to line up the bottom of the fibers with the top of the LEDs. This addressed the problem successfully.

Apart from the suggestions received during user testing, there were also numerous design changes made during production. A significant one involved the platform holding the optical fibers and the number of LEDs. To attain the best visual effect, we originally planned to use nine LEDs in each set and arrange them in a circle. But it turned out this would use too much optical fiber, and the circuit would be too complicated to assemble on a breadboard. Thus the number of LEDs was reduced to six, positioned in parallel.

Schematic for Nine LEDs
Holder Prototype

There were also several choices for the material of the platforms holding the optical fibers. We originally planned to 3D print these platforms, but given the prototypes we printed, 3D printing wasn’t efficient and the result didn’t look well designed. We therefore switched to laser cutting and put two cut pieces together to form one platform.

Grass Holder (Laser Cut)

Conclusion

The goal of my project, as mentioned above, was to create an instrument that lets users perceive music not only through sound but also through seeing and touching. It aligns with my definition of interaction because the interaction involves a clear process: the user provides input through physical action, and the device presents its output in multiple ways. Most importantly, users were amazed by the beautiful visual effect while they were trying to make sound with the device. Where the project failed to align was that it is not easy to interact with: because the switches are usually too sensitive, the user has to reposition the whole breadboard set after interacting, which is a huge inconvenience. And as some users mentioned after the presentation, one can hardly produce real music with the device. In the end, the audience could not remember which direction to push each patch of grass, as each set is triggered in a different direction, and without knowing some “tips” on repositioning the grass, some sets kept going off after the user’s operation. If I had time to improve this project, I would have each tone controlled by its own Arduino with a speaker and LED set, and I would make seven of them to cover an octave. What’s more, I would replace the conductive tape with a joystick: whenever the joystick is away from its original position, it would send the signal that triggers the following actions.

My biggest takeaway from my failures is that “功夫不负有心人，铁杵磨成针” (progress comes with lots of trial and error). As for my accomplishments, I have learned to deal with malfunctioning 3D printers, learned that the Arduino is poorly suited to playing sound files, learned the differences between different Arduino boards, and so on.

You Will Never Catch Me Mom! Justin Wu (Marcela)

Before this midterm project, we all had to participate in another group project. In that initial group project, we created a futuristic product that could help users identify skin issues while also offering different product samples for testing. That project, named “iMirror,” required a human user to interact with the artificial intelligence inside the mirror. Watching my group mates demonstrate the interaction, with one peer talking to another peer behind the mirror (acting as the AI), pushed me to further my understanding of interaction. Initially, I understood interaction as two parties bouncing ideas, theories, and movements off of each other and acting on one another’s decisions, but after the group project, I came to believe interaction should also create a (1+1>2) effect: it should create a bigger impact than simple addition and bring something new to the table. On that note, we decided not to recreate any of our previous group projects and instead to recreate a childhood memory we share. Our new project and concept differ from the other projects in that they do not pay tribute to our contemporary lifestyle or an imagined future; instead, they pay homage to a memory we all share. Our project is meant for everyone, and users (the targeted audience) will be able to relive what it meant to play the late-night hide-and-seek game with our parents in order to stay up late to watch television shows, play games, or do anything else we were not allowed to do.

How we envisioned our project

   

After concluding that our project should be built around nostalgia, we quickly drafted different plans. We finally settled on the idea of mimicking a kid trying to stay up late for a variety of reasons. As kids, we always wanted to play an extra hour of video games, watch more cartoons, or play with our toys for longer, but our parents always made us sleep early. Any kid in this situation would be very frustrated by their parents checking on them at night, so for this project we wanted to implement an automatic light sensor to help the kid stay up later. It is a simple idea that most of us did not have as kids, but we wanted to recreate those nervous memories. To use the light sensor, we designed our project as a long hallway with an LED light between the two bedrooms. We also decided to use a Lego character to give the model a three-dimensional feel. Because we chose the Lego character, we made a Lego handle (that the character stands on) which users use to walk the mom to the kid’s room. For that, we needed a thin material we could laser cut into the outline of the handle. After going through the available materials, we chose the 3-millimeter wood panels so the laser cutting process would be more efficient; the wood also provided a more homely feeling than the acrylic panels. On makercase.com, we decided to make an open box with finger-edge joints instead of flat or T-slot joints, so we would have more flexibility in how we used the case after cutting. The finger-edge joints also let us combine the foundations of our house by interlocking the joints.

Laser cutting finger edge joint boxes

The most physically demanding part of our midterm project was the fabrication and production process, which required precise measurements and meticulous planning. During the User Testing Session, although we had a working prototype, we had not done any fabrication, and many users pointed out that the vague directions and rough design detracted from their experience. Therefore, after the session, we immediately started planning a presentable project and created a list of objectives. First, our initial prototype was constructed from paper and plastic and was not a neat design. Following the session, I consulted one of the teaching assistants and decided to use makercase.com to help build our house, since it was an easy way to work out the house’s configuration. Using makercase.com gave us many fabrication options. After figuring out the dimensions of the house, I consulted my groupmates Roger and Julie about the configuration. We realized that, to improve our project, we needed a complete design that could accomplish several objectives: first, a two-story house that stores the Arduino and breadboards on the first level with the bedrooms stacked on top; and second, measurements specific enough for our Lego figurine to move freely. These ideas combined to help us create the newly constructed house.

Sketch of new design

 

Second, we had to address the manual reset problem. During the testing session, many users were confused by our need to reset the project manually; we were also frustrated at ourselves for not thinking of it beforehand. Many users would try to move the mum to the kid’s bedroom again before we could manually reset the Arduino. We needed a way to automatically reset the Arduino whenever the mum walks back to her bedroom, for a coherent experience. With the help of the teaching assistants, we decided to create different stages that signify different parts of the experience. The first stage is when the mum is in her bedroom, before the light in the hallway: everything is calm and peaceful. The second stage is when the mum blocks the LED light; this triggers our code, turning all the lights off and making the kid fall back into his bed. The third stage is when the mum reaches the kid’s door to check on him. The final stage is when the mum walks past the LED light to return to her bedroom; stage four triggers our code to return to stage one and automatically reset the whole project (a sketch of this stage logic follows the image below).

Updated model with automatic reset
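Below is a minimal sketch of how this four-stage logic could be coded, assuming a photoresistor under the hallway LED on A0, the kid’s room light on pin 6, and a servo on pin 9 that drops the kid back into bed; the threshold and pins are illustrative rather than our exact values.

```cpp
#include <Servo.h>

const int lightSensorPin = A0;
const int roomLightPin = 6;
const int shadowThreshold = 300;  // reading drops below this when the mum blocks the LED
Servo kidServo;
int stage = 1;  // 1: calm, 2: mum heading over, 3: at the door, 4: heading back

bool hallwayBlocked() {
  return analogRead(lightSensorPin) < shadowThreshold;
}

void setup() {
  pinMode(roomLightPin, OUTPUT);
  kidServo.attach(9);
  digitalWrite(roomLightPin, HIGH);  // stage 1: kid is happily awake
  kidServo.write(90);                // kid sitting up
}

void loop() {
  if (stage == 1 && hallwayBlocked()) {
    stage = 2;
    digitalWrite(roomLightPin, LOW);  // lights off
    kidServo.write(0);                // kid dives back into bed
  } else if (stage == 2 && !hallwayBlocked()) {
    stage = 3;  // mum has passed the light and is at the kid's door
  } else if (stage == 3 && hallwayBlocked()) {
    stage = 4;  // mum is crossing back past the LED
  } else if (stage == 4 && !hallwayBlocked()) {
    stage = 1;  // mum is back in her room: reset everything
    digitalWrite(roomLightPin, HIGH);
    kidServo.write(90);
  }
  delay(20);
}
```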

In general, with this midterm project we tried to recreate a fraction of our childhood memories by recalling how we all tried to dodge our parents’ supervision. Our project incorporates interaction in that users not only react differently to the different stages but also get an unexpected sense of nostalgia. The project aligns with my sense of interaction because, in addition to enjoying the experience, users come across the (1+1>2) effect once they recall acting the same way when they were children. During our group’s short presentation, many people were fascinated by our idea and were eager to test how we incorporated our lives into a real-life model. It was incredibly relieving to see our audience not only ask how we made the project but also about the intricate details behind it. It was also rewarding to see people resonate with our motivation, because we managed to make our users think back to their own childhood memories. On that note, if we had more time, I would try to create a losing scenario for the kid: in our current model, the kid dodges the mother’s checkup every trial, but it would be more realistic if the kid could get caught, because we have all been caught before. During these two weeks, our group went through a series of highs and lows, and it was both rewarding and punishing to be there for all of it. I took away the importance of splitting tasks according to our different areas of expertise: as a three-person group, we were able to divide and conquer, with each of us handling different tasks efficiently, which helped expedite the process.
Most importantly, I also learned the importance of staying patient. During our fabrication session, the laser cutting machine stopped working, and we let our emotions get the better of us; I started to panic and wonder whether we would be able to complete our project. However, with the help of Leon, we got the laser cutting machine working again and resumed our fabrication process. In short, our midterm group project brought us back to our childhood memories, only this time it was a lot more demanding and challenging to create them. Although it might seem like just another model, I believe our users should care about this experience because it displays not only detailed planning but also noteworthy programming and coding skills. Most importantly, this model provides an experience that coincides with everyone’s early days.

Final Project