During the group and midterm projects, I defined interaction as a “back and forth between the user and the machine … it has to respond to function with its own unique reaction, separate from simply the commands fed into it.” However, now that I’ve had experience not only designing but actually creating an interactive product, my definition of interaction focuses much more on user experience: how does the interaction feel between the user and the technology, and does the interaction have its own autonomy in responding and recalling? User experience is something I neglected in my original definition, and I felt that showed in my midterm project: the bird was technically interactive, but the interaction still felt cold and very impersonal. I hope to focus more on interaction’s capability for expression going into the final project.
Looking at this new focus in my definition of interaction, I looked at two projects. One that I thought was super cool was Nota Bene’s In Order to Control. Unique phrases relating to legality and morality are projected onto the floor, and when any random person steps on them, the piece projects a silhouette of that person composed of the words or phrases they are standing on. It is meant to emphasize how slippery morality and ethics can be, and I feel it aligns with my new definition of interactivity because the interaction feels meaningful and there is an articulated message. Often, interactivity can feel meaningless and confusing; it is only when there is a purposeful attempt at expression that the interaction feels less robotic.
On the flip side, one project that does not align with my new definition of interaction, but would maybe have aligned with my old understanding, is “Sleep Art” by Ibis. In this project, a chain of Ibis hotels teamed up with ACNE Production and BETC Paris to create a machine that records data on your snoring and sleep movement and turns it into a painting based on your unique motion. Back during midterms, I would have called this interaction because there is a clear communication and response, and the response is unique rather than a repeatedly inserted command: there is a clear output shown through the art. What does not align is the form and purpose it takes: the interaction ultimately feels meaningless and confusing because the expression of the interaction is lacking. The machine feels cold and distant from the user, and without a clear and coherent message the resulting painting feels very random and machine-like as well.
Given everything discussed, I would propose that my new definition of interaction is a purposeful and well-communicated conversation between user and technology. This definition still adheres to Chris Crawford’s understanding of interaction in “The Art of Interactive Design” as “a cyclic process in which two actors alternately listen, think, and speak.” But instead of focusing solely on the “cyclic” nature of the process, I will focus more on the alternating conversation that goes on between user and machine.
For this recitation I chose a piece by Vera Molnar: it is a little chaotic to look at, and yet its strange sense of order appealed to me, so I decided to replicate it in Processing. It also looked simple enough to make, as it was composed mostly of squares.
I wanted to use multiple, similarly sized red squares that overlapped one another. I started out using quad(), because I thought that freedom would give me more room to alter the shape of the squares the way Molnar did in the photo. But not only did it get really annoying to plug in each individual coordinate for every square, an IMA fellow also helped me realize that rect() would be a lot easier to use and repeat. So ultimately I used rect(), stroke(), rotate(), fill() with RGB values, and color() to replicate the drawing.
It linked to the same motif of order through chaos that Molnar was trying to illustrate, because the part I did make was a little messy and random to look at. The alterations I made between squares were random, some by mistake, since some of the squares turned out less like squares and more like irregular quadrilaterals, and I also rotated them randomly. It is different because I feel Molnar definitely had more technique: even though the point of the drawing is to illustrate a sense of shapelessness and frenzy, there is a precise method to how the shapes in her square drawings are formed.
I feel Processing is a good way to recreate this drawing, because there is something robotic about the image: the same frenzied pattern repeated so often, yet in such a systematic manner. However, I wish I knew how to make the squares draw on top of one another repeatedly, with slightly different iterations, without having to write the code for each individual square. With more knowledge and experience in Processing, I could see it being an effective way of recreating Molnar’s drawing.
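For reference, a loop-based version might look something like the sketch below. This is only a rough sketch of the idea: the canvas size, number of squares, size range, tilt range, and color are guesses, not the values from my actual recitation drawing.

// A minimal sketch of the loop approach I wish I had known, not my actual
// recitation code. Every value here is a guess.
size(600, 600);
background(255);
noFill();
stroke(200, 30, 30); // red outlines, roughly like Molnar's squares

for (int i = 0; i < 40; i++) {
  float x = random(50, width - 50);  // random center position
  float y = random(50, height - 50);
  float s = random(60, 100);         // similar but not identical sizes
  pushMatrix();
  translate(x, y);
  rotate(radians(random(-15, 15)));  // slight random tilt per square
  rect(-s / 2, -s / 2, s, s);
  popMatrix();
}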
The purpose of this recitation is to apply this week’s lecture on electricity and circuitry, and to practice attaching components for sound and control. The goal is to build three circuits: one with a “doorbell” (a buzzer and a switch), one with a lamp, and one with a lamp that can be dimmed.
Components:
1 * Buzzer: creates sound when electric current flows through it.
1 * 100 nF capacitor: stores electric charge.
1 * Switch: makes or breaks the connection in the circuit.
1 * LM7805 voltage regulator: regulates the voltage level; used here to step 12V down to 5V.
1 * 12-volt power supply: provides 12V of power for the circuit.
1 * Breadboard: a base board on which to build the circuit.
Several jumper cables: connect the components in the circuit.
1 * Barrel jack: connects the power supply to the circuit.
Step 1: Doorbell
Building circuit 1 was actually not as hard as I initially thought: I just followed the diagram and it was relatively straightforward. The doorbell really helped me understand how to read an electrical diagram, as before this I couldn’t really tell what the symbol for a resistor was or where to connect the circuit to ground.
Here is a picture of the circuit:
And here is the video:
Step 2: Lamp
The lamp was also not hard, since it was a fairly simple circuit: similar to the doorbell, except instead of a buzzer it used an LED with a resistor.
Here is a picture of the circuit:
Here is a video of the working circuit:
And here is a video of the circuit with the arcade button instead of the simple pushbutton.
Step 3: Dimmable Lamp
The dimmable lamp is where I ran into a little trouble, because I didn’t know how to position the potentiometer. It turned out I had turned it in a way that routed the current for power to both the power and ground sides, so ground had nothing to connect to. After asking a fellow for help, I figured out how to turn the potentiometer the right way and got the circuit to work.
Here is a picture of the circuit below:
And here is a video of the circuit:
Documentation Questions:
I think these circuits are interactive to the extent that there is “input, processing, and output.” For the doorbell, the lamp, and the dimmable lamp, the user interacted by pushing a button or adjusting the brightness, and there was clear processing and a visible sign of the output. However, I feel the interaction is too direct to truly be communication between user and product, which is what I think true interactivity is. For example, it would be cooler interaction if the light flashed and turned on at the same time, or if the doorbell made different sounds based on the number of times I pressed it. Interaction has to be unique between the user and the product to really be interaction, and the interaction with these circuits is too straightforward.
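As a thought experiment, the doorbell idea might look like the sketch below if the button and buzzer were moved onto an Arduino. These recitation circuits had no microcontroller, so this is purely hypothetical, and the pins and tone values are made up.

// Hypothetical sketch only, not part of this recitation: imagines the button
// and buzzer on an Arduino so the tone changes with the number of presses.
const int buttonPin = 2;   // assumed button pin
const int buzzerPin = 8;   // assumed buzzer pin
int pressCount = 0;
int lastButtonState = HIGH;

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);  // button wired between pin 2 and ground
}

void loop() {
  int buttonState = digitalRead(buttonPin);
  // Detect a new press (HIGH -> LOW with the internal pull-up).
  if (buttonState == LOW && lastButtonState == HIGH) {
    pressCount++;
    // Raise the pitch a little with every press so each ring sounds different.
    tone(buzzerPin, 440 + pressCount * 60, 200);
    delay(250);  // crude debounce, and time for the tone to finish
  }
  lastButtonState = buttonState;
}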
Physical computing is greatly important to interactive art, because interaction is how the user engages with the art, and physical computing determines how it responds. Zach Lieberman said that physical computing opens up art to people who otherwise wouldn’t have access to it, such as people with disabilities, and I think that is really unique and cool: while interactive design isn’t strictly art so much as code, it allows people to create art through computing, which connects science and art in a really interesting way.
My previous group project, “Dreamie Beanie,” taught me the importance of prototypes. Since back then we did not have to make the prototype actually work, managing our expectations despite having no technical limitations taught me how to be more realistic going into the midterm project. I had to brainstorm with four other people about how to technically make the project work, even though we could use our imagination to recreate the functionality: for example, despite only having to make a beanie and a cardboard TV, we still had to figure out how, hypothetically, the beanie would connect to the monitor and to the user’s brain. Working with four other group members also taught me how to compromise on my expectations with my teammates so we could make a working project.
The last group project taught me that interaction requires repeated feedback, so our midterm project “Briend,” or “Bird Friend,” focused a lot on that: the bird was supposed to respond to user interaction not only through a simple command but through repeated actions from the user. Briend was created as a spin on the “emotional support animal” people have today, except electronic. Our project was mainly intended for kids, or anyone who gets anxious without a companion and could pet it to receive some kind of validation or comfort. I feel the responses the bird could give simulate the kind of comfort you would get from petting a real bird or a real dog: we wanted to make it as lifelike as possible, which is why I felt our audience would appreciate the product.
CONCEPTION AND DESIGN:
In what ways did your understanding of how your users were going to interact with your project inform certain design decisions you made? Account for several such decisions. In addition, what form, material, or elements did you use? What criteria did you use to select those materials? Why was it/were they best suited to your project purpose, more than any other one/s? What might have been another option? But you rejected it: why?
If we are being honest, our first design conception was a flower. We were going to attach a light sensor to it so that it would open and close based on how bright the light was. At first we were going to 3D print the flowers and run wires from each petal to a main stem, so the main mechanism would push the flower open and closed. This design was based on one we found on the internet, and we decided it would be easier if we replaced the metal hinges with straws.
Here are some of the initial drawings we made in designing the flower:
We wanted to create a design that was simple, polygonal, and easy to 3D print, yet sturdy enough to withstand the swinging motion needed to make the flower move. Here is the mechanism we wanted to recreate, a design we got from the original website.
However, when our plan had to change (as I will detail in the Fabrication and Production section of this post), we wanted to create a bird design that was simple and resembled a music box (we initially wanted the bird to play music when a person came near it). We came up with the music-box design firstly because it would be easy to laser cut, and secondly because music boxes gave us a lot of comfort when we were younger and we wanted to recreate that feeling through a singing bird.
We also wanted to keep the same up-and-down mechanism we had planned for the flower, but this time it would be much simpler because it would only control two “petals,” or wings, instead of six interlocking petals that had to open and close smoothly at the same time. Other than the box shape, we wanted the bird to be as lifelike as possible, and we wanted it to light up to bring a visual sense of happiness when the user interacts with it. We wanted to use feathers because they would make it seem more like a real bird. Here are some initial drawings and designs for both the laser-cut box and the bird itself.
FABRICATION AND PRODUCTION:
In this section, describe and assess the most significant steps in your production process, both in terms of failures and successes. What happened during the User Testing Session? How did your user testing process influence some of your following production decisions? What kind of adaptations did you make? Were they effective? In short, keeping in mind your project goals, how do you account for and justify the various production choices you made for your project? Include sketches and drawings.
Our biggest failure, as mentioned above, was having to completely scrap the flower idea. For two days of the week before the midterm was due, we were 3D-modeling the flower petals and trying to make the stem out of wire. The first flower petal turned out way too big, and the 3D printer could not print it. The second one, however, wasn’t the right shape for our hinges to attach to, as shown below.
And when we realized we still didn’t fully understand how the mechanism worked, we tried to recreate the mechanism using cardboard instead. It did not work and we could not even get the wires attached to the motor.
Once the flower wouldn’t attach to the motor we fully gave up and asked Rudi for help. In helping us try to recreate the mechanism of the original flower, Rudi showed us a much simpler way to make the up-and-down movement: by taping straws to cardboard and having the motor push a wire through each straw.
Here is the video of our first prototype:
The motors moving two wires through straws gave us the idea for a bird box, in which the two moving wires would be the wings. Once we finally had the main mechanism, we went on to design the actual box it would sit in.
We decided to make the bird box smaller, firstly so the wings would move more fluidly, and secondly because it would be more convenient to carry. The box ended up being 13x10x10 cm, and we laser cut a bird design, pictured below:
We wanted the wings to be light and small, yet big enough that the up-and-down motion would be visible. We still used the wires, bending two big loops around 5 cm in length and gluing feathers onto them. We attached the wings to the motors, and at first, when the bird moved, the wires kept getting caught on each other. We realized we had to move the wires closer to the middle so they wouldn’t tangle, and we also had to make the motor move faster: while we wouldn’t get the range of motion we had before, the speed would make up for it. Ultimately, we got the bird to make an up-and-down motion triggered by the touch sensor.
The bird box shown above was what we brought to the initial user testing. Making the bird flap its wings in time with the LED lights took us all week, so we did not have enough time to figure out how to make the bird sing or chirp. During user testing, much of the feedback had to do with interactivity: the bird was too robotic, the touch sensor was visible, and all the bird would do was flap its wings in response to the user’s press. Despite our cutting holes in the box for the LED lights, the lights did not show through clearly at all, so really it just looked like the bird flapped its wings once the user touched it. One piece of feedback we really liked was to use the feathers to cover the wings and instead add an arrow or a sign that said “Pet me!”, which would make it more toy-like. Another was to have the bird react in some way when you came near it, like chirping or flapping its wings faster. We implemented those two comments into the design after user testing.
Firstly, we used a smaller breadboard and attached the LED lights to the top of the bird instead of putting them inside. Secondly, we used toneMelody in combination with the distance sensor to create a bird chirping sound: we wanted to create a song but did not have enough notes or memory for it, so we decided on a two-note chirp instead. We also added a face to the bird so it would look a lot cuter: if you put a face on anything, a human being will attribute emotions and feelings to it, so we felt it made the bird look a lot more personable.
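The chirp logic was roughly along the lines of the sketch below. This is a reconstruction rather than our exact midterm code: the pin numbers, the distance threshold, and the two note frequencies are assumptions, and I am assuming an ultrasonic-style distance sensor.

// Rough reconstruction of the two-note chirp, not our exact midterm code.
// Assumes an ultrasonic distance sensor and a buzzer on made-up pins.
const int trigPin = 9;    // assumed sensor trigger pin
const int echoPin = 10;   // assumed sensor echo pin
const int buzzerPin = 8;  // assumed buzzer pin

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Send a short pulse and measure how long the echo takes to return.
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  float distanceCm = duration * 0.034 / 2.0;  // convert echo time to centimeters

  // If someone is close enough, play a simple two-note chirp.
  if (distanceCm > 0 && distanceCm < 20) {
    tone(buzzerPin, 2093, 120);  // first, lower chirp note
    delay(150);
    tone(buzzerPin, 2637, 120);  // second, higher chirp note
    delay(150);
    noTone(buzzerPin);
  }
  delay(100);
}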
The final video product is shown below:
CONCLUSIONS:
Begin your conclusion by restating the goals of your project. How do your project results align with your definition of interaction? How do your project results not align with your definition of interaction? Based on both your definition of interaction and your expectation of your audience’s response, how, ultimately, did your audience interact with your project? How would you improve your project if you had more time? What of value have you learned from any setbacks and/or failures? What do you take away from your accomplishments?
The wrap-up of your conclusion delivers your final say on the issues you have raised previously, demonstrates the importance of your ideas, and insightfully synthesizes your thoughts about key elements of the project experience. As such, your conclusion should close by reflecting on the crucial questions “So what?” and “Why should anybody care?”
Ultimately, our goal was to create continuous interaction with the user: we didn’t want a simple computer that would just take in commands, and we felt we achieved that. The bird responded in a lifelike manner, and the interaction didn’t stop at a simple command such as “on” or “off.” However, we still felt we could have done more with the motion of the wings and with the sound of the chirp. To make the project more lifelike and more aligned with our definition of interaction as continuous communication, we would have made the wings move faster if the bird was “happier,” or if it was petted more often. We also would have changed the tone so it sounded less robotic and varied it depending on the bird’s “mood”: if you approached the bird from behind it would make a scared noise, but if you petted it it would make a happy sound.
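For the wing-speed idea, a hypothetical sketch might look something like the one below. None of this is from our actual project: it assumes a servo drives the wings, a touch sensor that reads as a simple digital input, and invented pins and timing values.

// Hypothetical improvement, not code from Briend: flap faster the more often
// the bird has been petted recently. Pins and values are made up.
#include <Servo.h>

Servo wingServo;
const int touchPin = 2;
int recentPets = 0;
unsigned long lastPet = 0;

void setup() {
  wingServo.attach(9);
  pinMode(touchPin, INPUT);
}

void loop() {
  if (digitalRead(touchPin) == HIGH) {
    recentPets++;
    lastPet = millis();
    delay(200);  // crude debounce for the touch reading
  }
  // The bird calms down if it hasn't been petted for a while.
  if (millis() - lastPet > 5000) {
    recentPets = 0;
  }
  if (recentPets > 0) {
    // More recent pets -> shorter pause between flaps -> faster wings.
    int flapDelay = max(80, 400 - recentPets * 40);
    wingServo.write(60);   // wings down
    delay(flapDelay);
    wingServo.write(120);  // wings up
    delay(flapDelay);
  }
}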
At first, during user testing, the audience’s interactions with the project were very robotic: they simply touched the touch sensor and the bird responded, with no sound or visual cues for their interaction, so the interaction felt a little cold. However, once we added the feathers, the lights on top, and the sound, interaction with the project felt a lot more fluid, and users felt it became more lifelike with the changes. This project really taught me how to manage my expectations: at first my partner and I wanted to make a fully functioning robotic flower, and it wasn’t until Tuesday night, when we had to change our entire project, that we realized we couldn’t build it with just the knowledge we had. But I feel we adapted really well to that failure, kept core aspects of our original project, and made it simpler and more unique based on our experience. This project really taught me how to adapt from failure to make a cohesive product.
Overall, this project taught me a lot: it taught me how to engineer a product not only on the software side but also on the hardware side, ensuring each piece could work with the other. It also taught me that it is important to adapt from failure and use that failure to expand creatively, and that there is never really an end to any project. Even when I was relieved that we had finally made the bird flap its wings up and down, at user testing people were still dissatisfied and had a lot of good feedback on how we could continue to improve the bird. My experience with the midterm project taught me how to continuously improve, and for that I find this project very valuable.
In this recitation, we continued exploring Arduino functions with motors by playing with higher voltages. We had to build two circuits and one drawing machine, of which I was only able to finish the two circuits. We switched out the servo motor for a higher-powered stepper motor, which gave us more potential to drive heavier and faster objects with the Arduino.
Components:
1 * 42STH33-0404AC stepper motor
1 * SN754410NE ic chip
1 * power jack
1 * 12 VDC power supply
1 * Arduino kit and its contents
2 * Laser-cut short arms
2 * Laser-cut long arms
1 * Laser-cut motor holder
2 * 3D printed motor coupling
5 * Paper Fasteners
1 * Pen that fits the laser-cut mechanisms
Paper
Step 1: Build the Circuit
Despite my initial difficulty understanding what went where in the first circuit, building it actually wasn’t a big problem. I just followed the diagram and plugged everything into the rows it was supposed to go in. I asked a fellow to double-check the circuit so that I wouldn’t fry my computer or the Arduino board, and then, by pasting in the code from the Arduino website, I was able to make the motor do one full rotation.
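The code was essentially the Stepper library’s one-revolution example from the Arduino site; a simplified version is below. The driver pins are my best guess at the wiring rather than a record of the exact circuit, and I am assuming the motor’s standard 200 steps per revolution.

// Adapted from the Stepper library's one-revolution example; pins 8-11 going
// to the SN754410NE driver are assumptions about my wiring.
#include <Stepper.h>

const int stepsPerRevolution = 200;  // assumed steps per revolution for this motor

Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);

void setup() {
  myStepper.setSpeed(60);  // 60 rpm
}

void loop() {
  myStepper.step(stepsPerRevolution);  // one full rotation
  delay(500);
}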
The video is attached below:
Step 2: Control Rotation with a Potentiometer
It was in the second step that I started to run into problems. Firstly, I had a little trouble attaching the potentiometer to the circuit, because at first its ground and power were not connected properly, and I had to look through all the circuit wires to understand what I did wrong. The code was rather straightforward: I just changed the values to reflect the pin numbers on my circuit and changed the stepper speed and rotation to follow the potentiometer.
However, when I was trying to reconnect the potentiometer to the power supply, I pretty dumbly assumed that the IC chip would be the power source, rather than the power line I had connected to the Arduino. Once I connected the potentiometer to the IC chip, which was in turn connected to the Arduino, the Arduino immediately started to smoke, and even though I yanked the connection out as fast as I could, the Arduino was damaged and would not work.
At first I wasn’t sure if it was my code giving me the error messages, so I asked a fellow for help, and it turned out that when the Arduino started smoking it lost its ability to connect to my laptop. A fellow and I worked for a while to try to fix it, only to confirm that the IC chip was fine but the board was fried and would have to be repaired later. So I borrowed an Arduino board from the Equipment Room, checked the circuit, connected the right power supply, checked the code, and once I did that the circuit worked.
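The working step 2 code was roughly along the lines of the Arduino MotorKnob example, sketched below. The analog pin and driver pins are assumptions about my wiring, not the exact values I used.

// Roughly the MotorKnob approach: the stepper follows the potentiometer.
// Pins A0 and 8-11 are assumptions about my wiring.
#include <Stepper.h>

const int stepsPerRevolution = 200;
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);

int previous = 0;  // last potentiometer reading

void setup() {
  myStepper.setSpeed(30);
}

void loop() {
  int val = analogRead(A0);        // read the potentiometer (0-1023)
  myStepper.step(val - previous);  // move by however much the knob changed
  previous = val;
}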
Here is a video below:
Unfortunately, completing the second circuit took up all the time in recitation. Going forward, to prevent something like this from happening again, I will make sure to triple-check with the fellows anything I am unsure about, such as where the power line should go. Especially with such high voltages, it is dangerous to make mistakes and carry on when you are unsure, and next time I will not only consult the circuit photo more closely but also feel more comfortable reaching out to the fellows and professors when I am unsure of what I am doing.
Documentation Questions
I would like to build a machine that could make food based on a certain emotion the user is feeling. I think that would be a really cool kind of digital art, especially if the food made were expressive in some way, probably in terms of color or the nostalgia factor of different recipes. The actuators used would be really interesting, because cooking requires precise yet deft movement, and I have yet to use any in class that can do both really well. So I would be curious to see how an actuator could be programmed to perform not only the finer details of cutting food or measuring ingredients, but also bigger motions like swinging a pan up and down to stir-fry. The main sense engaged would be smell, as I feel smell in combination with emotion would be an interesting pairing of human responses for shaping the “art,” since the food is meant to create a sense of nostalgia or match a very specific emotion from the user.
What immediately caught my eye is Chico MacMurtrie’s The Drumming and Drawing Subhuman, mostly because it looks really cool, but also because I think the idea of actuators being used to create human-like qualities of struggle and emotion is really interesting. It is similar to what I did in recitation partly because the skeleton looks like it is on fire and melting (haha), but also because, like the motors I used in recitation, there is something very human about the clunkiness of both. An actuator’s movements can struggle enough that MacMurtrie used that effect to make organic drumming motions. According to the artist, the motions had to be primal: struggling to stand, attempting to hit a drum. The actuators have to swing through a large range but cannot function fast or that well. The emotionality of the movements, set against the coldness of the skeletal robots, evokes a powerful reaction from the viewer.