For this week’s recitation, the basics of Processing were explored through the creation of a design based on another artist’s previous artwork, for which I had the initial idea of creating a piece that was very abstract in its colors and shapes.
The image I selected is called AII (Construction AII), and was created by László Moholy-Nagy in 1924, with the intention of creating an architectural design in space, seemingly as if it were constructed by a mathematical formula. I chose this artwork with the purpose of expanding the concept of intersecting shapes into a more 3D plane, by utilizing the same shapes but with many more variations and overlapping layers of different opacities in order to create a shadow-like effect within the artwork. Nevertheless, the idea was easier in my head than it was to actually input into Processing, as the code was very challenging for me at first.
Original Image
The major challenge for me was orienting myself and the drawing within the coordinates and scale of the canvas. Since I did not use simple shapes such as rect() and instead decided to draw with quad() due to the nature of the original image, I had difficulties identifying which corner corresponded to which coordinate. In fact, it took me nearly half of the recitation time to draw the central black polygon with the correct coordinates. Nevertheless, once I was able to place objects within the frame I had, the rest of the items were much easier to create. Colors and opacity, on the other hand, were not as complicated as I had imagined, except for having to find the correct tone of brown for the background. After various attempts at specifying the RGB values by hand, I decided to use the color selector tool within Processing for efficiency.
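For reference, here is a minimal Processing sketch of the kind of calls I was wrestling with: quad() takes its four corners as explicit (x, y) pairs in order, and fill() with a fourth argument sets the opacity. The coordinates and colors below are placeholders for illustration, not the ones from my actual drawing.
/******** start code ********/
void setup() {
  size(600, 600);
  background(94, 62, 41);   // approximate brown background, picked with the color selector
  noStroke();

  // Central polygon: quad() takes the four corners in order
  // (x1, y1), (x2, y2), (x3, y3), (x4, y4)
  fill(0);                  // solid black
  quad(220, 180, 400, 200, 380, 420, 200, 380);

  // Overlapping shape with a lower opacity to suggest a shadow
  fill(255, 255, 255, 80);  // white with alpha of about 80 out of 255
  quad(260, 240, 460, 260, 440, 480, 240, 440);
}
/******** end code ********/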
My Drawing
My final creation aligned with the original artist’s motif, as I also developed it around the idea of mathematical accuracy in the sizes of the geometric shapes in order to assemble the architectural-style piece. However, it differed from the original in that my creation attempted to encompass not only 2-dimensional measurements, but also 3-dimensional depth within the figures through shadows. Considering the nature of my idea, I consider that Processing was indeed the best way to achieve my goal, as the input of x,y-coordinates allowed my measurements to be more precise, and the geometric figures to resemble an architectural stance.
iCups: Connecting people through the smallest memories
Context and Significance
Long distance communication has been an aspect of human life that technology has tried to overcome, or at least improve, through the creation of wireless devices such as phones and smart watches that allow two individuals, regardless of their location, to contact each other. Nevertheless, after taking into account these significant advances, we came to consider that one aspect lacking from this communication is the deeper experience and feeling of true connection with the other person, through more than just speech and visual methods: through memories.
Such an experience has been developed before through products such as “Pillow Talk”, which was created by a UK design shop called Little Riot in order to ease the hardship of long distance love relationships by enabling communication through glowing pillows. The pillows are internet-enabled, so when one person goes to sleep, their pillow sends a signal over the internet and the other person’s pillow glows in response (Love, 2013). This process allows couples to virtually feel each other’s presence miles away, and wirelessly connect with one another while they sleep (Kelly, 2011).
It is this concept of virtual presence that has been further explored and developed through the creation of “iCups”; its significance goes beyond superficial talk to a deeper, in-the-moment connection with our loved ones. With these interactive cups, two individuals are able to feel each other’s presence through the simplest of everyday memories, such as drinking coffee. The product is intended for an audience that maintains long distance relationships with their relatives, and it offers the special value of making them feel close to each other. For example, in the case of a daughter who went off to college and misses her mother, with whom she used to drink coffee every afternoon, iCups lets them feel interconnected and reminded of the memories they shared, despite the distance.
Conception and Design
“Today, design has gone far beyond its simple origins as a craft to develop powerful new ways for people to interact with the world, emphasizing experience, not technology.” This conception of design, developed by jnd.com, has been applied to our product, as it encompasses the digitalization of everyday items to turn them into an interactive experience: feeling a loved one’s presence through the pleasant moments of drinking coffee during the day. Regarding the prototype itself, several iterations had to be created before finalizing on one. This was because elements such as the OLED displays and buttons had to be added to the cup, along with the temperature sensor, so that a response could exist between the two products. The original idea was to create a physical mug that you could drink coffee from directly. In the end, for practical reasons, the product we created was a 3D printed filament cup holder in the shape of a mug, which contained a stainless steel cup that captured heat. The reason we decided to use a stainless steel cup to hold the liquid and insert it into our mug-holder prototype was so that the temperature of the liquid inside could actually be detected by the temperature sensor placed in the bottom part of the cup holder. Another option for the prototype could have been a different material, such as cardboard, for the holder. However, we rejected it because it would have been too unstable for the wires, display, buttons, and even the cup itself. Also, if the holder ever got wet, a cardboard or weak filament prototype would have been completely ruined.
Fabrication and Production
Fabrication and production of the prototype was harder than expected. I would consider the most significant steps during the production process to be: first, grounding the idea to a specific aim; second, developing the code so the program works the way we want it to; and third, building the prototype that best serves the goals of the project. Regarding the physical fabrication, beyond accounting for the elements that had to fit inside the cup, comfort was another aspect that had to be addressed. After user testing, the prototype was modified so that the product actually looked like a cup, instead of a strange-looking object with openings for buttons; during user testing, many people who tried the object were unsure how to use it, since there was no handle, just a cup with a black screen and wires. Likewise, writing the code for the interaction was quite a challenge as well. Even though both displays initially worked perfectly on their own, when we tried to get them to intercommunicate while using delays in the code, they became faulty. The problems ranged from displays not turning on, or mixing up the letters that had to be shown, to not sensing the input and outputting the wrong information. In the end, we managed to get both cups to interact with one another roughly how we wanted: they did display text when the temperature was sensed, and the buttons were able to send a response, but the system was faulty in that it took too long for both displays to process the information, since the code contained too many delays. Regardless of the technical difficulties we had with the adaptations made to the code, the cup-holder design itself did become more successful through its changes and evolution, as it was more comfortable for users and gave them a clearer idea of how to use the product.
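As a rough sketch of the direction the code could take to reduce that lag, the loop below polls the sensors on a millis() timer instead of blocking with delay(). It assumes a TMP36-style analog temperature sensor on A0 and a reply button on pin 2, and it uses Serial prints as stand-ins for the OLED output and for the message sent to the other cup; the pins, threshold, and messages are placeholders rather than our exact wiring or code.
/******** start code ********/
// One cup's loop without blocking delay() calls:
// check the temperature and the reply button on a timer instead.

const int TEMP_PIN = A0;            // analog temperature sensor (assumed TMP36-style)
const int BUTTON_PIN = 2;           // reply button (assumed wiring, to ground)
const float HOT_THRESHOLD = 40.0;   // degrees C that count as "coffee poured" (assumed)

unsigned long lastCheck = 0;
const unsigned long CHECK_INTERVAL = 500;   // ms between sensor checks

void setup() {
  Serial.begin(9600);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  // Poll the temperature every CHECK_INTERVAL ms without stopping the loop
  if (millis() - lastCheck >= CHECK_INTERVAL) {
    lastCheck = millis();
    float voltage = analogRead(TEMP_PIN) * (5.0 / 1023.0);
    float tempC = (voltage - 0.5) * 100.0;            // TMP36 conversion
    if (tempC > HOT_THRESHOLD) {
      Serial.println("SEND: I'm having coffee!");     // stand-in for the message to cup 2
    }
  }

  // The button is read on every pass, so a reply is never missed
  if (digitalRead(BUTTON_PIN) == LOW) {
    Serial.println("REPLY: Wish I were there!");      // stand-in for the OLED response
  }
}
/******** end code ********/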
Significant advances in communication technology have been evident in the emergence of cellular phones such as the iPhone, which serve the purpose of connecting people with their relatives even when they are far away. Our project aligns directly not only with this concept of new communication technology that connects people through memories, but also evidently supports our definition of interaction, as there is a response to every action. When individual 1 fills cup 1, a temperature sensor is triggered and sends a message to cup 2, which lets individual 2 know what the other person is doing; that is already an interaction between people and digital sensors that activate screens. Furthermore, when individual 2 is given the ability to respond to individual 1’s message through the display by pressing a button, that furthers the interaction between both objects, as it is a way of confirming that the project effectively works and aligns with the action/reaction relationship within interactivity.
Based on how the final outcome came together, our audience responded in a positive way to the idea of the project, as they agreed it was a fascinating concept for a new way of feeling a loved one’s presence: an interaction that goes beyond a video or phone call, because it is felt and thought of by the participants. Nevertheless, if I had more time I would improve the project by expanding the size of the display, so it is a more pleasant experience for the user. I would also research the code further, so that there could be fewer delays and the message could reach the other cup faster, and I would attempt to add a 5G connection so that the cups could truly work over the internet and wireless communication. From the setbacks I have been able to recognize the real problems that occur when creating a product; alongside these failures, however, came the accomplishments, from which I’ve learned that any idea in mind is possible to achieve through continued effort, research, and consulting help from experts.
This project, for me, meant much more than just creating a prototype from what we learned in class about Arduino. Through the creation of iCups, the real issue we wanted to address is the importance of human connection. With this product, my partner and I were attempting not only to join people through the smallest memories, but also to make them realize that it is often not the “phone call” or “video call” that matters most, as those only happen once in a while, when both parties have time. Instead, the true meaning of connecting with a person is feeling their presence, and knowing that despite any distance or time difference, they will always be with you, even if it’s just through a cup of coffee.
In this week’s recitation, we explored the concept and use of actuators in building machines that produce movement through the Arduino and higher voltages. We did this by using an H-bridge to build a drawing machine powered by a stepper motor.
Materials: The list of materials has been taken from the IMA recitation website as a reference for the components mentioned in the rest of the documentation.
For Steps 1 and 2
1 * 42STH33-0404AC stepper motor
1 * L293D ic chip
1 * power jack
1 * 12 VDC power supply
1 * Arduino kit and its contents
For Step 3
2 * Laser-cut short arms
2 * Laser-cut long arms
1 * Laser-cut motor holder
2 * 3D printed motor coupling
5 * Paper Fasteners
1 * Pen that fits the laser-cut mechanisms
Paper
Step 1:
Initially, I was afraid of how difficult the task might be, and of whether I would damage my computer due to the voltage changes: in past recitations we had only used 5V, but this time, due to the nature of the machine, a 12V power supply was necessary. Nevertheless, building the circuit was quite clear in terms of the connections to the H-bridge. This component was necessary for the stepper motor to be able to run forwards or backwards; in other words, it was used to switch the polarity of the voltage applied to the motor. I decided to color code the wires so that I had fewer difficulties identifying the cables and making sure they were in the right place, and so that I did not cross the voltages with each other and damage the computer or the circuit. In the end, after considering these aspects and running the code, it worked perfectly.
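For reference, the core of the Step 1 code looked roughly like the standard Arduino Stepper library example below. I am assuming here that the H-bridge control pins are wired to pins 8 through 11, as in the stock example; the exact pins in our circuit may have differed.
/******** start code ********/
#include <Stepper.h>

const int stepsPerRevolution = 200;   // full steps per revolution for the 42STH33 motor

// The four H-bridge control pins (assumed 8-11, as in the stock example)
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);

void setup() {
  myStepper.setSpeed(60);   // rotation speed in RPM
}

void loop() {
  myStepper.step(stepsPerRevolution);    // one full turn forward
  delay(500);
  myStepper.step(-stepsPerRevolution);   // one full turn backward
  delay(500);
}
/******** end code ********/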
Step 2:
While completing Step 2 of adding a potentiometer to the circuit in order to control the rotation, I initially found it challenging, as the map() function was still unclear to me, and in order for the motor to move based on input from the potentiometer, the knob needed to be read with analogRead(). However, after overcoming the challenge of mapping the minimum and maximum values of the analog input to those of the stepper motor, I was able to match the movement of the knob with the rotation of the motor, so that it was ready to spin the laser-cut pieces of the drawing machine.
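A short sketch of that mapping step, assuming the potentiometer is on A0 and the same pin layout as above: analogRead() returns 0 to 1023, map() rescales that range to 0 to 200 motor steps, and the motor is stepped only by the difference from its previous position.
/******** start code ********/
#include <Stepper.h>

const int stepsPerRevolution = 200;
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);   // assumed pins, as before

int previous = 0;   // last position the motor was stepped to

void setup() {
  myStepper.setSpeed(30);
}

void loop() {
  int knob = analogRead(A0);                              // 0-1023 from the potentiometer
  int target = map(knob, 0, 1023, 0, stepsPerRevolution); // scale to motor steps
  myStepper.step(target - previous);                      // move only the difference
  previous = target;
}
/******** end code ********/
Stepping by the difference rather than the full target keeps the motor position in sync with the knob without re-running whole revolutions each loop.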
The final step of actually assembling the machine was more challenging than originally expected: despite having all of the pieces together, it was difficult to control the motor with the potentiometer, because it would sometimes rotate with such power that the motor slipped out of the laser-cut motor holder, and the paper fasteners couldn’t always hold the laser-cut arms in place. However, even though my partner and I weren’t able to fully control it, we still managed to create a masterpiece with our drawing machine.
The kinds of machines I am most interested in building are those that give a deeper use to technology and interaction: the kind that make everyone’s lives easier, more comfortable, and more accessible. One example of an idea I’ve always had in mind is the creation of smart glasses that transcribe a person’s words into text, which could help individuals with hearing disabilities understand what others are saying, even when there is no other means of translation, such as sign language. Within this type of portable machine, I would use actuators as the ‘movers’ of energy that actually make this translation possible. That way, those who wear the glasses aren’t overwhelmed with long texts of everything people say to them each day, but instead have control over the sound that the actuator in the glasses receives and translates into text.
This type of object may seem different from machines that are used to move items or in the digital manipulation of art, as it doesn’t create as much of a visually kinetic movement for an audience. Nevertheless, it serves a purpose and still demands a creative process in its fabrication.
Question 2:
The installation that captured my attention in the reading was David Palacios’ artwork Waves (2006). I found it captivating because it involves not only the movement of a rope, but also the movement of people, since the moving sine waves aren’t activated until there is a stimulus from the movement of the viewers. Therefore, in some way it could be said that the audience has a certain control over the oscillations, even if they do not acknowledge it. This is similar to the work I did during the recitation: regardless of the stepper motor being able to move the rest of the drawing machine on its own, it was my partner and I, along with the potentiometers, who truly had some control over what was being illustrated by the machine as a whole. I consider that the artist selected these specific actuators for his project with the idea of giving viewers a much greater kinetic experience of the waves that surround society in more ways than we think. In fact, presenting this motor through the rope creates an audiovisual illusion that allows us to acknowledge waves much more profoundly than those we see in the ocean or in the sine graph of a function. Instead, the installation allows us to perceive them with a sense of life, as if they are moving with us while we are moving with them.
After having read “The Art of Interactive Design” and analyzing it based on my own experience with creating an interactive device, I have come to define interactivity as a dialogue between two agents in which there is an action-response relationship (Crawford 4). This interchange usually occurs between a human being and a computer program; nevertheless, from my perspective, any situation in which there is a reaction to an initial action could be considered interactive.
One of the projects that I encountered in my research which aligned with my definition of interaction was the work of an entire company called vertigo systems, which specializes in creating digital illusions using 3D virtual reality technology. Their product, called “Charlie’s Playground”, includes a combination of interactive games that can be projected or installed on floors or ceilings as entertaining activities in waiting areas or indoor playgrounds. This idea clearly aligned with my definition in that it is a project involving human interaction with technology. Nevertheless, while appreciating how people engaged with it for entertainment, the first thought that came to my mind was to give this idea a new purpose that leaned more towards home child safety, rather than just leisure.
On the other hand, a project I encountered which did not completely align with my definition was a case study of “Custom Building Projections”. This project, developed by a company called Lumo Interactive, highlights 3D objects or specific architectural features through projections on buildings. These digital displays are typically used for art exhibitions, holiday events, and performances. Nevertheless, regardless of being considered interactive, this project does not align with my perception of interactivity, because for the projection to occur there is no necessary interchange between two agents; it is mostly a display. Therefore the action-reaction relationship mentioned in my definition does not apply to this particular idea.
It is along these two ideals of interactivity and non-interactivity that, while gathering ideas for the project itself, we wanted to create an object that didn’t only show interaction between two subjects, but also enhanced security and wellbeing in a futuristic scenario set in 2119. After unifying our research and projects, we came to the final plan of creating “Smart Floors”, an interactive platform that, upon contact from a human being, detects the participant and processes inputs such as height, weight, and other variables. Then, with virtual reality technology, it produces an output that affects the subject physically and psychologically by introducing them into an environment that will make them remain within the area where the ‘smart floor’ is located, depending on the situation. For example, if a baby approaches the floor, Smart Floors detects who it is and creates a distracting, caring, and tranquil environment to prevent the infant from wandering into unsafe areas of the house while their parents are occupied.
This project completely encompassed my definition of interaction, as there is an action-response relationship: one specific object gives a virtual reality response to another subject upon contact. It also relates to both of the researched example projects, as it embraces the idea of virtual reality from “Charlie’s Playground” while not leaving behind the digital display of the second project, and it simultaneously serves a greater purpose than entertainment: ensuring home security through interactivity.
Works Cited:
Crawford, Chris. “What Exactly Is Interactivity?” The Art of Interactive Design, pp. 1–5.
In this week’s recitation, we were assigned to build circuits with sensors and program them on the Arduino board. In my opinion, building the circuits was not overly complex for some of the sensors; however, each one had its own specific code that we had to program for it to actually work, which was the trickier part, and at a certain point we relied on many sample codes.
Circuit: Infrared Distance Sensor
>>Materials:
1 Arduino Board
1 Infrared Distance Sensor
1 Breadboard
Jumper Cables
>>Sample Code:
/******** start code ********/
// connect the GP2D120X output to A1
#define pin A1

void setup() {
  Serial.begin(9600);
  pinMode(pin, INPUT);
}

void loop() {
  uint16_t value = analogRead(pin);
  double distance = get_IR(value); // convert the analog reading to a distance
  Serial.println(value);           // print the raw reading to the serial monitor
  Serial.print(distance);
  Serial.println(" cm");
  Serial.println();
  delay(500);                      // delay 0.5 s
}

// return distance (cm)
double get_IR(uint16_t value) {
  if (value < 16) value = 16;
  return 2076.0 / (value - 11.0);
}
/******** end code ********/
My partner and I created this circuit by connecting the sensor to analog input A1, to 5V power, and to ground. After completing the circuit, we used analog input in the Arduino program and loaded the sample infrared distance sensor code shown above.
Question 1:
In this recitation exercise, we assembled a circuit with an infrared distance sensor, which measures how close or far an object is from it. While building it and seeing its purpose, I thought about its possible pragmatic use in security-related situations, where alarms go off if a person gets too close to an object. For example, high-end jewelry stores could implement this sensor within a case of displayed diamonds, so that if anyone tried to steal them, merely reaching toward the glass case would automatically trigger the alarm, because the hand’s distance is too close. It would be used there because it is a great method of measuring distance and keeping people away from precious objects.
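A minimal sketch of that idea, reusing the distance conversion from the recitation sample code and assuming a buzzer on pin 8; both the pin and the 10 cm threshold are made up for illustration.
/******** start code ********/
#define IR_PIN A1        // infrared distance sensor, as in the recitation circuit
#define BUZZER_PIN 8     // assumed buzzer pin
#define ALARM_CM 10.0    // assumed "too close" threshold in centimeters

void setup() {
  Serial.begin(9600);
  pinMode(IR_PIN, INPUT);
}

// same conversion as the recitation sample code
double get_IR(uint16_t value) {
  if (value < 16) value = 16;
  return 2076.0 / (value - 11.0);
}

void loop() {
  double distance = get_IR(analogRead(IR_PIN));
  if (distance < ALARM_CM) {
    tone(BUZZER_PIN, 1000);   // sound the alarm while a hand is too close
  } else {
    noTone(BUZZER_PIN);
  }
  delay(100);
}
/******** end code ********/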
Question 2:
It is very true that code is quite similar to following a recipe or tutorial, as it contains the ingredients along with the steps needed for a sensor, or anything connected to the Arduino board in this specific case, to actually work. This idea of code as “cooking” was also introduced through the video shown in class, where a computer scientist made a cooking tutorial and explained how it related to coding, demonstrating how coding is a way of giving instructions to the computer so that it actually works. However, the two relate not only through their instructions, but also through the specificity both activities demand. For example, when cooking, you usually have to add a certain amount of each ingredient for the final product to be delicious. The same occurs in coding: you need to write it with a certain order of instructions, the right capitalization, and exact pin numbers. If a line of code has something out of place, or is missing a fundamental character, then the program won’t run. This shows how coding is similar to following a recipe: recipes are specific, and they need to be followed in a certain way for the result to come out in the most optimal manner.
Question 3:
Manovich describes the influence of computers on new media as its starting point. He explains how new media hasn’t just been shaped by computers, but is also created, distributed, and stored within them, showing the inextricable relationship between the two. A similar pattern can be seen in humans, as new social behaviors have started to arise due to computer influence. Each day, with new technological advancements, human life becomes more assisted by computers. One example is how people are less likely to go buy food in person, or go to stores, since they have an “app” that does it for them. Even though this facilitates human lives, it tends to create lazier behaviors and shapes our conduct in a way that less in-person interaction takes place, consequently damaging social interaction because of this “computer presence”.