Final Project

The Shape of You

Patricia Troncoso Riveira,  Professor Margaret Minsky

CONCEPTION AND DESIGN:

The concept of the final project is a sculpture/body that represents human interactions. My partner and I wanted to start by improving on our midterm project, which we were told was not interactive enough. Originally, the final was going to be a sculpture with different sensors that each triggered a different reaction; we planned to add a bend, touch, and distance sensor. For example, if the touch sensor was activated, the program would show one visual, but if the distance sensor was activated, it would show a different one. When we presented the idea to our instructor, she raised reasonable concerns about the ergonomics of the sculpture and about the bend sensor. As the building process started, the structure changed into a mannequin, since we were not sure how to build the original sculpture. Once the final became a mannequin with sensors, the concept slowly shifted toward a representation of human feelings, especially affection. User testing helped give the project a clear purpose, as all the users assumed it was about how humans deal with affection. After user testing, we decided to focus fully on that feeling of affection and programmed the responses to resonate with it. With those changes, the concept became more concrete, and the project was better understood by the users.

FABRICATION AND PRODUCTION:

Initially, our idea was to make a sculpture and use touch, distance, and bend sensors, with each sensor provoking a different response. There was no concrete design for the sculpture. Because each sensor was supposed to evoke a different reaction, our professor was worried about how the bend sensor would work: when you bend something, it normally stays bent, so what is the user supposed to do after bending the sculpture, bend it back? Another concern was that few bendable materials make a stable structure, and many of them tend to break after being bent several times.

When the fabrication process started, we decided to change the sculpture to a mannequin, as that would save us a lot of time figuring out the materials and form. We also decided to forgo the bend sensors, as there were too many concerns about them; instead, we would use a distance sensor and more touch sensors in different parts of the mannequin, simulating human touch.

For the distance sensor, we made a program that simulates a heartbeat: the closer you are to the mannequin, the more the circles on the computer screen compress and release. This represents the feeling of being hugged. The distance sensor did not give us any issues; we presented it at the prototype presentation, where it worked well, so we only had to make a few tweaks to the code.
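The compress-and-release effect in the Processing sketch (see the appendix) boils down to a simple growth rule: two circle sizes grow geometrically and snap back to 1 once they get large. Here is that update rule extracted into plain C++ for clarity; `pulseStep` is an illustrative helper name, not code from our actual sketch, though the 1.5x growth factor and the threshold of 20 are the values we used:

```cpp
#include <utility>

// One step of the heartbeat growth: the two circle sizes grow geometrically
// (each new term is 1.5x the term before it) and snap back to 1 once either
// term reaches 20, producing the compress-and-release pulse.
std::pair<float, float> pulseStep(float first, float second) {
  float next = first * 1.5f;
  first = second;
  second = next;
  if (first >= 20.0f || second >= 20.0f) {
    first = 1.0f;  // snap back: the "release" of the heartbeat
    second = 1.0f;
  }
  return {first, second};
}
```

Calling this once per frame makes the grid of circles swell and collapse rhythmically, which is what reads as a heartbeat from a distance.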

We were not sure how the touch sensors worked, the equipment room did not have many, and with the time constraints we settled on using one touch sensor. We had to look through many tutorials to understand how it worked. The sensor was a Grove I2C touch sensor with four touch feelers. When we saw the feelers, we decided to give the mannequin arms and install the sensor in a hand to simulate touching hands and the feelings behind it. To use the sensor, we took inspiration from code on the website electropeak.com that detects which of the touch feelers is being activated. After modifying the code and connecting it to Processing, we designed a sketch that, when activated, shows hearts, implying the nervous excitement when two people join hands. The problem was that the sensor was not very sensitive, so sometimes the feelers did not register a touch.
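The key trick in the electropeak-style code is that the sensor reports all of its pads at once in a single bitmask, with one bit per feeler. A small standalone sketch of how a pad state can be read out of such a mask (these helper functions are illustrative, not taken from our Arduino sketch):

```cpp
#include <cstdint>

// The Grove I2C touch sensor reports its pads as one bitmask,
// with bit i set while feeler i is being touched.
bool padTouched(uint8_t mask, int pad) {
  return (mask >> pad) & 1;
}

// True if any of the four feelers on the hand is active.
bool anyPadTouched(uint8_t mask) {
  return (mask & 0x0F) != 0;
}
```

Checking individual bits like this is what lets one sensor distinguish which finger of the mannequin's hand is being held.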

Another problem we had was that when one Arduino was connected to both the distance and touch sensors, neither sensor would work. After many attempts, and eventually asking a professor, we figured out that each sensor needed 5V individually, so one Arduino connected to one computer did not provide enough power for both. There were two solutions: use a 5V power supply and merge the code for both sensors, or use two Arduinos and therefore two computers. We ended up using two computers, as merging the code was causing many problems and we were under time constraints.

Another great issue during the fabrication process was a short circuit we were not aware of while cabling and assembling the sensors and the mannequin. During user testing, the Arduino would suddenly overheat, so the computer would not detect it and therefore would not run the code. To find the problem, we had to disassemble the mannequin and go cable by cable looking for any contact or short circuit. We discovered that there was indeed a short circuit around the touch sensor's SCL, SDA, VCC, and GND pins. To prevent this from happening again, we created plywood dividers and glued them in place so they wouldn't fall. After this, the sensors did not work, and we realized it was because we had added too much hot glue; after removing part of the glue, the sensors finally worked.

CONCLUSIONS:

The goal of the final project was to create a sculpture that produces visuals depending on which sensors are activated, simulating human interactions, especially those involving romantic feelings. In the end, we achieved our main goal. The audience's reactions during user testing exceeded our expectations compared to our midterm project. My definition of interaction is that both the object and the user have to reciprocate, and our project did: the user needed to interact with the project so the sensors could activate, and the sensors needed to be activated by the user to produce the visuals. If we had more time, we could have successfully merged the code and perhaps used a large monitor to better display the visuals. We could also have added more sensors, such as vibration and bend sensors, to diversify the user-sculpture interactions and explore a wider range of feelings and emotions. Another thing we could have done is create representations beyond Processing visuals, for example sound and NeoPixels, which would have elevated our project.

We learned a lot throughout this project, especially patience and thinking outside the box. We ran into several problems and had to be very creative with our solutions. After research and guidance from the professors, we would test many methods we normally wouldn't think of to solve problems. When the short circuit happened with the sensors, we initially thought it was a programming problem and that our Arduinos were getting fried because of the code, but after asking for help, the possibility of it being a simple short circuit or contact between two cables was raised. We ended up having to manually wrap all the exposed copper ends and make sure no cables were touching. This was an arduous task that took time, but it taught us that not all problems derive from the programming side; a simple contact between two cables can be the triggering factor. This final project was very hard and patience-testing, but I am very thankful, as I have learned new skills that will help me greatly in the future.

 APPENDIX

Final project
Cabling of final project.
Working on touch sensors.
Fixing the touch sensors.

User testing of final project.

 

Touch Sensors:

Diagram of the Arduino circuit connections.

Arduino: 

#include <Wire.h>             // I2C library
#include <i2c_touch_sensor.h> // Grove I2C touch sensor library
#include <MPR121.h>
#include "SerialRecord.h"

SerialRecord writer(1);

// initialize the Grove I2C touch sensor
i2ctouchsensor touchsensor;

long previousMillis = 0;
long interval = 100; // poll the sensor every 100 ms

void setup() {
  Serial.begin(9600); // for debugging
  Serial.print("begin to init");
  Wire.begin();             // needed by the GroveMultiTouch lib
  touchsensor.initialize(); // initialize the feelers
}

void loop() {
  unsigned long currentMillis = millis();
  if (currentMillis - previousMillis > interval) {
    previousMillis = currentMillis;
    touchsensor.getTouchState();
  }

  // touchsensor.touched is a bitmask with one bit per pad;
  // report 1 to Processing if any pad is currently touched
  boolean anyTouched = false;
  for (int i = 0; i < 12; i++) {
    if (touchsensor.touched & (1 << i)) {
      anyTouched = true;
    }
  }
  writer[0] = anyTouched ? 1 : 0;
  writer.send();
  delay(200);
}

Processing: 

import processing.serial.*;
import osteele.processing.SerialRecord.*;
 
Serial serialPort;
SerialRecord serialRecord;
int touch1, touch2;

int prevVal = 0;

void setup() {
  fullScreen();
  background(0);
  smooth();

  // Find and connect to the Arduino board
  String serialPortName = SerialUtils.findArduinoPort();
  serialPort = new Serial(this, serialPortName, 9600);
  serialRecord = new SerialRecord(this, serialPort, 1);
}

void draw() {
  // Read data from Arduino
  serialRecord.read();

  int val0 = serialRecord.values[0];

  if (val0 == 1) {
      float a = random (width);
      float b = random (height);
      float r = random (100, 255);
      smooth();
      noStroke();
      fill(r, 0, 0);
      beginShape();
      vertex(a+50, b+15);
      bezierVertex(a+50, b-5, a+90, b+5, a+50, b+40);
      vertex(a+50, b+15);
      bezierVertex(a+50, b-5, a+10, b+5, a+50, b+40);
      endShape();
  } 
  else {
    background(0);
  }
}

Distance Sensors:

Diagram of the Arduino circuit connections. Done with tinkercad.com

 

Arduino:

#include "SerialRecord.h"
#include <NewPing.h>

#define TRIG_PIN 3
#define ECHO_PIN 4
#define MAX_DISTANCE 200

NewPing sonar1(TRIG_PIN, ECHO_PIN, MAX_DISTANCE);

SerialRecord writer(1);

int val0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  sensor();
  val0 = sonar1.ping_cm();
  writer[0] = val0;
  writer.send();
  delay(10);
}

// Debug print of the raw ping distance
void sensor() {
  delay(50); // wait 50 ms between pings (about 20 pings/sec); 29 ms is the shortest safe delay
  Serial.print("Ping1: ");
  Serial.print(sonar1.ping_cm()); // send ping, get distance in cm (0 = outside set distance range)
  Serial.println("cm");
}

Processing:

//libraries
import processing.serial.*;
import processing.sound.*;
import osteele.processing.SerialRecord.*;

//Arduino stuff
Serial serialPort;
SerialRecord serialRecord;
SoundFile sound1;

//terms of the growth sequence that drives the pulse
float firstTerm = 1;
float secondTerm = 1;

void setup() {
  fullScreen();
  background(0);

  //Arduino
  String serialPortName = SerialUtils.findArduinoPort();
  serialPort = new Serial(this, serialPortName, 9600);
  serialRecord = new SerialRecord(this, serialPort, 1);
}

void draw() {
  serialRecord.read();
  int value0 = serialRecord.values[0];

  if (value0 <= 10) {
    drawBalls();
  }
 
}

void drawBalls() {
  for (int i = 1; i < width; i = i + 20) {
    for (int j = 1; j < height; j = j + 20) {
      fill(0);
      noStroke();
      ellipse(i, j, firstTerm * 1.5, firstTerm * 1.5);
      fill(random(260, 280), 90, 50);
      stroke(0);
      strokeWeight(random(1, 5));
      ellipse(i, j, firstTerm, firstTerm);
    }
  }
 
  float nextTerm = firstTerm * 1.5;
  firstTerm = secondTerm;
  secondTerm= nextTerm;
  if (firstTerm >= 20 || secondTerm >= 20) {
    firstTerm = 1;
    secondTerm = 1;
  }
}


void drawBack() {
  background(0);
}

POTENTIAL IDEAS:

We wanted to add more elements, but due to time constraints and mechanical problems we were not able to; we had made the designs but could not execute them:

3D PRINTED HEART: We wanted to 3D print a heart to place where the distance sensor is. We would put two holes in it, the size of the sensors, so that it would still work.

Diagram of the 3D printed heart. Done with tinkercad.com

LASER CUT BOX: Our initial idea was to laser-cut a box to hide the cabling. It was supposed to fit like a backpack. The laser cutter was not working at the time, so we did not execute the plan. The box would have had its top side open so that we could move cables if needed.

Diagram of the box. Done with cuttle.xyz

 

This is the website where we got the inspiration for our touch sensor code: 

https://electropeak.com/learn/interfacing-grove-12-key-multi-touch-sensor-with-arduino/ 

Midterm

RACHATA! – PATRICIA TRONCOSO RIVEIRA – MARGARET MINSKY

CONTEXT AND SIGNIFICANCE

The previous group project does not really impact or inspire my midterm project, as we had a rough idea of what we wanted to do before the group project. My group project was based on a fictitious story, so it does not really correlate. The artifact we made did not really inspire me, but the research did give me an understanding of interaction: an object that interacts with a person and vice versa. Our project came up after learning about servo motors; we thought of the idea for a laugh, but kept adding more and more elements to the idea of a dancing robot. After listening to some bachata music, we decided to make our robot dance to this genre. I do not believe that our project is that unique; it is a robot that dances. The intention behind the project is my and my partner's different backgrounds and how cultural elements such as bachata connect us. This robot was made as an homage to our cultural connection despite our different ethnicities, and also for entertainment.

CONCEPTION AND DESIGN

At the beginning of the midterm project, we were not sure how to make it interactive; we had some ideas, but we had not settled on anything. We knew that we wanted our robot to have legs, and we settled on using servo motors. We used four servos: two for the feet and two for the thighs. We also made a body for the robot using an Arduino Uno and cardboard. We used cardboard because it is a very accessible material and strong enough for what we needed. We glued the thigh servos to the cardboard base, and we glued the feet servos to the cardboard 'shoes' we had made. The wires were then attached to the Arduino, which was glued to the body. We realized that we needed several GND and 5V connections, so we added a proto shield on top of the Arduino to provide the necessary pins. We had the most problems with balance: for the majority of the project, the robot could not stand on its own, and the fact that it was connected to the computer by a cable worsened its balance. We bought weights and tested several places to glue them as counterweights, which worked the majority of the time. To make the project interactive, we ended up adding an LED that turns on and off every 10 seconds to tell the user when it is their turn to dance with the robot.
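The 10-second LED cue is essentially a function of elapsed time. A standalone sketch of that logic, assuming the LED starts in its on phase (`ledOn` is an illustrative helper, not the exact code we ran on the Arduino):

```cpp
// LED cue as a pure function of elapsed milliseconds:
// on for 10 seconds, then off for 10 seconds, repeating.
// Assumes the cue starts in the "on" phase at power-up.
bool ledOn(unsigned long ms) {
  return (ms / 10000UL) % 2UL == 0UL;
}
```

On the Arduino this maps directly onto `digitalWrite(13, ledOn(millis()))`, which avoids blocking the servo loop with `delay()` calls.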

Schematic view of the Arduino connections. Done with tinkercad.com
Diagram of the Arduino circuit connections. Done with tinkercad.com

FABRICATION AND PRODUCTION

The most significant and difficult part of the project was manipulating the servos to work the way we wanted them to work. The code is quite straightforward, but the servos would act differently than we expected. We had to figure out which angle we wanted to turn the servos, and we also had to consider that the thigh servos were glued to the feet servos, so we had to make sure that the positioning did not disrupt the balance. It consisted of playing around with the angle and positioning of the servos. 

By the time user testing came around, the robot was not really interactive; the most it could do was a few stomps and slides. We had been so fixated on making the servos do what we wanted that we did not have time to think about the interactivity. Despite that, the reviews were quite positive. People thought our project was very entertaining and cute, and they also gave us feedback and ideas for making it interactive. In fact, the interactivity of our project was suggested by one of the testers. We decided on the LED that would tell the user when to dance.

After user testing, we focused on refining the positioning of the servos and on the LED. Initially, we were able to make the robot dance in one direction, but after asking for help, we made it dance in the desired way. One of the ways we tried to make the robot dance in the other direction, was by creating a separate loop in which we reversed the code, but that resulted in the robot standing on its toes, like ‘tiptoeing’. Once we had made it dance like we wanted it to, we focused on the LED. We stuck a mini breadboard into the proto shield and connected an LED and a resistor, and then connected it to the Arduino.

Here are some photos and videos of the building process:

CONCLUSIONS

The goal of this project was to create a robot that would dance bachata with the user. The results do align with my definition of interaction, because the robot interacts with the user. Where it falls short is that the user cannot have much direct influence on the robot: the user can dance with it, but cannot change the way it works or dances without directly changing the code. The interactions with the robot were overall positive; some people understood the point of the robot, and others needed some explanation of what to do with it. If we had more time, we would probably work on giving the robot more dance moves and on its interactivity. We could make some kind of monitor that informs the user better, or make it run on batteries instead of connecting it to the computer. I have learned that many things can go wrong and that a little bit of help can do wonders. I am really proud of what my partner and I made.

User Testing:

ANNEX

Final code used:

#include <Servo.h>

Servo rightfoot;
Servo rightthigh;
Servo leftfoot;
Servo leftthigh;

int pos;
long prevMillis;

void setup() {
  pinMode(13, OUTPUT);
  rightfoot.attach(9);
  rightthigh.attach(5);
  leftfoot.attach(3);
  leftthigh.attach(11);
  leftfoot.write(10);
  leftthigh.write(90);
  rightthigh.write(100);
  rightfoot.write(180);
  Serial.begin(9600);
}

//rightfoot 100
//leftfoot 70

void loop() {
  Serial.println(prevMillis);
  if (millis() > prevMillis + 10000 && millis() < prevMillis + 20000) {
    Serial.println("millis");
    prevMillis = millis();
  }
}
 

 

Group research project

The artifact we prototyped records and replays your dreams. The idea for my group's project grew out of the idea I presented for Ursula K. Le Guin's "The Ones Who Walk Away from Omelas". The project is consistent with the established form of interaction. I had previously defined interaction as "different objects, which influence and modify each other, taking into account the situation and the surrounding circumstances, making interaction a subjective experience." The two objects, in this case the human and the artifact, can be seen to influence and modify each other, since at the end of the day what the artifact is broadcasting is that individual's dreams. This mutual interaction makes the experience subjective, since a person's experience with the artifact will change depending on many factors; everyone dreams differently, so the experience is also customized to each person.

In my opinion, the artifact accomplishes the function as determined by the group. From an aesthetic point of view, you can't see much that characterizes it, and you can't demonstrate what it does, since it is a fictitious invention. We decided to make the artifact based on my idea because, after presenting all our ideas, we determined that my concept would be the most feasible and interesting to plan within the time we had available. The chosen idea also gave us a chance to make an interesting and easy-to-understand performance for the viewer.

In terms of group work my contribution to the project has been immense as I was in charge of bringing the whole group together and the final idea came from my Read phase.

The group work process started when I sent an email and we organized a WeChat group so we could discuss ideas. We met twice: the first time was to discuss the project itself and plan what artifact we were going to make while the second time was to build the artifact and plan the performance we were going to give. Certain members of the group were in charge of building the artifact while the other half of us wrote and planned the performance.

For this part, I will analyze the performance given by Group 6. Their artifact consisted of a helmet that when placed on an individual’s head, would remove the feelings of that person wearing the helmet. This clearly comes out of the story of “The Plague”. The function of the idea is to remove the person’s emotions. I think it fits the guidelines of the project as it is an idea that does not exist and is derived from the stories we had to read during the “Read” phase of this project. The artifact seems to be well-designed.

The performance helped a great deal in understanding the purpose of this artifact; they were able to demonstrate the function of the artifact and its context very easily. One thing they could improve is that the physical appearance of the helmet does not correspond with their description of the idea, as it gave the impression that it had more uses than it really had.

Script (click here): Our group improvised a bit and went off script; originally, the plan had been for only two people to put on the helmet, but during the performance it ended up being too short.