Recitation 4: Drawing Machines

Step 1: 

Step 2:

#include <Stepper.h>

// change this to the number of steps on your motor
#define STEPS 200

// create an instance of the stepper class, specifying
// the number of steps of the motor and the pins it's
// attached to
Stepper stepper(STEPS, 8, 9, 10, 11);

// the previous reading from the analog input
int previous = 0;

void setup() {
  // set the speed of the motor to 30 RPMs
  stepper.setSpeed(30);
}

void loop() {
  // get the sensor value
  int val = analogRead(0);
  val = map(val, 0, 1023, 0, 127);
  // move a number of steps equal to the change in the
  // sensor reading
  stepper.step(val - previous);

  // remember the previous value of the sensor
  previous = val;
}

Step 3:


Question 1: What kind of machines would you be interested in building? Add a reflection about the use of actuators, the digital manipulation of art, and the creative process to your blog post.

Based on the drawing machine, I would like to build a carving machine. The idea is that the variable resistors would only need very small changes, and the two motors would then drive the attached arms to carve a much larger object. I would use stepper motors, longer arms, pins to fasten the arms together, and a laser cutter or a specially made graver as the carving tool.

Question 2: Choose an art installation mentioned in the reading ART + Science NOW, Stephen Wilson (Kinetics chapter). Post your thoughts about it and make a comparison with the work you did during this recitation. How do you think that the artist selected those specific actuators for his project?

I’m very interested in Feral Robot Dogs, designed by Natalie Jeremijenko. This robot dog is environmentally friendly: like a detective, it can sense pollution and then warn people by jumping and barking. Both this invention and the drawing machine my partner and I built are interactive. We varied the resistance of our variable resistor as the input, and the Arduino processed that data and mapped it to the stepper motor as the output. The robot dog’s chemical sensors detect specific substances in the environment; it then processes those readings and spins its motors to make the dog move and make sound. The two processes differ in their details, but they follow the same input-process-output pattern. For this project, the artist needed specialized sensors such as chemical, smoke, and humidity sensors.

Final Blog for the Group Project

Summary of Our Group Project

Our group invented an artifact that can act as a human companion. It grows as time passes, like a living creature on Earth.

Before we settled on the invention we wanted to perform, we had a group meeting on WeChat. After everyone explained their research blog post, we picked Vivianne’s robot idea, my idea of virtual figures giving psychological care to human beings, and Hazel’s idea of a growing mechanical pet, and we combined them into this artifact.

First, it takes the form of an embryo, and we used a light bulb to represent this early period of its life. Then come its teenage years, when it is like a small robot; we used a puppet to perform this stage. Finally, it becomes a giant robot, as large as a normal human being. It grows up alongside its owner, mostly acts like a human, and in the end even feels like a human.

This last part is somewhat similar to many science fiction movies. Someday in the future, when AI is highly developed, humans may no longer be able to control robots the way they want. So we let the robot’s inner life shift from taking care of its owner, because it is coded to have that function, to having its own emotions, such as happiness, anger, and sadness.

In my research phase, I defined interaction as a process of input, analysis, and output, and I think we showed this efficiently. When Hazel took the light bulb home, she spoke with it, and the light bulb asked her to give it a name. Their conversation was interactive: Hazel provided information, and it analyzed her words to give a response. Later in our performance, when Hazel got hurt and grew old, it offered physical help and comforted her with words. So the process of input, thinking, and giving feedback is interactive.

I think we fulfilled the tasks very well. We didn’t use any electronic devices, and we combined our made-up story with the artifact. One thing we could improve is adding a monologue by the robot to give more information about its gradual mental change.


Reflection on Other Group’s Performance

Group 6 gave us a creative and interesting act. They performed a brain-information transformer that can transfer memories to a robot, although the robot is just a memory-storing object that cannot feel and act like a human. I don’t see any connection with the fictional stories, but it’s still a good artifact. It shows interaction between human and robot, with the memory transformer as the bridge connecting them. I’m very fond of the idea, since humans have always had that “living forever” fantasy, and this interactive device is a scientific realization of it. It was easy to understand when the human and the robot did the same thing in the same scene while the results were different; they made this comparison to show the robot’s lack of feeling. On a personal level, I think it would be better if they showed more interaction. What would happen if the robot talked to a human, or to a robot like itself?

Recitation 3: Sensors

Answers to the Questions

Question 1: What did you intend to assemble in the recitation exercise? If your sensor/actuator combination were to be used for pragmatic purposes, who would use it, why would they use it, and how could it be used?

Joystick Module:

 

I intended to assemble an interactive device around the Joystick Module that reports x-axis and y-axis readings to locate the stick as I move it. For a pragmatic purpose, if the Joystick Module were connected to an electric wheelchair, it could be used to steer it: the data from the stick’s movement would be sent to a controller and then used to turn the wheels. Disabled people or the elderly could easily use the stick to move without having to roll the wheels with a lot of strength.

void setup()
{
    // open the serial port so the readings can be monitored
    Serial.begin(9600);
}

void loop()
{
    // read both joystick axes (0-1023 on each analog input)
    int sensorValue1 = analogRead(A0);
    int sensorValue2 = analogRead(A1);

    Serial.print("The X and Y coordinates are: ");
    Serial.print(sensorValue1, DEC);
    Serial.print(",");
    Serial.println(sensorValue2, DEC);
    Serial.println(" ");
    delay(200);
}

Ultrasonic Ranger

I hoped to build a circuit that shows the distance between an object and this sensor. In real life, I think it could be installed on a street and used to measure the speed of vehicles for traffic safety: since the ultrasonic ranger reports the distance, it is easy to compute the velocity if we also know the elapsed time between readings.

// this constant won't change. It's the pin number of the sensor's output:
const int pingPin = 7;

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
}

void loop() {
  // establish variables for duration of the ping, and the distance result
  // in inches and centimeters:
  long duration, inches, cm;

  // The PING))) is triggered by a HIGH pulse of 2 or more microseconds.
  // Give a short LOW pulse beforehand to ensure a clean HIGH pulse:
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);

  // The same pin is used to read the signal from the PING))): a HIGH pulse
  // whose duration is the time (in microseconds) from the sending of the ping
  // to the reception of its echo off of an object.
  pinMode(pingPin, INPUT);
  duration = pulseIn(pingPin, HIGH);

  // convert the time into a distance
  inches = microsecondsToInches(duration);
  cm = microsecondsToCentimeters(duration);

  Serial.print(inches);
  Serial.print("in, ");
  Serial.print(cm);
  Serial.print("cm");
  Serial.println();

  delay(100);
}

long microsecondsToInches(long microseconds) {
  // According to Parallax's datasheet for the PING))), there are 73.746
  // microseconds per inch (i.e. sound travels at 1130 feet per second).
  // This gives the distance travelled by the ping, outbound and return,
  // so we divide by 2 to get the distance of the obstacle.
  // See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
  return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds) {
  // The speed of sound is 340 m/s or 29 microseconds per centimeter.
  // The ping travels out and back, so to find the distance of the object we
  // take half of the distance travelled.
  return microseconds / 29 / 2;
}

 

Question 2: Code is often compared to following a recipe or tutorial.  Why do you think that is?

On a personal level, I think that for a computer, code is like a tutorial that instructs it to complete a task step by step, a way for us to tell it what to do. For example, in our previous seminar class we made an LED “blink,” or rather, turn on for one second and then off for one second, repeatedly. The code makes the computer first set a digital pin as an output, then put a high voltage on the LED, rest, put a low voltage, and rest again. The computer does what the “blink” code says and makes the LED flash repeatedly.

Question 3: In Language of New Media, Manovich describes the influence of computers on new media. In what ways do you believe the computer influences our human behaviors?

Computers mainly play the role of “helpers.” First, they let us go out less, since we can get things done remotely by computer, especially with the Internet, which is really convenient. Second, with the keyboard, we type rather than write by hand. Third, computers brought about many other things, such as the Internet and AI, that boost economic growth, military development, and more. Those are the three influences I can think of now; there are certainly many more.