Gloria’s Final Project Documentation Blog
Project title: FANtabulous
Group members: Gloria Liang, Maggie Wang
Instructor: Gottfried Haider
CONCEPTION AND DESIGN
Our project is an experiential game in which a person interacts with a fan and a projection. We were inspired by a project on Chinese oil-paper umbrellas, in which players could experience the fun of using oil-paper umbrellas with the help of technology. That project is not only fun but also shows everyone the charm of traditional Chinese culture. We decided to follow this idea and build an interactive experience around Chinese fans.
My assumption was that users would make the butterflies move and the flowers float down by fanning the projection. But since there is no sensor that directly measures airflow, I decided to let the movement of the fabric, when someone fans it, drive a flex sensor that controls the flowers and butterflies. Users can also split the stone on the projection with a chopping gesture, so I used a distance sensor to detect the position of the user’s hand and control the position of the stone.
During the user test, many people gave feedback that they didn’t know how to trigger the sensors, so we added some text instructions. At the same time, in order not to reduce the fun of exploration, the text guidance is not written too literally; it only hints at what the user should try. After the text was added, it was indeed easier for users to get started during the presentation.
FABRICATION AND PRODUCTION
In the process of completing this work, I was mainly responsible for designing the patterns on the projection and coding the Arduino and Processing sketches. Maggie was responsible for borrowing the equipment. We connected the circuit and decorated the board together.
Frame setting up
Once we decided on the theme, I started to think about how to set up the whole scene. I felt that a black-and-white landscape-painting style fit well with ancient traditional Chinese fan culture, so I decided to set the whole scene in black and white. Since the pictures I found were in color, I converted them to black and white with the help of Photoshop.
We needed multiple sensors and connections, since we wanted the audience to be able to interact with the projection. I wanted to conceal these cables behind the projection fabric, since they would look unsightly if exposed. As a result, we needed a projection cloth instead of projecting onto the wall. I asked for Andy’s assistance in building a wooden frame of the same size so I could stretch the projection cloth over it and stand it up.
Andy cut the wood strips for us and we glued them together, then waited a night for the glue to dry. (Lots of thanks!!)
Then we used fishing lines to hoist the frame up to the ceiling pole. (Climbing ladders is really fun🤣)
We also took a classmate’s advice and decorated our board with plastic flowers to make it look more beautiful.
The choice of sensors
We designed actions such as fanning to make flowers fall, splitting the rock, and tapping a snowflake to make it snow. At first, it was hard to choose which sensor should detect the wind from the fan. We tried a flex sensor but found it was difficult for the fan to bend it directly, so I came up with the idea of using the shaking of the curtain to drive the flex sensor. I glued one end of the flex sensor to the fabric and fixed the other end to a piece of board, so that the sensor bends easily whenever the fabric moves.
For the split stone, we first thought of using the distance sensor. But at first I assumed the distance sensor could directly sense the distance of a hand, forgetting that there is fabric in front of the stone. Thanks to the professor’s advice, we put the distance sensor on the side of the wooden frame, so there is no obstruction in between.
For the falling snow, we intended to use the flex sensor again, but it could also be triggered by fanning. Then we found that the vibration sensor was sensitive to tapping, so we went with that one. We stuck a snowflake❄️ icon on the screen so that people know where to hit. We also took Professor Eric’s advice and moved it higher so tall people don’t need to bend down to reach it. One flaw is that this sensor sometimes isn’t very sensitive, so you may need to hit the icon several times to make it snow😔.
Coding
For the butterflies, I initially used the random() function so that they would move when someone was fanning. But during the user test, a user pointed out that it would be better if the butterflies responded to the strength of the wind. I took this advice and adjusted my code to image(myMovie, 50, 500+z*5, 80, 80); with z = -125 + arduino_values[0]; (arduino_values[0] is the value read from the flex sensor). So now, the stronger the wind, the higher they fly.
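As a minimal sketch of that mapping (excerpted and slightly simplified from the full code in the appendix; z is a global offset variable there), the butterfly images are simply drawn higher as the flex-sensor value drops:
// arduino_values[0] is the flex-sensor reading; fanning makes it drop below 120
if (arduino_values[0] < 120) {
  z = -125 + arduino_values[0];           // stronger wind -> more negative z
}
image(myMovie, 50, 500 + z*5, 80, 80);    // more negative z -> butterfly drawn higher up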
The stone is made up of two pictures, and when the distance sensor’s reading is between 55 and 70 (a hand near the middle of the stone), the two pictures separate. I also added if (millis()-startTime >= 4000) { y = 0; } so that the pictures move back automatically after 4 seconds.
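In a slightly condensed form (the thresholds are the ones we measured for our setup; the full version in the appendix also checks the previous reading and the sound state), the split logic looks like this:
// arduino_values[1] is the distance-sensor reading; y pushes the two stone halves apart
if (arduino_values[1] > 55 && arduino_values[1] < 70) {   // hand near the middle of the stone
  y = 200;                      // separate the two stone images
  startTime = millis();         // remember when the split happened
}
if (millis() - startTime >= 4000) {
  y = 0;                        // close the stone again after 4 seconds
}
image(photo2, 400 - y, 280, 300, 400);    // left half slides left
image(photo3, 650 + y, 280, 300, 400);    // right half slides right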
After that, I tried to write code for the falling flowers but had no idea how to draw them. Then, while browsing the internet, a tree covered with flowers that could be blown away by a key press caught my eye, so I adapted part of that code. To make it work for our setup, I changed the trigger condition from a key press to arduino_values[0] < 120 (when someone fans the fabric, the flex sensor’s value usually drops below 120).
This is what the tree drawn by the code I referred to looks like.
But the problem was that once all the flowers had been blown off, we needed to restart the whole program before the next user could interact with it. The first solution I thought of was to regenerate the tree every time all the flowers were gone, but Professor Gottfried suggested a better way: let each flower return to its original place once it leaves the screen. With his help, the flowers now “grow back” after they are fanned off, like magic! The key is the PVector originalPos field, which stores each flower’s original position so it can return there once it goes off screen.
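The relevant part of the Leaf class (a slightly condensed excerpt from the appendix) just remembers each flower’s starting position and snaps back to it once the flower has left the screen:
PVector originalPos;    // where this flower started on the tree

void bounds() {
  if (this.pos.y > height || this.pos.x > width || this.pos.y < 0 || this.pos.x < 0) {
    this.vel.mult(0);               // stop all movement
    this.acc.mult(0);
    this.pos = this.originalPos;    // “grow back” at the original spot
    this.dynamic = false;           // stay attached until it is fanned off again
  }
}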
For the snow, there are 200 drops on screen at a time, and after 300 frames the snow stops until someone taps the vibration sensor again. All the snowflakes are ellipses with changing positions. The change in y for each snowflake is random, which means each snowflake falls at its own speed, and with x = random(width); the snowflakes can appear anywhere across the screen. But we found that the snow would keep restarting if someone kept tapping the fabric. We wanted it to ignore taps while it was already snowing, so we asked an LA for help and he wrote:
if (arduino_values[2]>30 && dropState == 1 ) {
dropState = 0;
}
if (arduino_values[2]>30 && dropState == 0) {
dropState = 1;
}
dropState = 0 means no snow, while dropState = 1 means it is snowing. I then tested the range of values the vibration sensor produces when someone taps it, and coded that threshold into Processing.
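Combined with the code above, the snow loop in draw() (excerpted from the appendix) only animates the drops while dropState is 1, and resets itself after 300 frames:
if (dropState == 1) {
  for (int i = 0; i < drops.length; i++) {
    drops[i].display();
    drops[i].update();
  }
  dropFrame = dropFrame + 1;
}
if (dropFrame == 300) {    // the snow has fallen long enough
  dropFrame = 0;
  dropState = 0;           // stop until the next tap
}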
On top of these visual effects, we also added sound effects to the project. Maggie looked for the sounds of leaves falling and stone cracking, and I added them to the code.
To make sure these sounds don’t start again before the previous playback has finished, I added && leafsound.isPlaying() == false and && rock.isPlaying() == false to the if() conditions, so a sound only plays when it is not already playing.
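For example, the falling-leaves sound in the appendix code is gated like this:
if (arduino_values[0] < 120 && leafsound.isPlaying() == false) {
  leafsound.play();    // only (re)start the sound once the previous playback has finished
}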
Since we projected from behind, we needed to mirror everything drawn on the screen. With Professor Gottfried’s help, we used pushMatrix(); and popMatrix(); (together with translate() and scale(-1, 1)) to mirror what is drawn in between. I also took his advice and added a bar on the screen that shows how strong the wind is.
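A minimal sketch of the mirroring idea (in our final code the flip is applied to the whole frame; wrapping it in pushMatrix()/popMatrix() limits the flip to whatever is drawn in between):
pushMatrix();
translate(width, 0);    // move the origin to the right edge
scale(-1, 1);           // flip horizontally so rear-projected content reads correctly
text("Explore the 3 triggers in this picture!", 10, 50);
popMatrix();            // restore normal coordinates for anything drawn afterwards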
To make things clearer for users, I added some text instructions on the screen using the text() and textSize() functions.
CONCLUSIONS:
Our project aims to create an interactive experience around the fan, allowing the audience to experience the charm of fan culture in an immersive way.
I think our project met our expectations. During the interaction, the audience was able to figure out, without our prompting, that they should fan. Some audience members were even able to trigger all the sensors without any explanation. This project is consistent with my definition of interaction in that it involves human exploration and the feedback the project gives in response to human actions.
But there’s plenty of room for improvement. Sometimes the sensors are difficult to trigger, and sometimes they trigger unexpectedly; this is a problem we need to solve. The sensors also sometimes react more slowly than the viewer. So maybe we need to find better sensors, or change the way we use them, to make them more responsive.
From this project, I learned a lot of useful coding knowledge: some of it I googled online, and some of it I asked others for help with. I’ve also been working on my debugging skills. I often find during the improvement process that something isn’t working. Sometimes it’s a loose wire, sometimes it’s a line of code accidentally deleted, and sometimes it’s the program itself. Now I am not as nervous about problems as before, because I believe I can always find a solution in the end.
Not only that, I also learned the importance of listening to feedback. Since I designed this project myself, I know all of its principles, but other people have their own guesses about how it works, and I’ve observed that everyone uses the fan in a different way. At the IMA show, some people told me they thought infrared was being used to detect their movements, and some thought there was a wind-speed sensor. In the user test, many people pointed out that the prompts weren’t clear enough and they didn’t know what to do, while others thought it was better not to explain, to keep the fun of exploration. There was a wide range of ideas, and in listening to them I opened my mind to possibilities I hadn’t imagined before. At the same time, I also experienced the joy and pride of introducing my work to others.
APPENDIX
A complete display
wiring diagram
The code:
Processing
import processing.video.*;
import processing.serial.*;
import processing.sound.*;
PImage photo1;
PImage photo2;
PImage photo3;
Serial serialPort;
SoundFile sound;
SoundFile rock;
SoundFile leafsound;
int NUM_OF_VALUES_FROM_ARDUINO = 3;
int arduino_values[] = new int[NUM_OF_VALUES_FROM_ARDUINO];
float startTime;
int prev_values[] = new int[NUM_OF_VALUES_FROM_ARDUINO];
float dropX, dropY;
float x=0;
int y=0;
int z=0;
int dropCount = 200;
int dropFrame = 0;
Movie myMovie;
color dropColor = color(255);
ArrayList<Branch> branches = new ArrayList<Branch>();
ArrayList<Leaf> leaves = new ArrayList<Leaf>();
int maxLevel = 9;
Drop[] drops = new Drop[dropCount];
int dropState = 0; // 0 not raining 1 raining
void setup() {
size(1200, 1000);
background(0);
rock = new SoundFile(this, "stonespilt.mp3");
leafsound = new SoundFile(this, "leafsound.mp3");
photo1 = loadImage("mountain.png");
photo2 = loadImage("stone1.png");
photo3 = loadImage("stone2.png");
printArray(Serial.list());
serialPort = new Serial(this, "/dev/cu.usbmodem1101", 9600);
myMovie = new Movie(this, "butterfly.mp4");
myMovie.loop();
colorMode(HSB, 100);
generateNewTree();
for (int i = 0; i < drops.length; i++) {
drops[i] = new Drop(random(width), random(-200, -50));
}
}
void draw() {
background(0);
getSerialData();
//pushMatrix();
translate(width, 0);
scale(-1, 1);
if (myMovie.available()) {
myMovie.read();
image(photo1, 0, 100, 1200, 600);
image(photo2, 400-y, 280, 300, 400);
image(photo3, 650+y, 280, 300, 400);
image(myMovie, 50, 500+z*5, 80, 80);
image(myMovie, 150, 500+z*5, 130, 130);
image(myMovie, 100, 400+z*5, 115, 115);
if (arduino_values[0]<120) {
z=-125+arduino_values[0];
}
if (arduino_values[0] < 120 && leafsound.isPlaying() == false) {
leafsound.play();
}
if (arduino_values[1]>55 && prev_values[1] >= 50 && arduino_values[1]<70 &&rock.isPlaying() == false) {
y=200;
rock.play();
startTime=millis();
}
if (millis()-startTime>=4000) {
y=0;
}
for (int i = 0; i < branches.size(); i++) {
Branch branch = branches.get(i);
branch.move();
branch.display();
}
for (int i = leaves.size()-1; i > -1; i--) {
Leaf leaf = leaves.get(i);
leaf.move();
leaf.display();
leaf.destroyIfOutBounds();
}
}
if (arduino_values[0]<120) {
for (Leaf leaf : leaves) {
PVector explosion = new PVector(leaf.pos.x, leaf.pos.y);
explosion.normalize();
explosion.setMag(0.01);
leaf.applyForce(explosion);
leaf.dynamic = true;
}
// copy the current values into the previous values
// so that next time in draw() we have access to them
for (int i=0; i < NUM_OF_VALUES_FROM_ARDUINO; i++) {
prev_values[i] = arduino_values[i];
}
}
if (arduino_values[2]>30 && dropState == 1 ) {
dropState = 0;
}
if (arduino_values[2]>30 && dropState == 0) {
dropState = 1;
}
if ( dropState ==1 ) {
for (int i = 0; i < drops.length; i++) {
drops[i].display();
drops[i].update();
}
dropFrame = dropFrame + 1;
}
if (dropFrame == 300) {
dropFrame = 0;
dropState = 0;
}
//popMatrix();
pushMatrix();
fill(255, 0, 255);
textSize(30);
text("Explore the 3 triggers in this picture!", 10, 50);
rect(200, 800, (138-arduino_values[0])*2, 10);
noFill();
stroke(255, 0, 255);
rect(200, 800, (138-70)*2, 10);
//textSize(20);
//text("Wind Strength", 860, 910);
if (arduino_values[0]<120) {
textSize(20);
//fill(129,99,99);
text("They are flying!", 140, 600);
text("They are falling!", 1000, 840);
} else {
textSize(20);
text("Help the butterflies fly!", 140, 600);
text("Fan off the leaves!", 1000, 840);
}
if (arduino_values[1]>60 && prev_values[1] >= 50 && arduino_values[1]<70) {
//text("Good job!", 500, 450);
} else {
text("Split the stone!", 500+x, 450+x);
x=random(-3, 3);
}
popMatrix();
}
// based on https://blog.csdn.net/weixin_38937890/article/details/95176710 (with modifications)
void generateNewTree() {
branches.clear();
leaves.clear();
float rootLength = 150;
branches.add(new Branch(width/1.2, height, width/1.2, height-rootLength, 0, null));
subDivide(branches.get(0));
}
void subDivide(Branch branch) {
ArrayList<Branch> newBranches = new ArrayList<Branch>();
int newBranchCount = (int)random(1, 4);
switch(newBranchCount) {
case 2:
newBranches.add(branch.newBranch(random(-45.0, -10.0), 0.8));
newBranches.add(branch.newBranch(random(10.0, 45.0), 0.8));
break;
case 3:
newBranches.add(branch.newBranch(random(-45.0, -15.0), 0.7));
newBranches.add(branch.newBranch(random(-10.0, 10.0), 0.8));
newBranches.add(branch.newBranch(random(15.0, 45.0), 0.7));
break;
default:
newBranches.add(branch.newBranch(random(-45.0, 45.0), 0.75));
break;
}
for (Branch newBranch : newBranches) {
branches.add(newBranch);
if (newBranch.level < maxLevel) {
subDivide(newBranch);
} else {
float offset = 5.0;
for (int i = 0; i < 5; i++) {
leaves.add(new Leaf(newBranch.end.x+random(-offset, offset), newBranch.end.y+random(-offset, offset), newBranch));
}
}
}
}
class Branch {
PVector start;
PVector end;
PVector vel = new PVector(0, 0);
PVector acc = new PVector(0, 0);
PVector restPos;
int level;
Branch parent = null;
float restLength;
Branch(float _x1, float _y1, float _x2, float _y2, int _level, Branch _parent) {
this.start = new PVector(_x1, _y1);
this.end = new PVector(_x2, _y2);
this.level = _level;
this.restLength = dist(_x1, _y1, _x2, _y2);
this.restPos = new PVector(_x2, _y2);
this.parent = _parent;
}
void display() {
stroke(10, 30, 20+this.level*4);
strokeWeight(maxLevel-this.level+1);
if (this.parent != null) {
line(this.parent.end.x, this.parent.end.y, this.end.x, this.end.y);
} else {
line(this.start.x, this.start.y, this.end.x, this.end.y);
}
}
Branch newBranch(float angle, float mult) {
PVector direction = new PVector(this.end.x, this.end.y);
direction.sub(this.start);
float branchLength = direction.mag();
float worldAngle = degrees(atan2(direction.x, direction.y))+angle;
direction.x = sin(radians(worldAngle));
direction.y = cos(radians(worldAngle));
direction.normalize();
direction.mult(branchLength*mult);
PVector newEnd = new PVector(this.end.x, this.end.y);
newEnd.add(direction);
return new Branch(this.end.x, this.end.y, newEnd.x, newEnd.y, this.level+1, this);
}
void sim() {
PVector airDrag = new PVector(this.vel.x, this.vel.y);
float dragMagnitude = airDrag.mag();
airDrag.normalize();
airDrag.mult(-1);
airDrag.mult(0.05*dragMagnitude*dragMagnitude);
PVector spring = new PVector(this.end.x, this.end.y);
spring.sub(this.restPos);
float stretchedLength = dist(this.restPos.x, this.restPos.y, this.end.x, this.end.y);
spring.normalize();
float elasticMult = map(this.level, 0, maxLevel, 0.1, 0.2);
spring.mult(-elasticMult*stretchedLength);
}
void move() {
this.sim();
this.vel.mult(0.95);
if (this.vel.mag() < 0.05) {
this.vel.mult(0);
}
this.vel.add(this.acc);
this.end.add(this.vel);
this.acc.mult(0);
}
}
class Leaf {
PVector pos;
PVector originalPos;
PVector vel = new PVector(0, 0);
PVector acc = new PVector(0, 0);
float diameter;
float opacity;
float hue;
float sat;
PVector offset;
boolean dynamic = false;
Branch parent;
Leaf(float _x, float _y, Branch _parent) {
this.pos = new PVector(_x, _y);
this.originalPos = new PVector(_x, _y);
this.parent = _parent;
this.offset = new PVector(_parent.restPos.x-this.pos.x, _parent.restPos.y-this.pos.y);
if (leaves.size() % 5 == 0) {
this.hue = 2;
} else {
this.hue = random(75.0, 95.0);
this.sat = 50;
}
}
void display() {
noStroke();
fill(this.hue, 40, 100, 40);
ellipse(this.pos.x, this.pos.y, 5, 5);
}
void bounds() {
if (! this.dynamic) {
return;
}
float ground = height-this.diameter*0.5;
if (this.pos.y > height || this.pos.x > width || this.pos.y < 0 || this.pos.x < 0) {
this.vel.y = 0;
//this.vel.x *= 0.95;
this.vel.x = 0;
this.acc.x = 0;
this.acc.y = 0;
//this.pos.y = ground;
this.pos = this.originalPos;
this.dynamic = false;
}
}
void applyForce(PVector force) {
this.acc.add(force);
}
void move() {
if (this.dynamic) {
PVector gravity = new PVector(0, 0.005);
this.applyForce(gravity);
this.vel.add(this.acc);
this.pos.add(this.vel);
this.acc.mult(0);
this.bounds();
} else {
this.pos.x = this.parent.end.x+this.offset.x;
this.pos.y = this.parent.end.y+this.offset.y;
}
}
void destroyIfOutBounds() {
if (this.dynamic) {
if (this.pos.x < 0 || this.pos.x > width) {
leaves.remove(this);
}
}
}
}
float distSquared(float x1, float y1, float x2, float y2) {
return (x2-x1)*(x2-x1) + (y2-y1)*(y2-y1);
}
class Drop {
float x, y, ySpeed;
Drop(float x_, float y_) {
x = x_;
y = y_;
ySpeed = random(5, 15);
}
void display() {
noStroke();
fill(dropColor);
ellipse(x, y, 5, 10);
}
void update() {
y += ySpeed;
// If the raindrop reaches the bottom of the screen, reset its position
if (y > height) {
y = random(-200, -50);
x = random(width);
}
}
}
void getSerialData() {
while (serialPort.available() > 0) {
String in = serialPort.readStringUntil( 10 ); // 10 = '\n', linefeed in ASCII
if (in != null) {
print("From Arduino: " + in);
String[] serialInArray = split(trim(in), ",");
if (serialInArray.length == NUM_OF_VALUES_FROM_ARDUINO) {
for (int i=0; i<serialInArray.length; i++) {
arduino_values[i] = int(serialInArray[i]);
}
}
}
}
}
Acknowledgments: The Processing code makes use of Cherry blossom fractals by Jason Labbe, retrieved from https://blog.csdn.net/weixin_38937890/article/details/95176710 and used with modifications.