“Don’t Forget” – An Exploration in Liminal Spaces and Nostalgia

Final Project Documentation – Interaction Lab (S24 Andrew Garcia)

Conception and Design

The earliest ideas for this project were very different. I first thought about things I wanted to do, was interested in, or wanted to convey. I was inspired by a friend's midterm project, which was based on the concept of an escape room and incorporated a story into the gameplay. My first idea was similar, since I wanted to make something with a mystery theme: a detective game where the player had to complete multiple puzzles to "solve" a mystery, interacting with both hardware and software components. I lacked inspiration for my second idea, which was just a musical-instrument-like project, except instead of using tin cans or similar objects as the keys or buttons, I wanted to use tiny stuffed animals and flex sensors to give a wider range of control. It was like a combination of a Coke-can piano and the Nike shoe videos I had seen before. 

The third idea featured a concept I was personally interested in: finding individuality. Although this theme is often used in stories, movies, and similar media, I didn't think it would be a tired concept if I incorporated my own ideas into a representation of it. I based the third idea on trying to make the player realize the "objective" of individuality by letting them explore on their own. By showing various sounds and visuals that the player would believe they had to recreate exactly, I would first lead them to think that the objective of the game was to copy the examples all the way through. But as the stages progressed, my intention was to make them harder to copy and to somehow encourage the player to instead freely create their own drawings and melodies, whether by adding on to the original or making something completely unique. 

[insert sketch of 3 ideas]

I ended up choosing this last idea, since it had the deepest “story.” However, the next step in the creative process proved difficult. 

I had ideas for this early concept of my project but struggled with how to implement it. I didn't think my initial plan for the experience was creative or "outside the box" enough. I wanted a physical implementation that would match the depth of the concept but also convey it clearly to the players. I was thinking of about 3-5 stages, each with their own controls representing something different, while a simple shape, image, or tune was displayed in Processing for the players to copy. This later changed to 5 stages that each represented a major life stage, with controls taking a form representative of that stage, for example a hanging baby mobile for the baby stage. As with any artistic project, it's difficult to make audiences understand its complexity to the extent you want while also capturing and engaging their attention (overly complex concepts become niche). At the same time, I needed to plan something within a realistic time frame and my own scope of ability; if I had started with something that truly matched the complexity of the concept, I wouldn't have been able to finish. At Andy's suggestion, I reduced the number of stages in my game to just 3, with a more "universal" controller that could be used across all stages. Had I created a different controller for each stage, it would've almost been like making 3 separate projects. 

Andy introduced me to the IMU sensor, which combines an accelerometer and a gyroscope, sensing both acceleration and rotation. Inspired by the flexibility of the sensor and how it could provide 6 different variables, I thought of making a ball controller with the IMU sensor inside. Its function could change for each stage, for example controlling a moving shape in one and the volume and pitch of a tune in another. This meshed well with my intention to use the same controller for different purposes. 


Setup of the IMU sensor according to the HowToMechatronics tutorial 

I came up with ideas for three basic stages: 

Stage 1: Mouse cursor, which draws lines (image is saved after 30 seconds) 
Stage 2: How the player shakes the ball affects the notes being played
Stage 3: Shapes (circles, squares) painting the camera image, which gets locked after 30 seconds

Looking back at my design decisions, I think that both here and while coming up with ideas for the proposal, I was still being influenced by my midterm project by thinking in "threes." My midterm project followed a structure of three basic steps plus a result, which was similar to this idea. Additionally, at this point, the concept for my project shifted again, to involve memories. I settled on the idea that inside the game, a person was slowly forgetting several important memories, and it was up to the player to prevent this from happening. 

In keeping with this theme, I chose background images for each stage that looked like liminal spaces and/or had the feeling of hazy, nostalgic memories. There is a category of images that specifically evokes in the viewer feelings of familiarity or of their childhood. The images I found were from a Twitter profile called "recovered_file," which posts such images as if they had been "recovered" from an old device, with the feeling of "Oh, I've seen this image before — I took it six years ago when I did such-and-such." 

Some moodboard photos: 

 

For example, one of the images I intended to use had the text "promise you won't forget me, ever." This was meant to suggest that while the player is led to believe they're saving someone else's memories, it's actually their own memories they're slowly recovering as they play. In the end, the screen shows a camera feed of their own face, revealing that the player is actually "finding" themselves again. The line "promise you won't forget me, ever" means that the player shouldn't forget themselves or who they are as an individual. 

Additionally, I knew that the atmosphere of the experience depended on more than just the visuals. I searched for background music on a website called Sample Focus, which hosts royalty-free sound clips recorded and uploaded by users. The start screen uses TV static noise to match the animations, and I chose vibey, synth-like sounds for Stages 1 and 3 to reinforce the hazy feeling of those stages. I play the cello and know how melancholic and thoughtful it can sound with its deep timbre, so I found simple cello clips to play during Stage 2.

By the day of user testing on 4/26, I had made the controller using laser cutting and put the IMU sensor inside. I had also completed one of the stages, which involved a live camera feed that changed the pixels into circles or squares based on mapped variables from the IMU sensor. 


Putting the controller together 

These are some of the notes I took based on user testing feedback: 

  • Show the story more clearly
    • Include start or transition screens but not explicit instructions
  • Maybe record a video at first and replay this video throughout the stages instead of using a live camera feed
  • 3D print a bearing frame to secure circuit inside the box
  • For the camera feed, even after the frame clears up, have it turn blurry again after some time? 
  • Use the IMU sensor’s other variables in a different way
  • Have only the camera feed screen instead of other different stages. Many different features and controls to represent the different ways/faces we present to others
  • Box: edges are rough
    • Cover w 3D pieces? LED inside
  • Stages represent sight vs. sound? + what other?
  • Remove/hide the laptop, which doesn’t match the visuals of the concept
    • Use a projector to project onto a cardboard surface or a different monitor?

I received this feedback at an early stage of my project, so it was really helpful, since I still had chances to change its direction. It did impact one of my design decisions: I decided to make a cover to hide my laptop. Without the feedback, I wouldn't have considered how poorly the laptop fit visually with the overall theme. I also secured the jumper cables connected to the pins of the Arduino and the sensor with electrical tape so that they wouldn't fall out as the player moved the box around. Cables did fall out during user testing, and I took note of it because I also planned to incorporate a feature where the player would have to shake the box. I also taped the sensor and the Arduino themselves down inside the box. 

Fabrication and Production

I had to make the interactions in my project simpler than originally planned, because writing the code for each part took more time than expected. Some time was even spent setting up the IMU sensor, since it was my first time using it. Coding sometimes felt like a black box: on multiple occasions, code that had worked before would arbitrarily break, and code that theoretically should have worked, wouldn't. All of this drew out the production process, even though I had planned for it to be relatively straightforward and simple. After familiarizing myself with the variables and how the sensor worked, I coded each stage in a separate Processing file and combined them into one file at the end. 

In the beginning, when I created the box controller for my project, instead of a ball as I initially planned, I made it an octahedron. This was simply because I couldn't find any spherical SVG laser-cutting templates online that I liked; I would've still gone with a ball controller if there had been good sphere templates. I just wanted a controller that had no "right" answer or orientation and multiple possibilities, which the octahedron shape conveyed just as well as a sphere.

I made additional edits to the octahedron template in Cuttle, drawing a clock on one of the sides along with some circles (similar to the ones on the face of the clock) and time-related quotes on the other sides. This made the controller more visually interesting than if the sides had been blank. 

I chose frosted acrylic as the material for the octahedron box; its "hazy" effect fit well with the "hazy" memory theme. This happened to work out, because when I went to laser cut the pieces for the box, I had initially thought of making it out of wood. There was no conscious decision behind the wood choice; it just felt like the default. However, when I went over to the material shelf, I noticed there were more choices than I had imagined, with different types of clear, frosted, and colored acrylic. I thought the box would be more interesting in acrylic than in wood, so I chose frosted acrylic as the box material.


Laser cutting the pieces for the octahedron box 

Conversely, if I had gone with a wooden box controller, the initial cardboard laptop cover I used for the final presentation might've matched, both having a more organic feeling. But since the box controller was acrylic, I received feedback that the cardboard cover looked a little unpolished, so I remade it with transparent black PVC sheets and black cardstock for the IMA show. This matched the more "cybercore" style of the controller.


The look of the project for the final presentation 

Additionally, I added some black and sparkly mesh cloth scraps to the inside of the box to hide the Arduino and sensor. The frosted acrylic was only semi-opaque, so the hardware inside was still partially visible, which didn't look good; the cloth was a quick fix for the visuals. 

I also borrowed an external camera from the equipment room instead of using my laptop's built-in camera. The latter had issues and would sometimes stop working, so to avoid extra problems, I used an external camera connected to one of my laptop's USB ports during user testing, the final presentation, and the IMA show. Writing the code to switch to an external camera in Processing was also easy: referring to code from a class lecture slide, I changed the camera array index from [0] (internal camera) to [1] (external camera).
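
A minimal sketch of that camera switch, assuming the external camera shows up at index [1] (the order depends on how the operating system lists the devices):

import processing.video.*;

Capture cam;

void setup() {
  size(1199, 787);
  String[] cameras = Capture.list();  // list every attached camera
  printArray(cameras);                // check which index is the external one
  cam = new Capture(this, width, height, cameras[1]);  // [1] was my external camera
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();  // grab the newest frame
  }
  image(cam, 0, 0);
}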

For Stage 1, I decided to make it a simple introductory stage where the user could familiarize themselves with the box's controls, so I drew a dot on the screen that they could move around. Certain variables from the sensor were mapped to the x and y positions of the dot, and I was able to write this part of the code after exploring and finding the general ranges of the sensor's values. I then drew several spots on the background image, one of which was the "correct" one that the player had to touch with their own dot to progress to the next stage.

 
Demonstration of sensor-to-dot movement for Stage 1

Meanwhile, I wanted to convey confusion and a muddleheaded kind of feeling through the background music, which played throughout the entire stage. The main difficulty of this stage was using the distance equation, which I hadn't worked with before. I found a comprehensive tutorial on how to detect the overlap of certain shapes in Processing, which I was able to incorporate into the stage. I also discovered that when I mapped the pitch and roll values from the IMU sensor to the x and y values of the dot, there was some inherent difficulty to the controls; they weren't straightforward or intuitive. This worked in favor of my intentions, because I didn't want the controls to be too easy, so I left the mapping as-is. 
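
The core of the stage boiled down to two steps: mapping roll and pitch to screen coordinates, then applying the distance equation between the two circle centers. A sketch of that check, using the rough ranges I measured for my sensor (targetX, targetY, targetRadius, and dotRadius are placeholder names here):

// map roll/pitch (roughly -70..70 degrees on my sensor) to screen coordinates
float x2 = map(arduino_values[0], -70, 70, 0, width);
float y2 = map(arduino_values[1], -70, 70, 0, height);

// distance equation: the circles overlap when the distance between
// their centers is at most the sum of their radii
float distX = x2 - targetX;
float distY = y2 - targetY;
float distance = sqrt(distX * distX + distY * distY);

if (distance <= targetRadius + dotRadius) {
  stage2();          // run Stage 2's one-time setup
  state = "STAGE2";  // the player found the correct spot
}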

Since Stage 1 relied heavily on visual cues, I wanted to base Stage 2 on sound. I chose a slightly brighter background image of a grassy field with bubbles to represent that after leaving the first room, the player's mindset and the atmosphere were improving. I mapped the acceleration x, y, and z variables to the playback speed, amplitude, and panning of a cello sound clip that played throughout the stage. If the amplitude dropped below 0.7, the audio would pause, and if the player maintained this pause for over 5 seconds, they would progress to Stage 3; shaking the box aggressively satisfied these requirements. I also animated bubbles based on one of the moving-circles lessons from class to add another dynamic layer on top of the bubbles in the background image.
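
The mapping itself was three map() calls into the Processing Sound library; this is essentially the code from the appendix, with arduino_values[3] through [5] holding AccX, AccY, and AccZ:

// map acceleration values (in g, roughly -2..2 on my sensor) to sound parameters
float playbackSpeed = map(arduino_values[3], -2, 1.5, 0.25, 4.0);
celloSound.rate(playbackSpeed);

float amplitude = map(arduino_values[4], -2, 1, 0.2, 1.0);
celloSound.amp(amplitude);

float panning = map(arduino_values[5], -1.5, 2, -1, 1);
celloSound.pan(panning);

// shaking the box hard drives the mapped amplitude below 0.7, pausing the clip
if (amplitude < 0.7 && celloSound.isPlaying()) {
  celloSound.pause();
  pauseTime = millis();
}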


The bubble animation in Stage 2 

The circle the player controlled was also a bubble, its size and position driven by IMU sensor values, making Stage 2 a combination of visuals and audio under the player's control. Stage 2 mostly went according to plan. I considered adding a feature where the player's bubble would push the automated bubbles out of the way, making the interaction even more visually interesting and fun, but since I hadn't finished other parts of the project at that point, writing that code wasn't a priority, and in the end I didn't get around to implementing it. 

I have some regrets about Stage 3, since it broke after I combined its code with the other stages into a single file. In its individual file, it displayed an image with the text "promise you won't forget me, ever," which represented the project's message to the player. The image would then fade to reveal the live camera feed, whose circle or square pixels constantly shifted depending on the orientation and speed of the box controller. If the roll and pitch of the sensor reached certain values, the circles or squares would appear; within a specific range, the camera feed would clear up and look like a normal video. I completed the code for this stage first, referring to an example from the class lecture slides, and used millis() to time the fading of the image on top. This stage served as the conclusion of the project, showing the player's own appearance and delivering the epiphany that they had been recovering their own memories as they played. 
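
A minimal sketch of how such a millis()-timed fade can work, assuming forgetMeNot is the loaded overlay image, fadeStart is a hypothetical variable set with millis() when the stage begins, and the fade lasts 10 seconds:

image(cam, 0, 0);                      // camera feed drawn underneath
float elapsed = millis() - fadeStart;  // ms since the stage started
float transparency = constrain(map(elapsed, 0, 10000, 255, 0), 0, 255);
tint(255, transparency);               // fade the overlay image out over time
image(forgetMeNot, 0, 0);
noTint();                              // don't tint anything drawn later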

The start screen was an addition I made later on. After deciding to make a cover that looked like an old TV or PC monitor for the laptop, I thought of the TV static screens I sometimes saw as a kid when I turned to the wrong channel.


Demonstration of various TV static screen animations 

The start screen was technically similar to Stage 1, with the player controlling the x and y position of a circle over a TV static animation. If the circle stayed over a square for an extended amount of time, the TV static animation would change to other variations: for every 5 seconds the circle touched the square, it would switch to the next TV channel, and once the player reached Channel 10, the program progressed to the first stage. This was also a kind of introductory level, and I learned a lot writing the code for it. 
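
In the combined code (see the appendix), this is a counter that increments on every frame the circle-rectangle collision test returns true; the displayed channel is the counter divided by 10, and the transition fires once it passes Channel 10:

// runs every frame on the start screen
boolean hit = circleRect(cx, cy, r, sx, sy, sw, sh);  // is the circle over the square?
if (hit) {
  mouseClicked();  // counts the frame and re-randomizes the static pattern
}
text(">>  C H A N N E L   " + mouseClicks/10, 50, 50);  // one channel per 10 hit-frames
if (mouseClicks >= 102) {  // past Channel 10
  stage1();                // Stage 1's run-once setup
  state = "STAGE1";
}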

Lastly came combining the start screen and all the stages into one Processing file. This was quite scary, since I knew certain parts of the code that worked independently might not interact well in the same file, and it was extremely time-consuming; as mentioned with Stage 3, I encountered lots of issues with this step. I initially planned to use states like the flowerBloom() state I wrote for my midterm project, but my final project worked slightly differently. Some parts of each stage only needed to run once, while other parts had to loop, so I couldn't directly paste each part one after another. I realized this when background music started overlapping (overtriggering, a problem we were warned about in class) or the dots left trails (the background image wasn't being redrawn). I sectioned each stage into its run-once and run-repeatedly parts and used a state variable with if-statements to transition between stages. The start screen and Stages 1 and 2 survived, but Stage 3 unfortunately stopped working once I combined it into the main file: the image no longer displayed at the beginning, and the camera feed was just a frozen frame. I wasn't able to fix this in the end and went with the third stage being a photo taken of the player, but the rest of the project worked well. 
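
Reduced to a skeleton, the structure looks like this: each stageN() function holds the run-once parts (stopping the previous stage's sound, loading images), and draw() branches on the state variable for the run-repeatedly parts.

String state = "START";

void draw() {
  if (state == "START") {
    // run-repeatedly part of the start screen goes here
    if (mouseClicks >= 102) {
      stage1();          // run-once part of Stage 1
      state = "STAGE1";
    }
  } else if (state == "STAGE1") {
    // run-repeatedly part of Stage 1 ...
  } // ...and so on for STAGE2, STAGE3, and END
}

void stage1() {
  tv_static.stop();  // one-time teardown of the previous stage's sound
  roomSound.loop();  // one-time setup for this stage
}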

From the beginning, I wrote the code in separate parts to make it easier for me to understand and test, but next time, I might try writing everything in the same file from the start to avoid additional problems popping up in the later stages of production. 

Overall, the production process was very much learn-as-I-go. I shaved off many planned features as I encountered troubles and time constraints. Many parts of the stages have a “first draft” feeling since I didn’t make any additional edits or polishes; as soon as I got it to work, I had to move onto making the next part. 

 

Conclusions

The main goal of my project was to convey a sort of abstract, nostalgic feeling to the players. Although the project evolved quite dramatically from my initial ideas, I'd say I was able to partially communicate this concept to players during my final presentation and the IMA show. However, this was partly because during my presentation, I explained the general idea beforehand, and during the IMA show, I explained the brainstorming and ideas to players after they experienced my project. I would've liked to make the concept clear enough that less explaining was needed and players would've come away with their own takeaways. I still say I partially succeeded, because the feeling came through in the types of images I used, and when I showed my project to players during the IMA show, some recognized that the design of the laptop cover emulated the feeling of a retro monitor. They also appreciated the images I chose and acknowledged the nostalgic vibe I was trying to evoke. 

Players interacting with my project during the IMA show: 

 

I learned a great deal from the design and conception process at the beginning. My personal work ethic depends greatly on motivation, which I lacked since I didn't have confidence in or satisfaction with my ideas early on. I didn't want to start working on components that I felt I was going to have to change later anyway, but at the same time, I had a creative block, so I didn't feel inspired even though I wanted to continue brainstorming. Thus, I got a late start on my project. 

It got to the point where I had no choice but to start working, even though I wasn't fully satisfied with the ideas I had. To my surprise, I found inspiration as I worked: new ideas came as I searched for references and tutorials, and even as I received feedback from user testing. The concept wasn't fully developed until well into the working process. Sometimes, the most important thing is to just get started, even if you lack motivation and inspiration; this was especially significant for me, since the lack of both had hindered my progress at the beginning.

A major area for improvement I considered was adding more text and hints to each stage so that there would be less confusion on the player's side. While I did want players to explore on their own instead of being told how to progress through each stage, I think I could've made things slightly clearer through small, indirect text hints such as "Where am I? What are these dark spots? I should look around…" for the first stage.

This is my personal opinion, but I felt the background music for each stage could've been longer and more pleasant. When players got stuck on a stage, the constantly looping sound could become tiring to hear. Especially in Stage 2, where the box controlled the speed and volume of the music, the audio became sped-up and similar to a "mosquito buzz," as one of my friends pointed out when he test-played. While I didn't want the music to be comforting, I didn't want it to be annoying either. 

Finally, one piece of feedback I received during my final presentation was that each dark spot in Stage 1 could lead to a different place on the screen. This would add replay value, making the player want to return and thoroughly explore each route. It would've become like a mystery with slight escape-room elements, which also tied back to one of my initial project ideas. I really liked this suggestion and definitely would've implemented it if I had more time. I was just a little worried about the technical side and whether I could make the stages transition back and forth smoothly, since I knew I had struggled with writing the states for each stage. 

Another small detail I noted about the experience itself was that I hadn't anticipated the loud environment of the IMA show. Even with my laptop at full volume, it was difficult to hear the background music in my project. I noticed some other groups connected their projects to speakers, which I think is a good idea for the next time I present something that involves interaction with sound. 

I'd say there was a unique interactivity to my project. Players interacted with it by watching the animations on the screen and hearing the sounds from the laptop react to the movement of the box, which they controlled by tilting and shaking. It was constant visual and aural feedback, with the controls changing for each stage, so players essentially had to "re-learn" how to interact with the project each time. This aligns with my own definition of interaction, since players had a constant back-and-forth "conversation" instead of a one-and-done experience. 

One of the things I learned was that asking for help also takes time: you need to reach out to Learning Assistants or professors during their available hours. If you're up late working in the IMA studio, hitting an obstacle becomes a real problem when there's no one with more experience around to help. It's important to plan out which parts you anticipate completing on your own versus which parts you'll struggle with; otherwise, you might find you've only hit a problem outside of the LAs' and professors' working hours. 

As with what I learned from my midterm project, it's all a matter of experience. Code and hardware that sound simple to implement might still prove difficult and time-consuming, especially if it's your first time, because more often than not, you'll encounter unexpected problems along the way. With an ideal time frame, I would've remade my entire final project from scratch, including the hardware (the octahedron box) and rewriting the code from a blank file, because I could do so knowing which pitfalls and mistakes to avoid. The overall project would come out more polished, with improvements produced by experienced hands. 


The final look of the project 

Disassembly

 

Appendix

Google document I used to record ideas, resources used, inspiration/reference images as I worked

Materials List

  • IMU sensor x1
  • M/M, M/F jumper cables
  • Frosted acrylic
  • Black and mesh cloth scraps with sparkles
  • Arduino to laptop USB cable, laptop
  • Electrical tape
  • Black PVC
  • Black cardstock

References and resources used: 

Processing Code

/* Isabel Chen S24 Interaction Lab Final Project

References: 
tv static effect in start screen: https://forum.processing.org/two/discussion/6446/television-static.html
stage 3 camera: example taken from class lecture slides
*/

//setup
import processing.serial.*;
Serial serialPort;
String state;

int NUM_OF_VALUES_FROM_ARDUINO = 9;  /* CHANGE THIS ACCORDING TO YOUR PROJECT */

/* This array stores values from Arduino */
int arduino_values[] = new int[NUM_OF_VALUES_FROM_ARDUINO];
//[0]: roll
//[1]: pitch
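//[2]: yaw
//[3]: AccX   [4]: AccY   [5]: AccZ
//[6]: GyroX  [7]: GyroY  [8]: GyroZ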

import processing.sound.*;
SoundFile tv_static;

//start screen setup -----------------------------------------
int mouseClicks;
int channel;

PVector v;
PVector p;
PVector rx;
PVector rz;

int C = 700;
int LIM = C - 1;
int posrange = 64;
int subrange = 32;

float angx = 0; // degrees
float angz = 0; // degrees

int staticAmount = 16000; //how much static

float cx = 0;      // circle position 
float cy = 0;
float r = 30;      // circle radius

float sx = 525;    // square position
float sy = 600;
float sw = 50;    // and dimensions
float sh = 50;

void rotz(PVector vec, PVector ro) {
  float x = vec.x*ro.x - vec.y*ro.y;
  float y = vec.x*ro.y + vec.y*ro.x;
  vec.x = x;
  vec.y = y;
}

void rotx(PVector vec, PVector ro) {
  float z = vec.z*ro.z - vec.y*ro.y;
  float y = vec.z*ro.y + vec.y*ro.z;
  vec.z = z;
  vec.y = y;
}
// start screen setup end --------------------------------------------

void setup() {
  
  size(1199, 787);
  
  printArray(Serial.list());
  serialPort = new Serial(this, "COM11", 9600);
  
  state = "START";
  
  tv_static = new SoundFile(this, "tvStatic.mp3");
  //play tv static sound effect on loop
  tv_static.loop();
  
  angx = 321;
  angz = 280;
  v = new PVector(-17, -19, 26);
  
  //Using vectors of unit length will let us rotate without changing speed
  rz = new PVector(cos(angx*PI/180), sin(angx*PI/180), 0);
  rx = new PVector(0, sin(angz*PI/180), cos(angz*PI/180));
  p = new PVector(0, 0, 0);
  
  frameRate(30);
  
}

void draw() {
  
  if (state == "START") {
    
      background(20);
      
      getSerialData();
      
      textSize(20);
      fill(250);
      text("hel p", 640, 736, 580, 320);
      text("i      dont want to fo rget", 960, 666, 280, 320);
      
      textSize(30);
      text(">>  C H A N N E L   "+mouseClicks/10, 50, 50, 280, 320);
      
      stroke(255);
      //rect(width/2, 600, width/10, 70);
      
      // update circle pos to coordinates controlled by IMU
      cx = map(arduino_values[0], -88, 88, 0, width);
      cy = map(arduino_values[1], -88, 88, 0, height);
    
      // check for collision
      // if hit, change rectangle color
      boolean hit = circleRect(cx, cy, r, sx, sy, sw, sh);
      if (hit) {
        fill(100,150);
        mouseClicked();
      }
      else {
        fill(255, 150);
      }
      rect(sx, sy, sw, sh);
    
      // draw the circle
      fill(0, 150);
      ellipse(cx, cy, r*2, r*2);
      
      loadPixels();
    
      for (int i=0; staticAmount>i; i++) {
    
        rotz(v, rz);
        rotx(v, rx);
        
        p.add(v);
        
        if (p.x > LIM) p.x = p.x - LIM+1;
        if (p.y > LIM) p.y = p.y - LIM+1;
        if (p.z > LIM) p.z = p.z - LIM+1;
        
        if (p.x < 0) p.x = p.x + LIM;
        if (p.y < 0) p.y = p.y + LIM;
        if (p.z < 0) p.z = p.z + LIM;
        
        translate(width/2, height/2);
        
        //map x,y,z to somewhere in x,y only
        int counter = (int(p.z) << 12) | (int(p.y) << 6) | int(p.x);
        if ( (counter < pixels.length) && (counter >= 0)) {
          pixels[counter] = color(255);
      }
    }
    
    updatePixels();
    
    if (mouseClicks >= 102) {
      stage1();
      state = "STAGE1";
    }
  } else if (state == "STAGE1") {
    
    getSerialData();
    
    textSize(20);
    fill(50);
    text(stage1str, 640, 736, 580, 320);
    
    if (millis() - staTime >= 1500) {
      stage1str = "what is this room?";
    } 
  
    x2 = map(arduino_values[0], -70, 70, 0, width+50);
    y2 = map(arduino_values[1], -70, 80, 0, height+50);
    
    image(room, 0,0);
    
    noFill();
    
    int c = 30;
    for (int i = 0; i < c; i++) {
      stroke(100, 255-(255 * i/c));
      ellipse(292, 390, i+10, i);
      ellipse(639, 413, i+10, i);
      ellipse(1009, 140, i+10, i);
      ellipse(1155, 485, i+10, i);
      
    }
    
    fill(255, 200);
    noStroke();
    int r = 30;
    ellipse(x2, y2, r, r);
    
    // calculate dist between control circle and target circle's centers
    float distX = x2 - 1155;
    float distY = y2 - 485;
    float distance = sqrt((distX*distX) + (distY*distY));
    
    // check if the dist between the circles is less than the sum of their radii
    if (distance <= c+r) {
      stage2();
      state = "STAGE2";
    }
    
  } else if (state == "STAGE2") {
  
    getSerialData();
    
    image(bubbleField, 0,0);
    
    for(int i=0; i < bubbles.length; i++){
        
         bubbles[i].x += bubbles[i].dX;
         bubbles[i].y += bubbles[i].dY;
         int d = bubbles[i].diameter;
         
         if(bubbles[i].x < -d) bubbles[i].x = width;
         else if(bubbles[i].x > width + 1) bubbles[i].x = -d;
         if(bubbles[i].y < -d) bubbles[i].y = height;
         if(bubbles[i].y > height) bubbles[i].y = -d;
         
         // bubbles
         fill(#4C6C3B, 10);
         stroke(240, 100);
         ellipse(bubbles[i].x, bubbles[i].y, bubbles[i].diameter, bubbles[i].diameter);
         
         // bubble highlights
         fill(255,255,255);
         stroke(255,255,255);
         ellipse((float)bubbles[i].x-d/4, (float)bubbles[i].y-d/4, d/8, d/8); 
      }
    
    float playbackSpeed = map(arduino_values[3], -2, 1.5, 0.25, 4.0);
    celloSound.rate(playbackSpeed); 
    
    float amplitude = map(arduino_values[4], -2, 1, 0.2, 1.0);
    celloSound.amp(amplitude);
  
    float panning = map(arduino_values[5], -1.5, 2, -1, 1);
    celloSound.pan(panning);
  
    if (playbackSpeed >= 3 && playbackSpeed <= 4 && !celloSound.isPlaying()) {
      
      celloSound.play();
      
    } else if (amplitude < 0.7 && celloSound.isPlaying()) {
      
      celloSound.pause();
      pauseTime = millis();
    }
    
    if (pauseTime - startTime >= 5000) {
      
      stage3();
      state = "STAGE3";
    }
    
    float volume = analysis.analyze();
    float diameter = map(volume, 0, 1, 0, width);
    
    fill(#4C6C3B, 50);
    stroke(240, 100);
    
    float x = map(arduino_values[0], -88, 88, 0, width+30);
    float y = map(arduino_values[1], -88, 91, 0, height+30);
    
    circle((int)x, (int)y, diameter);
  
  } else if (state == "STAGE3") {
    
    //cam = new Capture(this, 1199, 787, cameras[0]);
    //cam.start();
    //noStroke();
    
    getSerialData();

    if (transparency > 0) {
      transparency -= 0.1;
    }
  
    tint(255, transparency);
    image(forgetMeNot, 0, 0);
  
    if (cam.available()) {
      cam.read();
    
      // normal camera image
      if (arduino_values[0]>40 || arduino_values[1] >-40) {
        pushMatrix();
        translate(cam.width, 0);
        scale(-1, 1);
        image(cam, 0, 0);
        popMatrix();
      }
    
      // painting with circles
      if (arduino_values[0]>0) {
        for (int i=0; i<100; i++) {
          int x=floor(random(width));
          int y=floor(random(height));
          int d=floor(random(5, 20));
          pushMatrix();
          translate(cam.width, 0);
          scale(-1, 1);
          fill(cam.get(x, y));
          circle(x, y, d);
          popMatrix();
        }
      }
    
      // painting with squares
      if (arduino_values[1]>0) {
        for (int i=0; i<100; i++) {
          int x=floor(random(width));
          int y=floor(random(height));
          pushMatrix();
          translate(cam.width, 0);
          scale(-1, 1);
          fill(cam.get(x, y));
          rectMode(CENTER);
          rect(x, y, 20, 20);
          popMatrix();
        }
      }
    }

  curTime = millis();
  if (curTime - staTime >= 60000) {
    state = "END";
  }
  
  } else if (state == "END") {
  
    // IMG_9620.jpg
  
  }
  
}

void mouseClicked() {
  
  mouseClicks++;
  
  v.set(random(posrange)-subrange, random(posrange)-subrange, random(posrange)-subrange);
  p.set(0, 0, 0); //start point could be random too
  
  if (random(1) > 0.5) { //not necessary but seems to change the behavior
    v.normalize();
  }
  
  angx = random(360); // degrees
  angz = random(360); // degrees
  
  rz.set(cos(angx*PI/180), sin(angx*PI/180), 0);
  rx.set(0, sin(angz*PI/180), cos(angz*PI/180));
}

boolean circleRect(float cx, float cy, float radius, float rx, float ry, float rw, float rh) {

  // temporary variables to set edges for testing
  float testX = cx;
  float testY = cy;

  // which edge is closest?
  if (cx < rx)         testX = rx;      // test left edge
  else if (cx > rx+rw) testX = rx+rw;   // right edge
  if (cy < ry)         testY = ry;      // top edge
  else if (cy > ry+rh) testY = ry+rh;   // bottom edge

  // get distance from closest edges
  float distX = cx-testX;
  float distY = cy-testY;
  float distance = sqrt( (distX*distX) + (distY*distY) );

  // if the distance is less than the radius, collision!
  if (distance <= radius) {
    return true;
  }
  return false;
}

// stage 1 setup ----------------------------------------------

SoundFile roomSound;

float x2;
float y2;

PImage room;

boolean isRoomSoundPlaying = false;

String stage1str = "where am i?";

// stage 1 setup end ----------------------------------------------

void stage1() {

  tv_static.stop();

  // load and scale the background image once here, instead of on every draw() frame
  room = loadImage("IMG_9621.JPG");
  room.resize(1199, 787);

  if (!isRoomSoundPlaying) {
    roomSound = new SoundFile(this, "dreamscape-whispers_90bpm_A_major.wav");
    roomSound.loop();
    isRoomSoundPlaying = true;
  }
  
}

// stage 2 setup ----------------------------------------------

SoundFile celloSound;

Amplitude analysis;

int startTime, pauseTime; // millis
PImage bubbleField;

boolean isCelloSoundPlaying = false;

class Bubble{
 
  public int x;
  public int y;
  public int diameter;
  public int dX;
  public int dY;
  
}

Bubble[] bubbles;

// stage 2 setup end ----------------------------------------------

void stage2() {
  
  roomSound.stop();
  
  // load the background image once here, instead of on every draw() frame
  bubbleField = loadImage("IMG_9629.JPG");
  
  startTime = millis();
  
  if (!isCelloSoundPlaying) {
    celloSound = new SoundFile(this, "cello-slow-progression-sad-melody_120bpm_A#_major.wav");
    celloSound.loop();
    isCelloSoundPlaying = true;
  }
  
  analysis = new Amplitude(this);
  analysis.input(celloSound);
  
  bubbles = new Bubble[10];
  
  for(int i=0; i < bubbles.length; i++){
    bubbles[i] = new Bubble();
    
    bubbles[i].x = (int)random(width);
    bubbles[i].y = (int)random(height);
    bubbles[i].diameter = (int)random(50,100);
    
    bubbles[i].dX = (int)random(3);
    bubbles[i].dY = (int)random(3);
    
 }
 
}

// stage 3 setup ----------------------------------------------

SoundFile mysticGroove;

PImage forgetMeNot;
float transparency = 255;

float staTime = millis();
float curTime;

//boolean isForgetMeNotPlaying = false;

import processing.video.*;
String[] cameras = Capture.list();
Capture cam;

// stage 3 setup end ----------------------------------------------

void stage3() {
  
  celloSound.stop();
  
    mysticGroove = new SoundFile(this, "mystic-groove-dreamscapes_92bpm_C#_minor.wav");
    mysticGroove.loop();

  cam = new Capture(this, 1199, 787, cameras[1]);
  cam.start();
  noStroke();
  
  forgetMeNot = loadImage("IMG_9627.JPG");
  forgetMeNot.resize(1199, 787);

}

void getSerialData() {
  while (serialPort.available() > 0) {
    String in = serialPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
    if (in != null) {
      print("From Arduino: " + in);
      String[] serialInArray = split(trim(in), ",");
      if (serialInArray.length == NUM_OF_VALUES_FROM_ARDUINO) {
        for (int i=0; i<serialInArray.length; i++) {
          arduino_values[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Arduino Code

/*

   Arduino and MPU6050 Accelerometer and Gyroscope Sensor Tutorial

   by Dejan, https://howtomechatronics.com

   with Arduino to Processing serial communication code added

   edits to lines 83, 84, 92, 93

*/

#include <Wire.h>

const int MPU = 0x68; // MPU6050 I2C address

float AccX, AccY, AccZ;

float GyroX, GyroY, GyroZ;

float accAngleX, accAngleY, gyroAngleX, gyroAngleY, gyroAngleZ;

float roll, pitch, yaw;

float AccErrorX, AccErrorY, GyroErrorX, GyroErrorY, GyroErrorZ;

float elapsedTime, currentTime, previousTime;

int c = 0;

void setup() {

  Serial.begin(9600);

  Wire.begin();                      // Initialize communication

  Wire.beginTransmission(MPU);       // Start communication with MPU6050 // MPU=0x68

  Wire.write(0x6B);                  // Talk to the register 6B

  Wire.write(0x00);                  // Make reset - place a 0 into the 6B register

  Wire.endTransmission(true);        //end the transmission

  /*

  // Configure Accelerometer Sensitivity - Full Scale Range (default +/- 2g)

  Wire.beginTransmission(MPU);

  Wire.write(0x1C);                  //Talk to the ACCEL_CONFIG register (1C hex)

  Wire.write(0x10);                  //Set the register bits as 00010000 (+/- 8g full scale range)

  Wire.endTransmission(true);

  // Configure Gyro Sensitivity - Full Scale Range (default +/- 250deg/s)

  Wire.beginTransmission(MPU);

  Wire.write(0x1B);                   // Talk to the GYRO_CONFIG register (1B hex)

  Wire.write(0x10);                   // Set the register bits as 00010000 (1000deg/s full scale)

  Wire.endTransmission(true);

  delay(20);

  */

  // Call this function if you need to get the IMU error values for your module

  calculate_IMU_error();

  delay(20);

}

void loop() {

  // to send values to Processing assign the values you want to send

  // this is an example:

  //int sensor0 = analogRead(A0);

  //int sensor1 = analogRead(A1);

  // send the values keeping this format

  //Serial.print(sensor0);

  //Serial.print(",");  // put comma between sensor values

  //Serial.print(sensor1);

  //Serial.println();  // add linefeed after sending the last sensor value

  // === Read accelerometer data === //

  Wire.beginTransmission(MPU);

  Wire.write(0x3B); // Start with register 0x3B (ACCEL_XOUT_H)

  Wire.endTransmission(false);

  Wire.requestFrom(MPU, 6, true); // Read 6 registers total, each axis value is stored in 2 registers

  //For a range of +-2g, we need to divide the raw values by 16384, according to the datasheet

  AccX = (Wire.read() << 8 | Wire.read()) / 16384.0; // X-axis value

  AccY = (Wire.read() << 8 | Wire.read()) / 16384.0; // Y-axis value

  AccZ = (Wire.read() << 8 | Wire.read()) / 16384.0; // Z-axis value

  // Calculating Roll and Pitch from the accelerometer data

  accAngleX = (atan(AccY / sqrt(pow(AccX, 2) + pow(AccZ, 2))) * 180 / PI) - 0.58; // AccErrorX ~(0.58) See the calculate_IMU_error()custom function for more details

  accAngleY = (atan(-1 * AccX / sqrt(pow(AccY, 2) + pow(AccZ, 2))) * 180 / PI) + 1.58; // AccErrorY ~(-1.58)

  // === Read gyroscope data === //

  previousTime = currentTime;        // Previous time is stored before the actual time read

  currentTime = millis();            // Current time actual time read

  elapsedTime = (currentTime - previousTime) / 1000; // Divide by 1000 to get seconds

  Wire.beginTransmission(MPU);

  Wire.write(0x43); // Gyro data first register address 0x43

  Wire.endTransmission(false);

  Wire.requestFrom(MPU, 6, true); // Read 6 registers total, each axis value is stored in 2 registers

  GyroX = (Wire.read() << 8 | Wire.read()) / 131.0; // For a 250deg/s range we have to divide first the raw value by 131.0, according to the datasheet

  GyroY = (Wire.read() << 8 | Wire.read()) / 131.0;

  GyroZ = (Wire.read() << 8 | Wire.read()) / 131.0;

  // Correct the outputs with the calculated error values

  GyroX = GyroX + 0.56; // GyroErrorX ~(-0.56)

  GyroY = GyroY - 2; // GyroErrorY ~(2)

  GyroZ = GyroZ + 0.79; // GyroErrorZ ~ (-0.8)

  // Currently the raw values are in degrees per second (deg/s), so we need to multiply by seconds (s) to get the angle in degrees

  //gyroAngleX = gyroAngleX + GyroX * elapsedTime; // deg/s * s = deg

  //gyroAngleY = gyroAngleY + GyroY * elapsedTime;

  gyroAngleX = 0.96 * gyroAngleX + 0.04 * accAngleX;

  gyroAngleY = 0.96 * gyroAngleY + 0.04 * accAngleY;

  yaw =  yaw + GyroZ * elapsedTime;




  // Complementary filter - combine accelerometer and gyro angle values

  //roll = 0.96 * gyroAngleX + 0.04 * accAngleX;

  //pitch = 0.96 * gyroAngleY + 0.04 * accAngleY;

  roll = gyroAngleX;

  pitch = gyroAngleY;




  // Print the values on the serial monitor

  Serial.print(roll);

  Serial.print(",");

  Serial.print(pitch);

  Serial.print(",");

  Serial.print(yaw);

  //Serial.println();

  Serial.print(",");

  Serial.print(AccX);

  Serial.print(",");

  Serial.print(AccY);

  Serial.print(",");

  Serial.print(AccZ);

  Serial.print(",");

  Serial.print(GyroX);

  Serial.print(",");

  Serial.print(GyroY);

  Serial.print(",");

  Serial.print(GyroZ);

  Serial.println();

  delay(20);

}

void calculate_IMU_error() {

  // We can call this function in the setup section to calculate the accelerometer and gyro data error. From here we will get the error values used in the above equations printed on the Serial Monitor.

  // Note that we should place the IMU flat in order to get the proper values, so that we can then get the correct values

  // Read accelerometer values 200 times

  while (c < 200) {

    Wire.beginTransmission(MPU);

    Wire.write(0x3B);

    Wire.endTransmission(false);

    Wire.requestFrom(MPU, 6, true);

    AccX = (Wire.read() << 8 | Wire.read()) / 16384.0 ;

    AccY = (Wire.read() << 8 | Wire.read()) / 16384.0 ;

    AccZ = (Wire.read() << 8 | Wire.read()) / 16384.0 ;

    // Sum all readings

    AccErrorX = AccErrorX + ((atan((AccY) / sqrt(pow((AccX), 2) + pow((AccZ), 2))) * 180 / PI));

    AccErrorY = AccErrorY + ((atan(-1 * (AccX) / sqrt(pow((AccY), 2) + pow((AccZ), 2))) * 180 / PI));

    c++;

  }

  //Divide the sum by 200 to get the error value

  AccErrorX = AccErrorX / 200;

  AccErrorY = AccErrorY / 200;

  c = 0;

  // Read gyro values 200 times

  while (c < 200) {

    Wire.beginTransmission(MPU);

    Wire.write(0x43);

    Wire.endTransmission(false);

    Wire.requestFrom(MPU, 6, true);

    GyroX = Wire.read() << 8 | Wire.read();

    GyroY = Wire.read() << 8 | Wire.read();

    GyroZ = Wire.read() << 8 | Wire.read();

    // Sum all readings

    GyroErrorX = GyroErrorX + (GyroX / 131.0);

    GyroErrorY = GyroErrorY + (GyroY / 131.0);

    GyroErrorZ = GyroErrorZ + (GyroZ / 131.0);

    c++;

  }

  //Divide the sum by 200 to get the error value

  GyroErrorX = GyroErrorX / 200;

  GyroErrorY = GyroErrorY / 200;

  GyroErrorZ = GyroErrorZ / 200;

  // Print the error values on the Serial Monitor

  Serial.print("AccErrorX: ");

  Serial.println(AccErrorX);

  Serial.print("AccErrorY: ");

  Serial.println(AccErrorY);

  Serial.print("GyroErrorX: ");

  Serial.println(GyroErrorX);

  Serial.print("GyroErrorY: ");

  Serial.println(GyroErrorY);

  Serial.print("GyroErrorZ: ");

  Serial.println(GyroErrorZ);

}
