MLNI – Final Project – Time Machine

Project Proposal:

Video Demo for the Project Proposal: https://youtu.be/vJQMxQdU4fM

For the final project, my initial idea is to create an interactive installation that lets the audience experience controlling time. I want to create an avatar (not yet decided) that represents the audience as a time machine. The audience’s two hands are the “steering wheel” of the time machine, controlling the direction of the flow of time: rotating the left hand makes time flow forward, while rotating the right hand makes it flow backward. The speed of time is determined by hand height: the higher you raise your hand, the faster time passes. I plan to use KNN classification to detect the audience’s different hand positions, and Object-Oriented Programming to create displayed objects (perhaps an electrocardiogram) that indicate the flow of time.

The minimum viable product would be to generate some objects according to the hand positions, but without the ability to change the speed. The stretch goal is to provide more kinds of objects and more ways to play with the flow of time.

For conceptual development, I have two inspirations: one is called Docomo Palette UI, and the other is Digital Doodle. Docomo Palette UI uses a phone as the interactive device for the audience to play with the project. By scrolling up and down the screen, the audience can emit or retrieve objects like fumes, paper butterflies, ribbons, letters, and Tetris pieces. Accompanied by specific sound effects, it gives the audience the experience of scrolling a timeline to observe how objects evolve. Digital Doodle provides a visual effect like time travel. I want to use Digital Doodle as my inspiration for the background, and Docomo Palette UI as my reference for the display of objects. Instead of only indicating the flow of time, I want to extend the concept of Docomo Palette UI further and create something alive, or something that indicates life, to better demonstrate the flow of time and the meaning of time for life.

[Image: Docomo Palette UI (Good Design)]
[Image: Digital Doodle]

Implementation Plan:

  • Create the background like Digital Doodle: use OOP to randomly generate ellipses in different colors at different coordinates on the canvas, and give them accelerating speeds to enhance the “time travel” effect (see the sketch after this list).
  • Visualize all of the objects that will be displayed: try to generate an electrocardiogram, flowers, or other things that suggest life.
  • Implement KNN classification to detect gestures: set up different scenarios to control the flow of time, such as its speed and direction.
  • Consider further conceptual ideas to improve the interaction.
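
As a starting point for the background bullet above, here is a minimal p5.js sketch of the idea; the class name, spawn rate, and acceleration factor are placeholders of my own, not the project’s actual values. Each ellipse is an object that spawns at the center and accelerates outward.

```javascript
// Minimal sketch of the planned background: each ellipse is an object
// that accelerates outward from the center, suggesting "time travel".
let stars = [];

function setup() {
  createCanvas(600, 400);
  noStroke();
}

function draw() {
  background(0, 60); // translucent background leaves motion trails
  stars.push(new Star());
  for (let i = stars.length - 1; i >= 0; i--) {
    stars[i].update();
    stars[i].display();
    if (stars[i].isOffscreen()) stars.splice(i, 1);
  }
}

class Star {
  constructor() {
    this.pos = createVector(width / 2, height / 2);
    this.vel = p5.Vector.random2D().mult(0.5); // random outward direction
    this.col = color(random(255), random(255), random(255));
    this.size = random(2, 6);
  }
  update() {
    this.vel.mult(1.03); // accelerate to strengthen the time-travel feel
    this.pos.add(this.vel);
  }
  display() {
    fill(this.col);
    ellipse(this.pos.x, this.pos.y, this.size);
  }
  isOffscreen() {
    return this.pos.x < 0 || this.pos.x > width ||
           this.pos.y < 0 || this.pos.y > height;
  }
}
```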

Learning Goals:

I want to learn more ways to work with OOP and how to code different display effects. Also, throughout the process of creating this project, I want to reflect more deeply on the concept and meaning of my work, and try to improve the project with richer implications.

Implementation Process:

At first, I tried to mimic the visual effect of Digital Doodle and created a time-travel scene. But the result was not satisfying, so I switched to a day-and-night transition to indicate the flow of time. Basically, I used two PNG images for the sun and the moon, and set a center position and x and y radii for the path of their movement. I used sin() and cos() to make sure the path is an oval. The reference is here.

[Images: Sun, Moon]
[Code screenshots: Sun and Moon (1–4)]
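
To make the description above concrete, here is a minimal p5.js sketch of the same idea; the file names, radii, and speed are placeholders, not the project’s actual values. The sun and moon sit on opposite sides of one oval, so when one is above the horizon the other is below it.

```javascript
// Day-and-night transition: the sun and moon travel an oval path defined
// by a center point and separate x/y radii, using cos() and sin().
let sunImg, moonImg;
let angle = 0;

function preload() {
  sunImg = loadImage('sun.png');   // placeholder file names
  moonImg = loadImage('moon.png');
}

function setup() {
  createCanvas(600, 400);
  imageMode(CENTER);
}

function draw() {
  background(20);
  const cx = width / 2; // center of the orbit, on the horizon line
  const cy = height;
  const rx = 250;       // x radius of the oval
  const ry = 180;       // y radius of the oval
  // sun and moon are half a revolution apart on the same oval
  image(sunImg, cx + rx * cos(angle), cy + ry * sin(angle), 60, 60);
  image(moonImg, cx + rx * cos(angle + PI), cy + ry * sin(angle + PI), 50, 50);
  angle += 0.01;
}
```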

After completing the visual effect for the background, I continued to build the main objects. After several attempts, I found that my original idea of generating an electrocardiogram was not feasible, so I decided to use a single tree to represent the passage of time. First, I borrowed open-source code that generates trees with a very cool, vivid look, complete with an animated generative process and a waving movement.

However, the code was hard to grasp, and since I needed to control the tree’s growth, I had to fully understand how it generated the tree, which I failed to do. So Professor Moon taught me another way to generate the tree, which is much easier to understand and master. The sample code from Professor Moon is here. Basically, it uses createVector() (to manipulate the angles of the branches) and a recursive function (a function that calls itself) to generate branches at random angles so that the result looks like a tree. I can also manipulate the stroke weight of the branches to make the tree more vivid.

[Image: Sample Recursive Tree]
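
The sketch below is a minimal reconstruction of that approach, not Professor Moon’s actual sample: a recursive branch() function draws one segment along its direction vector, then calls itself twice with rotated, shortened copies of that vector, thinning the stroke weight at each level.

```javascript
// Minimal recursive tree: each call draws one branch from its base along
// its direction vector, then recurses with two rotated, shorter copies.
function setup() {
  createCanvas(600, 400);
}

function draw() {
  background(255);
  randomSeed(7); // fix the random angles so the tree holds its shape
  stroke(60, 40, 20);
  branch(createVector(width / 2, height), createVector(0, -100), 8);
}

function branch(base, dir, weight) {
  const tip = p5.Vector.add(base, dir); // end point of this branch
  strokeWeight(weight);
  line(base.x, base.y, tip.x, tip.y);
  if (dir.mag() > 8) {
    // rotate copies of the direction vector to get the two children
    branch(tip, dir.copy().rotate(random(0.2, 0.6)).mult(0.7), weight * 0.7);
    branch(tip, dir.copy().rotate(-random(0.2, 0.6)).mult(0.7), weight * 0.7);
  }
}
```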

After settling on the shape of the tree, I started writing the code for the petals that would be generated at the tips of the tree branches. Before positioning them on the branches, I began by generating them at random positions. The reference for the petals’ movement is here. The petals fall along the path of a sine function, which gives the feeling of falling with the wind. Based on the original code, I made a few changes and turned it from a function into a class. I also set an ending position and a life span for the petals, so that when they touch the ground they fade away and disappear.

[Code screenshots: Petals (1–5)]
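
Here is a hedged sketch of what such a Petal class might look like; the drift amplitude, fall speed, and fade rate are guesses rather than the project’s actual numbers. In draw(), one would update() and display() each petal and remove the ones for which isDead() returns true.

```javascript
// Each petal drifts down along a sine curve and, once it reaches the
// ground, fades out over its remaining life span.
class Petal {
  constructor(x, y) {
    this.baseX = x;          // center line of the sine-wave drift
    this.x = x;
    this.y = y;
    this.phase = random(TWO_PI);
    this.life = 255;         // used as alpha once the petal lands
    this.landed = false;
  }
  update() {
    if (!this.landed) {
      this.y += 1.2;                               // steady fall
      this.x = this.baseX + 20 * sin(this.phase);  // sway with the wind
      this.phase += 0.05;
      if (this.y >= height - 10) this.landed = true; // reached the ground
    } else {
      this.life -= 3; // fade away after landing
    }
  }
  display() {
    noStroke();
    fill(255, 150, 180, this.life);
    ellipse(this.x, this.y, 8, 6);
  }
  isDead() {
    return this.life <= 0;
  }
}
```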

Then I combined the tree code and the petals code by applying the petals to the branches. To do so, I created another class for generating the branches. Professor Moon also taught me to use lerp() (sample code is here) for linear interpolation, so that I can control how the branches emerge. By instantiating the petal class inside the branch class, I managed to generate petals on the branches.

[Code screenshots: Branch Class (1–5)]
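
A minimal sketch of the Branch class idea, assuming the Petal class sketched earlier: lerp() interpolates the branch tip between its start and end points, so a single parameter t between 0 and 1 controls how far the tree has emerged, and each branch owns the petals generated at its tip.

```javascript
// With t = 0 the branch is hidden; with t = 1 it is fully grown and
// starts blooming petals at its tip.
class Branch {
  constructor(start, end) {
    this.start = start; // p5.Vector
    this.end = end;     // p5.Vector
    this.petals = [];   // Petal objects generated at the tip
  }
  display(t) {
    const tipX = lerp(this.start.x, this.end.x, t);
    const tipY = lerp(this.start.y, this.end.y, t);
    stroke(80, 50, 30);
    line(this.start.x, this.start.y, tipX, tipY);
    if (t >= 1 && this.petals.length < 3) {
      this.petals.push(new Petal(this.end.x, this.end.y)); // bloom at the tip
    }
    for (const p of this.petals) {
      p.update();
      p.display();
    }
  }
}
```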

For the user test, I hadn’t applied any machine learning model to control the branches yet. Instead, I used the mouse position to prototype and observe the outcome. By mapping the x and y positions of the mouse, users can move the mouse from left to right to watch the tree and flowers grow, and change the mouse’s y position to change the color of the flowers. I also discarded the sin() movement of the petals and kept them falling straight.

[Code screenshots: Mouse Mapping (1–5)]
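
A self-contained sketch of that mouse-driven prototype, with a single trunk standing in for the full tree: map() turns mouseX into the growth parameter and mouseY into the flower hue.

```javascript
// mouseX controls how far the tree has grown (via lerp),
// mouseY controls the flower color.
function setup() {
  createCanvas(600, 400);
  colorMode(HSB, 360, 100, 100);
}

function draw() {
  background(0);
  const t = map(mouseX, 0, width, 0, 1, true); // growth progress, 0..1
  const hue = map(mouseY, 0, height, 0, 360);  // petal color
  // trunk grows from the bottom toward its full height as t goes 0 -> 1
  const tipY = lerp(height, height - 200, t);
  stroke(30, 40, 60);
  strokeWeight(6);
  line(width / 2, height, width / 2, tipY);
  if (t >= 1) { // flowers appear once the trunk is fully grown
    noStroke();
    fill(hue, 60, 100);
    ellipse(width / 2, tipY, 16);
  }
}
```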

During the user test, I received a lot of helpful feedback, including suggestions about the visual effects and the interactions. Then I started to apply KNN classification with PoseNet for interaction. At this stage, I turned the background all black so that I could see the tree’s transition more clearly. However, since the tree was static and had only one color, the visual effect was not satisfying, so I turned to Professor Moon for help. Thanks to his help, the tree turned out much prettier: the branches grow progressively thinner as the tree grows, which makes it look more realistic. The branches also have a gradient ramp of rainbow colors and a soft vibration effect (using noise()). The petals likewise have four positional stages: hidden, grow, stay, and fall. This version with KNN classification was also displayed at the IMA Show.

[Code screenshots: Tree’s New Effects (1–5)]
[Code screenshots: Different Stages of Petals (1–4)]
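
The sketch below illustrates those three branch effects (the thinning, the rainbow ramp, and the noise() vibration) on a simple recursive tree; the depth-based hue step and wobble range are illustrative guesses, not the project’s actual numbers. The petals’ four stages (hidden, grow, stay, fall) could similarly be a stage field advanced inside the petal’s update().

```javascript
// Stroke weight thins with recursion depth, hue shifts through the
// rainbow along the depth, and noise() softly vibrates each branch angle.
function setup() {
  createCanvas(600, 400);
  colorMode(HSB, 360, 100, 100);
}

function draw() {
  background(0);
  randomSeed(7); // keep the branching angles stable between frames
  translate(width / 2, height);
  branch(100, 0);
}

function branch(len, depth) {
  strokeWeight(max(8 - depth, 1));     // thinner toward the tips
  stroke((depth * 45) % 360, 80, 100); // rainbow ramp along the depth
  line(0, 0, 0, -len);
  translate(0, -len);
  if (len > 8) {
    // noise() gives a soft, smoothly changing vibration over time
    const wobble = map(noise(depth, frameCount * 0.01), 0, 1, -0.05, 0.05);
    push();
    rotate(random(0.2, 0.6) + wobble);
    branch(len * 0.7, depth + 1);
    pop();
    push();
    rotate(-random(0.2, 0.6) + wobble);
    branch(len * 0.7, depth + 1);
    pop();
  }
}
```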

Here is a video of me playing with it during the IMA Show:

The biggest problem I discovered during the IMA Show was that the KNN classification was not accurate enough at detecting different body gestures, which made the interactive experience unsatisfying. It also worked better on people wearing dark clothes than on those wearing light colors. So for the final presentation of the project, I switched to PoseNet, letting users interact through the position of their right hand. I also drew an ellipse on the canvas to indicate the position of the user’s right hand. The final result is now much more accurate.

[Code screenshots: PoseNet (1–3)]
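
A sketch of that final setup using the ml5.js PoseNet API; drawTree() is a hypothetical stand-in for the project’s actual tree code, and the confidence threshold is a guess.

```javascript
// Track the right wrist with PoseNet, draw an ellipse there as a cursor,
// and map its x position to the tree's growth parameter.
let video, poseNet;
let rightWrist = null;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) {
      rightWrist = poses[0].pose.rightWrist; // {x, y, confidence}
    }
  });
}

function draw() {
  background(0);
  if (rightWrist && rightWrist.confidence > 0.2) {
    fill(255); // cursor showing the detected hand position
    noStroke();
    ellipse(rightWrist.x, rightWrist.y, 20);
    const t = map(rightWrist.x, 0, width, 0, 1, true);
    drawTree(t); // hypothetical: grow the tree with the hand's x position
  }
}
```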

Here is the link for the project on Glitch: https://time-machine-2.glitch.me

Here is the link for the project presentation: here

Reflection:

What I Struggled With:

  • The visual effects: When I was brainstorming for this project, I thought of many visual effects to display. However, since I want to make those objects grow and also reverse the process, most of them are beyond my current ability, so I chose to display only a tree and focus on improving its visual effects.
  • The interaction with the audience: At the moment, I only managed to provide three kinds of interaction, which feels a little insufficient and boring. KNN classification would let me create more interesting gestures to trigger interactions, but its accuracy is not satisfying; PoseNet is accurate enough, but I haven’t yet come up with interactions for it that are both feasible and interesting.
  • The accuracy of the model: At first, I used KNN classification with PoseNet to detect the audience’s gestures. However, it turned out to be inaccurate during the IMA Show, so I switched to using only PoseNet to detect the right hand’s position.
  • The interpretation of the concept: Currently, the day-and-night visual effect and the tree’s growth demonstrate my concept of a time machine. However, I’m also considering displaying more objects and thinking of new ways to better illustrate the core concept.

What I Have Learned:

  • Calling a class within a class: At first, I thought I could only call a function within itself, not a class within itself. But as Professor Moon helped me build the project and create the branches of the tree, I learned how to call a class within itself, which will help not only this project but also my future learning.
  • Creating continuous movement and reversing it: I learned that the lerp() function can control the position of an object along its path from a starting point to an end point, which lets me manipulate the object’s movement in both directions (see the sketch after this list).
  • Other useful functions: createVector(), noise(), and constrain().
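
A minimal example of the second point, reversible movement driven by lerp() (plus constrain() from the third): a parameter t slides between 0 and 1, and flipping its direction literally rewinds the motion.

```javascript
// One parameter t moves between 0 and 1; lerp() places the object along
// its path, so pushing t back down plays the motion in reverse.
let t = 0;
let direction = 1; // +1 plays forward, -1 plays in reverse

function setup() {
  createCanvas(600, 200);
}

function draw() {
  background(0);
  t = constrain(t + 0.01 * direction, 0, 1);
  const x = lerp(50, width - 50, t); // position along the path
  fill(255);
  ellipse(x, height / 2, 20);
}

function mousePressed() {
  direction *= -1; // flip the direction of "time"
}
```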

How I would Develop My Project Further:

  • On visual effects: I will come up with more demonstrable objects to illustrate the concept of controlling time.
  • On interaction: Explore other models that would let me implement more interesting interactions reliably, and think of new ways of manipulating time and the objects.

Attributions:
