
Touch Me Not - Final Documentation

Touch Me Not - A Melodic Monster

In this project, I fused technology and art to craft an immersive experience. Using Neopixels for LED visuals, real-time music analysis, and user interaction, the suspended monster-like creature engages viewers at eye level. The installation, triggered by touch and sound, aims to raise awareness about mental health.

The project plan involved audio signal processing research, Arduino prototyping, 3D modeling, and a Processing script for real-time visuals. Emphasizing engagement and personalization, the design prioritizes an intuitive interface and responsive interaction for a visually appealing yet somewhat unsettling encounter.

I cut cardboard into three pieces and glued them together to form the rectangular body of the monster's mouth.

Next, I arranged my Neopixels inside the monster's body and coded them, integrating them with capacitive touch sensors for responsive interaction on contact. The coding process proved exceptionally time-consuming. The code uses the FastLED library to control 64 RGB LEDs. Three capacitive touch sensors, connected to pins 10, 6, and 4, trigger the animation when touched. The LEDs sweep smoothly back and forth, illuminating in response to touch. The code manages animation states with conditional statements, paces the animation with delay functions, and prints the status of the touch sensors to the serial monitor. Overall, it combines touch interaction and LED animation for an engaging, visually appealing experience.

 
 
I used a 3D printer to make custom ear and teeth designs. I found the designs on Thingiverse and made some edits. Then, I 3D printed them to add a striking physical element to the project.
After finishing the Arduino code, I integrated it with Processing to show how touch and the Neopixels work together. The Processing sketch communicates with the Arduino over a specified serial port, reads data from the three touch sensors, and uses it to control dynamic triangles on screen. The triangles move with random colors and respond to mouse interaction. The sketch also plays a sound file ("ghost.mp3") when the touch sensors are activated, with the volume linked to the amplitude of the analyzed audio. The triangles change color randomly, adding visual diversity to the animated elements. I then changed the background.
Afterward, I soldered the touch sensors into the ears to give them an earring-like look and function. Following that, I laser cut a box to house the entire installation. To enhance the aesthetics, I concealed the wires by wrapping them in black cloth and painted the inside of the box for a clean finish.
The project was largely successful: everything functioned well except the tooth touch sensor, which failed because of a loose wire. To enhance the piece, I could add a servo motor to make the teeth move. The Processing visuals could also be improved, either aesthetically or by introducing a game-like element.
 
Making this project was a tough but rewarding experience. I put in a lot of hard work, and the help from learning assistants and the professor was crucial. Their guidance was really valuable, helping me overcome challenges and make the project better.

 
 

Like all good things, this project too has come to an end. Here is a picture of the dismantled project and the returned items.

Thank you to my professor and the LAs for all the support and guidance.