NOC Final – Aquatic Resonate
Link: https://editor.p5js.org/King-Raphael/sketches/iZmj9qIbb
I. Project Overview
From the first experiments with Perlin noise in p5.js, I felt the allure of simulating natural phenomena in code—a harmony between randomness and structure that feels almost magical. For this final project, titled Aquatic Resonate, developed as part of Nature of Code, I set out to build an interactive underwater ecosystem: luminous jellyfish that drift and pulse rhythmically, shrimp-inspired Boids whose segmented bodies sway like real crustaceans, and a sea of tiny vehicles tracing the hidden currents of a dynamic flow field. By weaving together vector physics, spring-mass systems, and real-time hand-pose detection via ml5.js, I sought to create a living painting where every gesture or click generates ripples of color and motion, inviting viewers into a captivating dance between algorithm and organism.

II. Work Process
My journey began with an immersive exploration into the foundational chapters of The Nature of Code, focusing especially on fluid dynamics, particle systems, and agent-based modeling. Early on, I sketched detailed prototypes on paper, envisioning the behaviors of my virtual marine life. The jellyfish were imagined pulsing forward gracefully, the shrimp envisioned with intricately swaying segmented bodies, and the particles represented as drifting plankton responding dynamically to hidden currents.

Initially, I constructed a grid-based flow field using Perlin noise, carefully adjusting parameters such as noise scale, update frequency, and flow resolution to find an ideal equilibrium between dynamic activity and visual tranquility. Ensuring that the flow remained visually coherent and continuously engaging involved extensive experimentation with incremental changes in the noise generation algorithm.
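The core of this step can be sketched roughly as follows. Since p5's `noise()` isn't available outside the sketch, a hash-based stand-in is used here so the snippet is self-contained (it is not actually smooth like Perlin noise); `COLS`, `ROWS`, and `noiseScale` are illustrative values, not the exact ones from my sketch:

```javascript
// Illustrative flow-field construction: a grid of unit vectors whose angle
// is driven by a noise field. In the real sketch, pseudoNoise() is p5's
// Perlin noise(); here it is a deterministic hash stand-in in [0, 1).
const COLS = 40, ROWS = 30;
const noiseScale = 0.1;

function pseudoNoise(x, y, z) {
  const v = Math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453;
  return v - Math.floor(v);
}

// Build one unit direction vector per grid cell; t lets the field evolve.
function buildFlowField(t) {
  const field = [];
  for (let j = 0; j < ROWS; j++) {
    for (let i = 0; i < COLS; i++) {
      const angle = pseudoNoise(i * noiseScale, j * noiseScale, t) * Math.PI * 2;
      field[j * COLS + i] = { x: Math.cos(angle), y: Math.sin(angle) };
    }
  }
  return field;
}
```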

Next, I developed the Vehicle class to represent countless tiny particles traversing the generated flow field. I programmed each vehicle to follow the direction dictated by the flow field vectors and leave ephemeral luminous trails using an offscreen graphics buffer. These particles also interactively responded to mouse pointer movements, avoiding close proximity and creating an engaging user interaction.
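The steering step behind this behavior follows Reynolds's "steer = desired − velocity" rule. A minimal sketch, with plain objects standing in for p5.Vector and with illustrative `maxSpeed`/`maxForce` values:

```javascript
// Minimal Vehicle sketch: each frame it steers toward the flow-field
// direction for its grid cell, with the steering force and speed capped.
class Vehicle {
  constructor(x, y) {
    this.pos = { x, y };
    this.vel = { x: 0, y: 0 };
    this.maxSpeed = 2;   // illustrative cap on speed
    this.maxForce = 0.1; // illustrative cap on steering force
  }

  // dir: the unit vector the flow field supplies for this cell.
  follow(dir) {
    const desired = { x: dir.x * this.maxSpeed, y: dir.y * this.maxSpeed };
    // Reynolds steering: steer = desired - velocity, clipped to maxForce.
    let sx = desired.x - this.vel.x, sy = desired.y - this.vel.y;
    const m = Math.hypot(sx, sy);
    if (m > this.maxForce) { sx = (sx / m) * this.maxForce; sy = (sy / m) * this.maxForce; }
    this.vel.x += sx; this.vel.y += sy;
    // Clamp speed, then integrate position.
    const sp = Math.hypot(this.vel.x, this.vel.y);
    if (sp > this.maxSpeed) { this.vel.x *= this.maxSpeed / sp; this.vel.y *= this.maxSpeed / sp; }
    this.pos.x += this.vel.x; this.pos.y += this.vel.y;
  }
}
```

In the actual sketch each vehicle also deposits its position onto the offscreen trail buffer and flees the mouse pointer when it gets too close.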

Concurrently, the Jelly class took shape, embodying the gentle, rhythmic movements of real jellyfish. Each jellyfish instance was defined by a randomized base radius, mass, and hue, to ensure visual diversity. I implemented an algorithm that triggered movement using only the positive half of a sine wave, creating a pulsing motion reminiscent of natural jellyfish propulsion. Iterative fine-tuning of drag coefficients, pulse strength, and animation timing was crucial in achieving a buoyant and realistic swimming behavior. This part actually took me the longest to finish, since I spent a lot of time drawing the jellyfish and making sure it was beautiful to look at.
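The half-sine pulse idea can be sketched like this; the drag coefficient, pulse strength, and the simplified one-axis motion are illustrative stand-ins for the tuned values in my sketch:

```javascript
// Thrust only during the positive half of the sine cycle: the jelly
// contracts and pushes, then glides with zero thrust while drag slows it.
function pulseThrust(t, strength = 0.5) {
  return Math.max(0, Math.sin(t)) * strength; // zero on the negative half
}

// One simplified integration step along the jelly's swim axis.
function stepJelly(jelly, t, dt = 1) {
  const drag = 0.05;                        // illustrative drag coefficient
  jelly.vel += pulseThrust(t) / jelly.mass; // heavier jellies accelerate less
  jelly.vel *= 1 - drag;                    // drag dominates the glide phase
  jelly.y -= jelly.vel * dt;                // drift upward while pulsing
}
```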

The ShrimpBoid class was crafted next, using a more complex physical simulation involving a chain of interconnected spring-ball systems. By carefully varying the stiffness and damping properties of the springs along the shrimp's body, I was able to replicate lifelike flexibility and subtlety in motion. Integration of Reynolds's flocking rules—alignment, cohesion, and separation—allowed these virtual shrimp to swim harmoniously yet individually, forming fluid schools.
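The spring-chain body can be sketched as below, with the flocking rules omitted for brevity. The segment count, rest length, damping factor, and the head-to-tail stiffness falloff are all illustrative values, not the ones I settled on:

```javascript
// Spring-chain body: each segment is pulled toward a rest distance from
// its predecessor (Hooke's law along the segment axis), with stiffness
// falling off toward the tail and damping applied to every segment.
const REST = 8, SEGMENTS = 6, DAMPING = 0.9;

function makeBody(x, y) {
  return Array.from({ length: SEGMENTS }, (_, i) => ({
    x: x - i * REST, y, vx: 0, vy: 0,
    k: 0.4 * (1 - i / SEGMENTS), // stiffer near the head, looser at the tail
  }));
}

function updateBody(body) {
  for (let i = 1; i < body.length; i++) {
    const a = body[i - 1], b = body[i];
    const dx = a.x - b.x, dy = a.y - b.y;
    const d = Math.hypot(dx, dy) || 1;
    const f = b.k * (d - REST);        // spring force from the extension
    b.vx += (dx / d) * f; b.vy += (dy / d) * f;
    b.vx *= DAMPING; b.vy *= DAMPING;  // keep oscillations from blowing up
    b.x += b.vx; b.y += b.vy;
  }
}
```

In the full class, the head segment is the boid: it receives the alignment, cohesion, and separation forces, and the chain above makes the rest of the body trail behind it.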

The code shown in the picture above was also a struggle for me. Because a shrimp can cross the canvas border while its body behaves like a spring, its segments would sometimes be stretched into a long line spanning the whole canvas, which was very distracting for the whole experience. After I talked to Professor Moon, he suggested counting the shrimp's body segments and drawing the shrimp only when every segment is inside the canvas, and that finally solved the problem.
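The fix Professor Moon suggested boils down to a visibility check like this (segment objects with `x`/`y` fields and the canvas bounds as parameters are assumptions about how my sketch stores the body):

```javascript
// Draw the shrimp only when every body segment is inside the canvas, so a
// spring stretched across a wrap-around never renders as a line.
function allSegmentsInside(body, width, height) {
  return body.every(p => p.x >= 0 && p.x <= width && p.y >= 0 && p.y <= height);
}
```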

To elevate the project’s interactivity, I incorporated ml5.handPose(), harnessing the potential of hand-tracking technology to translate real-world gestures into virtual ocean disturbances. Mapping fingertip and wrist coordinates onto the canvas enabled natural user interactions such as creating expanding ripples, which influenced nearby particles, and generating drifting bubbles upon specific gesture cues. This aspect of the project required meticulous calibration and smoothing algorithms to ensure responsiveness without jitter or latency.
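The coordinate mapping step looks roughly like this. The video and canvas dimensions are illustrative, and the keypoint is assumed to be an object with `x`/`y` in video pixels, as ml5's handPose results provide; the horizontal flip makes the on-screen motion mirror the user's hand:

```javascript
// Map a detected hand keypoint from video coordinates to canvas
// coordinates, mirrored horizontally so it behaves like a mirror.
function mapKeypoint(kp, videoW, videoH, canvasW, canvasH) {
  return {
    x: (1 - kp.x / videoW) * canvasW, // mirror for the webcam view
    y: (kp.y / videoH) * canvasH,
  };
}
```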

Throughout the development process, performance optimization remained a critical priority. By strategically managing object counts (e.g., limiting vehicles to 1,500, jellyfish to six, and shrimp Boids to ten) and implementing efficient rendering techniques, such as utilizing offscreen buffers, I maintained consistently smooth frame rates even under heavy computational loads.

III. Technical Challenges & Key Insights
Balancing numerical stability with visual liveliness emerged as a significant challenge. Frequent recalculations of the flow field introduced unwanted jitter, whereas infrequent updates risked stagnation. Through trial and error, I discovered a suitable balance by recalculating every FLOW_STEP frames, providing gentle and continuous flow changes.
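The throttling pattern itself is simple; the `FLOW_STEP` value here is illustrative, not the one I tuned to:

```javascript
// Rebuild the noise field only every FLOW_STEP frames, so the flow drifts
// gently instead of jittering every frame or freezing entirely.
const FLOW_STEP = 20; // illustrative update interval
let updates = 0;

function maybeUpdateField(frameCount) {
  if (frameCount % FLOW_STEP === 0) {
    updates++; // in the sketch, this is where the flow field is rebuilt
    return true;
  }
  return false;
}
```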
The jellyfish’s pulsing algorithm required precision. Employing an envelope detection mechanism that only activated thrust at specific points in the sine wave cycle effectively prevented unnatural overlapping pulses. For shrimp Boids, varying spring stiffness progressively from head to tail created realistic flexibility, while strategic damping prevented oscillations from becoming erratic.

Integrating hand-pose detection posed unique technical hurdles, primarily related to responsiveness and smoothness. Implementing noise filtering and threshold controls significantly improved interaction fluidity, enabling users to effortlessly generate precise ripples and bubbles that harmoniously interacted with the ecosystem’s elements.
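The filtering I describe amounts to an exponential moving average plus a movement threshold; `ALPHA` and `THRESHOLD` are illustrative values, and the closure-based API is just one way to keep the smoothing state:

```javascript
// Jitter filtering for the tracked hand point: exponential smoothing of the
// raw coordinates, plus a threshold so sub-pixel noise doesn't spawn ripples.
const ALPHA = 0.3, THRESHOLD = 2; // illustrative smoothing and dead-zone

function makeSmoother() {
  let sx = null, sy = null;
  return function smooth(x, y) {
    if (sx === null) { sx = x; sy = y; } // seed with the first sample
    sx += (x - sx) * ALPHA; // exponential moving average per axis
    sy += (y - sy) * ALPHA;
    // Only report movement when the raw point has pulled away from the
    // smoothed point by more than the threshold.
    const moved = Math.hypot(x - sx, y - sy) > THRESHOLD;
    return { x: sx, y: sy, moved };
  };
}
```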

IV. Aesthetic Implementation
To evoke a vivid underwater dreamscape, careful attention was given to color dynamics, transparency effects, and blending modes. Using the HSB color space, jellyfish radiated gentle purples and pinks, contrasted by particles in complementary blues and greens. Shrimp Boids transitioned elegantly from magenta hues at their heads to subtle teal gradients towards their tails, with brightness and opacity tied to their velocity.
By drawing particle trails onto a separate graphics buffer and employing additive blending (blendMode(ADD)), each movement left behind glowing afterimages, enhancing the sense of depth and dynamism. Jellyfish and shrimp bodies were articulated with curveVertex and modulated using Perlin-noise offsets, achieving fluid, lifelike motion.
Interactive elements such as pointer-induced ripples expanded in softly glowing concentric circles, while bubbles generated by hand gestures ascended gracefully, gradually dissipating to maintain the ethereal aesthetic.

V. Reflections & Lessons Learned
This project underscored the profound impact of emergent behaviors arising from simple rules and parameters. Even subtle adjustments in numerical settings could dramatically alter the ecosystem’s visual and behavioral authenticity. Managing computational resources effectively—through offscreen rendering and object count optimization—proved essential in preserving smooth performance.
Integrating gesture detection via ml5.handPose opened exciting avenues for natural and intuitive interaction, while simultaneously highlighting challenges in responsiveness and stability. Ultimately, this experience enhanced my appreciation for the nuanced artistry inherent in creative coding, reinforcing the delicate interplay between technical precision and aesthetic sensibility.

VI. Future Outlook
Moving forward, I envision enriching Aquatic Resonate with interactive audio elements, transforming visual movements into dynamic soundscapes for an immersive multisensory experience. Implementing user-configurable interfaces through tools like Tweakpane could further encourage exploration and personalization. Additionally, integrating AI-driven behaviors to guide the movements and interactions of ecosystem elements promises exciting possibilities for deeper engagement. Ultimately, transitioning this simulation into immersive installations or VR environments could profoundly expand its experiential impact, inviting participants into an interactive, digital aquatic universe.


Finally, I would like to thank Professor Moon a lot! He really helped me conquer many difficult problems and gave me inspiration throughout the process of making this project.
