IALC P5 Live Tutorial

I. Setup Guide

You can start by looking at the menu, which has four main parts: P5 Live includes the most important functions, such as settings and references; Cocoding allows you to collaborate with others (similar to Flok), and you can even share the rendered output (the screen) by running your code so that others see the same motion; Sketches lets you create new files and folders and upload or export sketches; Recording lets you record the whole live coding process.

Menu 1
Menu 2

II. First Example: “Hello World” in P5 Live – Audio Analysis

/*	
	_audio_analysis // cc teddavis.org 2019-24
	
	revamp of P5LIVE's Audio snippet, now built-in the background!
	add snippet to other sketches: CTRL + SHIFT + A 
*/

function setup() {
	createCanvas(windowWidth, windowHeight)

	// audio stuff now behind the scenes, 'true' makes class vars global
	setupAudio(true) // if empty, use 'a5.' before audio vars below
	// a5.ease = .075 // customize ease speed
}

function draw() {
	/* audio vars: amp, ampL, ampR, ampEase, fft, fftEase, waveform, waveformEase */
	updateAudio()

	background(0)
	noFill()
	stroke(255)
	textAlign(CENTER, CENTER)

	/* average */
	text("MIX", width * .5, height / 4)
	ellipse(width / 2, height / 4, amp)
	text("L", width * .25, height / 4)
	ellipse(width * .25, height / 4, ampL)
	text("R", width * .75, height / 4)
	ellipse(width * .75, height / 4, ampR)

	/* waveformEase */
	beginShape()
	for(let i = 0; i < waveformEase.length; i++) {
		let freq = waveformEase[i] * height / 4 // (-1, 1)
		let x = map(i, 0, waveformEase.length, 0, width)
		curveVertex(x, height * .5 + freq)
	}
	endShape()

	/* fft */
	for(let i = 0; i < fft.length; i++) {
		let freq = fft[i]; // (0, 255)
		let x = map(i, 0, fft.length, 0, width)
		let w = width / fft.length
		rect(x, height * .8, w, -freq)
	}

	/* fftEase */
	for(let i = 0; i < fftEase.length; i++) {
		let freq = fftEase[i]; // (0, 255)
		let x = map(i, 0, fftEase.length, 0, width)
		let w = width / fftEase.length
		rect(x, height * .805, w, freq)
	}
}

What This Code Does:

  • Audio Initialization: Uses setupAudio(true) to initialize and make audio variables globally accessible.

  • Audio Update: Every frame calls updateAudio() to update variables such as amplitude (amp), left/right channels (ampL, ampR), waveform, and FFT analysis arrays.

  • Visual Display:

    • It draws text labels (“MIX”, “L”, “R”) and circles whose sizes reflect the average amplitude.

    • The waveform is drawn with smooth curves using curveVertex().

    • FFT bars are drawn to display the frequency spectrum.

III. Intermediate Example: Interactive Audio-Visual Sketch

Once you’re comfortable with the basic audio analysis example, you might want to make something more ambitious. For instance, you can combine text point matrices, dynamic backgrounds, and interactive audio responses. See the intermediate example below:

P5 Live Tutorial 1
P5 Live Tutorial 2
P5 Live Tutorial 3

Below is a detailed explanation of the intermediate example’s features. This example combines multiple creative and interactive elements that work together to produce a dynamic audio-visual experience:


1. Dynamic Gradient Background

  • Purpose:
    The gradient background sets a continuously evolving visual stage that adds depth and ambiance to the sketch. It provides a constantly shifting color palette that subtly changes over time. If you are interested, you could also explore tying the rate of change to the frequency content of the sound.

  • How It Works:

    • A dedicated function (commonly called drawGradient()) uses p5.js’s HSB color mode and linear interpolation between two colors (a minimal sketch follows this list).

    • The hue value is incremented slightly each frame, causing the colors to smoothly transition.

    • For each horizontal line of pixels on the canvas, the code calculates an interpolated color and draws a line across the canvas. This creates a smooth gradient effect that covers the entire background.

  • Visual Impact:
    The gradient not only adds an aesthetic touch but also serves as a dynamic backdrop that contrasts with the other moving elements in the sketch.
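If you want a concrete starting point, here is a minimal sketch of this gradient approach, assuming HSB color mode and a global hue value that drifts each frame; the function name drawGradient() and the two anchor colors are illustrative rather than taken from the original sketch:

let baseHue = 0 // assumed global hue that drifts a little each frame

function drawGradient() {
	colorMode(HSB, 360, 100, 100)
	baseHue = (baseHue + 0.1) % 360 // slow hue drift over time
	let c1 = color(baseHue, 80, 30)
	let c2 = color((baseHue + 60) % 360, 80, 70)
	for(let y = 0; y < height; y++) {
		// interpolate between the two colors for each horizontal line
		stroke(lerpColor(c1, c2, y / height))
		line(0, y, width, y)
	}
}

Calling drawGradient() at the top of draw() repaints the whole canvas, so it can stand in for background().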


2. Rotating Rectangle with Ghost (Trail) Effect

  • Purpose:
    The rotating rectangle acts as an audio-reactive visual element that responds to incoming sound levels. The ghost or trail effect gives the impression of motion continuity, as past positions linger briefly before fading out.

  • How It Works:

    • A separate graphics layer (a p5.Graphics object created with createGraphics()) is dedicated solely to drawing the rectangle (see the sketch after this list).

    • Each frame, before drawing the current state of the rectangle, the layer is overlaid with a semi-transparent black rectangle. This gradually dims previous drawings, resulting in a trailing “ghost” effect.

    • The rectangle is drawn at the center of the canvas, rotated based on the frame count. The rotation angle increments gradually, causing the rectangle to spin over time.

    • Its stroke weight and size are modulated by a smoothed audio amplitude value (retrieved from an audio meter). This links the visual change to the volume of the audio input, so louder sounds cause a bolder or larger rectangle.

  • Visual Impact:
    The ghosting of the rotating rectangle creates a sense of motion history, making the visual element feel more fluid and responsive. The intertwining of rotation with audio-driven changes adds an engaging rhythm and energy to the composition.
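Here is a minimal sketch of the trail technique, assuming P5LIVE’s built-in audio snippet (setupAudio(), updateAudio(), ampEase) from the first example; the layer name pg, the fade alpha of 20, and the amplitude scaling are assumptions:

let pg // offscreen layer that keeps the fading trail

function setup() {
	createCanvas(windowWidth, windowHeight)
	setupAudio(true)
	pg = createGraphics(width, height)
	pg.rectMode(CENTER)
}

function draw() {
	updateAudio()
	background(0)

	// dim previous frames instead of clearing, so old positions fade out slowly
	pg.noStroke()
	pg.fill(0, 20)
	pg.rect(pg.width / 2, pg.height / 2, pg.width, pg.height)

	// rectangle spins with frameCount; size and stroke follow the eased amplitude
	pg.push()
	pg.translate(pg.width / 2, pg.height / 2)
	pg.rotate(frameCount * 0.02)
	pg.noFill()
	pg.stroke(255)
	pg.strokeWeight(1 + ampEase * 0.05) // scaling factor is a guess
	pg.rect(0, 0, 100 + ampEase, 100 + ampEase)
	pg.pop()

	image(pg, 0, 0)
}

Drawing into pg and only dimming it (rather than clearing it) is what leaves the ghost trail; the main canvas is still cleared with background(0) every frame.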


3. Audio Analysis and Reaction

  • Purpose:
    The sketch uses audio data to create interactivity in its visuals. Sound levels guide the behavior of the rotating rectangle and can influence other animated components.

  • How It Works:

    • Tone.js (or a similar audio library) is used to capture real-time audio data. Functions like updateAudio() read in values such as the overall amplitude, channel-specific amplitudes, waveform data, and FFT (Fast Fourier Transform) data.

    • This audio information is mapped and smoothed (using linear interpolation) to produce variables like amp and ampEase. These variables control the size and stroke weight of the rotating rectangle (a minimal smoothing sketch follows this list).

    • The direct connection between audio amplitude and visual parameters establishes a clear, intuitive link between sound and motion. For example, as the audio becomes louder, the rectangle may enlarge, making the visual response immediately perceptible.

  • Visual Impact:
    Real-time audio analysis turns the sketch into an interactive performance, where changes in sound directly translate to visual transformations. This creates an engaging, multisensory experience.
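Here is a minimal illustration of that mapping and smoothing, again assuming the built-in audio snippet’s global amp; the variable smoothAmp and the ease factor of 0.1 are illustrative:

let smoothAmp = 0 // eased copy of the raw amplitude

function setup() {
	createCanvas(windowWidth, windowHeight)
	setupAudio(true)
}

function draw() {
	updateAudio()
	background(0)

	// ease toward the current amplitude so the motion feels less jittery
	smoothAmp = lerp(smoothAmp, amp, 0.1)

	// louder sound -> bolder stroke and a bigger circle
	noFill()
	stroke(255)
	strokeWeight(1 + smoothAmp * 0.05)
	ellipse(width / 2, height / 2, smoothAmp * 2)
}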


4. Text Point Matrix (Interactive Sound Triggers)

  • Purpose:
    The text point matrix functions as both a visual and interactive element. It displays text (e.g., “P5LIVE”) as a series of points that can react to audio and user interaction.

  • How It Works:

    • A font is loaded and processed with the textToPoints() function to convert the text into an array of coordinate points. This array is stored and later used to draw individual points on the canvas (a sketch of this approach follows the list).

    • Each point is enhanced with extra properties—such as color, pulse (for a “bouncing” effect), and a flag to indicate whether it’s active or “playing.”

    • The points are animated with slight positional offsets (using a noise function) to simulate a gentle jitter. This gives the effect of organic, living text.

    • A vertical “playhead” (a moving line) sweeps across the canvas. When the playhead comes near an active point, that point triggers an audio sample (synthesized sound) and briefly enlarges or pulses.

    • The activation can be randomized initially or toggled via mouse interaction, allowing for direct user control over which points produce sound.

  • Visual Impact:
    The text point matrix offers a visually engaging method to display text. Its interactive nature—where sound is triggered by the playhead’s contact—blends typography with live performance dynamics. Additionally, connecting active points with smooth curves creates a network of lines, adding another layer of visual intrigue.
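Here is a minimal sketch of the point matrix and playhead, leaving out the actual sound triggering; the font path is the one mentioned in the Tips section below, and the point properties (playing, pulse), thresholds, and colors are illustrative:

let font, pts = []
let playheadX = 0

function preload() {
	font = loadFont('includes/demos-data/fonts/RobotoMono-Regular.otf')
}

function setup() {
	createCanvas(windowWidth, windowHeight)
	// convert the text into an array of {x, y} points
	pts = font.textToPoints('P5LIVE', 100, height / 2, 200, {sampleFactor: 0.1})
	// give each point extra properties: an active flag and a pulse size
	pts = pts.map(p => ({...p, playing: random() < 0.3, pulse: 0}))
}

function draw() {
	background(0)

	// vertical playhead sweeping across the canvas
	playheadX = (playheadX + 2) % width
	stroke(255, 80)
	line(playheadX, 0, playheadX, height)

	noStroke()
	for(let p of pts) {
		// gentle noise-based jitter so the text feels alive
		let jx = p.x + (noise(p.x, frameCount * 0.01) - 0.5) * 4
		let jy = p.y + (noise(p.y, frameCount * 0.01) - 0.5) * 4

		// when the playhead passes an active point, let it pulse
		// (this is where a sound sample or synth note would be triggered)
		if(p.playing && abs(jx - playheadX) < 3) p.pulse = 10
		p.pulse = max(0, p.pulse - 0.5)

		fill(p.playing ? color(0, 255, 180) : 120)
		ellipse(jx, jy, 4 + p.pulse)
	}
}

function mousePressed() {
	// toggle the point nearest to the mouse on/off
	let nearest = pts.reduce((a, b) =>
		dist(mouseX, mouseY, a.x, a.y) < dist(mouseX, mouseY, b.x, b.y) ? a : b)
	nearest.playing = !nearest.playing
}

Clicking near a point toggles it, mirroring the mouse interaction described above.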


5. Connecting Lines Between Activated Points

  • Purpose:
    The connecting lines emphasize the interaction between individual text points, visually grouping those that are active at any given moment.

  • How It Works:

    • Activated points (those that are “playing”) are filtered from the text point array.

    • A sampling technique (e.g., selecting every second point) reduces the number of points used to draw the connecting curve, avoiding visual clutter.

    • The beginShape(), curveVertex(), and endShape() functions are then used to draw a smooth, continuous curve that passes through the selected points. This is a commonly used approach in p5.js (see the sketch after this list).

  • Visual Impact:
    This feature creates an abstract network or web-like structure that evolves with the audio and interactive events. It enhances the overall visual complexity and gives the impression of an interconnected dynamic system.
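Here is a minimal sketch of the connecting curve, assuming a pts array with a playing flag like the one in the previous sketch; sampling every second point matches the idea described above:

function drawConnections(pts) {
	// keep only the points that are currently "playing"
	let active = pts.filter(p => p.playing)

	// sample every second point to avoid visual clutter
	let sampled = active.filter((p, i) => i % 2 === 0)
	if(sampled.length < 3) return // a curve needs a few points

	noFill()
	stroke(255, 120)
	beginShape()
	for(let p of sampled) {
		curveVertex(p.x, p.y)
	}
	endShape()
}

Calling drawConnections(pts) at the end of draw() layers this web of lines over the points.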


Each of these features is carefully interwoven to produce an interactive, audio-responsive sketch. The fluid interplay between the shifting background, the reactive rotating rectangle, and the dynamic text point matrix results in a rich, multisensory performance that is both visually captivating and technologically intriguing.

IV. Tips & Troubleshooting

  • Audio Initialization:
    Tone.js (or your audio analysis library) might require a user interaction to start the audio context. If no sound or audio data is detected, try clicking the canvas first (a minimal pattern is sketched after these tips).

  • Asset Paths:
    Ensure the file paths for fonts and audio samples (e.g., "includes/demos-data/fonts/RobotoMono-Regular.otf") are correct relative to your project structure.

  • Library Loading Errors:
    Check your browser’s console for errors. A missing script or resource will be flagged here.

  • Performance:
    If performance slows down, reduce the sampling density or simplify some visual elements, especially if running on lower-end devices.

  • Debugging:
    Use console.log(variable) statements to inspect values of audio variables like amp or waveform for troubleshooting.
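Two small patterns cover the first and the last tip; they assume Tone.js is loaded for the audio context and the built-in audio snippet for amp:

function setup() {
	createCanvas(windowWidth, windowHeight)
	setupAudio(true)
}

function mousePressed() {
	// browsers only start the audio context after a user gesture
	Tone.start()
}

function draw() {
	updateAudio()
	console.log(amp) // inspect the amplitude values in the browser console
	background(0)
	ellipse(width / 2, height / 2, amp)
}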

V. Resources

VI. Personal Reflection

Working on these projects has been incredibly inspiring. The first “Hello World” example using real-time audio analysis opened my eyes to how code can transform sound into visuals, making even a simple sketch feel alive. As I experimented further, I realized that creative coding is a wonderful blend of art and technology—each tweak in the code creates a new dynamic experience. I could also draw on what I learned in CCLab and Nature of Code, since they are all p5.js based. I hope this tutorial sparks your creativity and helps you start your own journey with P5 Live and interactive audio!

