Orchestrate! – Crystal Lin – Rodolfo Cossovich

CONCEPTION AND DESIGN

Our final project, Orchestrate!, underwent significant transformations from the planning stage to its completion. At its core, my partner, Emma, and I aimed to enable users to translate music into visual art. We deliberated extensively, exploring various concepts ranging from GarageBand’s collaborative audio creation to interactive drawing manipulations. Ultimately, we settled on a vision where the frequency and amplitude of music would directly influence the size and color of the shapes users draw.

Initially, we envisioned users mimicking instrument-playing actions to generate music and visuals, like hitting a gavel for drums or blowing a pinwheel for brass instruments. However, user testing revealed a preference for a more artistic presentation, leading us to prioritize simplicity and elegance over complex wiring.

As a solution, we adopted infrared distance sensors, allowing users to intuitively move a piece up and down to control both their drawing gestures and the music’s volume. Additionally, feedback suggested that cohesive audio recordings would enhance the experience. Faced with the challenge of finding individual recordings that harmonized well, we opted to integrate a complete orchestra score from MuseScore. Each user interaction now triggers a specific instrument part, transforming the experience into a virtual orchestra conducting session.
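
To illustrate how that mapping could work in software, here is a minimal, explanatory sketch (not our final code) in Processing with the Minim library we describe later: a distance reading is mapped onto a track’s playback gain so the stand’s position changes the part’s volume. The 0–1023 sensor range, the audio file name, and the mapping direction are assumptions made only for this example.

// Illustrative sketch only: map a distance reading onto one track's volume.
import ddf.minim.*;

Minim minim;
AudioPlayer track;
float sensorValue = 0; // in the real project this is updated from serial

void setup() {
  size(400, 400);
  minim = new Minim(this);
  track = minim.loadFile("song.mp3", 2048); // placeholder file name
  track.loop();
}

void draw() {
  background(255);
  // Map the reading to Minim's gain in decibels; the direction of the mapping
  // depends on how the sensor responds to the stand being lifted.
  float gain = map(sensorValue, 0, 1023, -40, 0);
  track.setGain(gain);
}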

In essence, Orchestrate! evolved into an immersive experience where users become “conductors”, orchestrating each instrument to compose their visual masterpiece.

FABRICATION AND PRODUCTION

As mentioned earlier, Emma and I aimed to create an experience where users could transform music into visual art.

Initially, our project consisted of six components: drums, guitar, bass, keyboard, microphone, and brass instruments. Each instrument was associated with a corresponding object: a gavel, rice in a toy capsule, dried beans in a can, elevator buttons, a megaphone cone, and a pinwheel. Additionally, we planned for each instrument to produce a distinct shape and color: red circles for drums, orange waves for the acoustic guitar, yellow triangles for bass, green spirals for the electric guitar, blue ellipses for the microphone, purple quadrilaterals for brass instruments, and rainbow dotted lines for the keyboard.

However, we realized that the microphone would pick up too much background noise to be usable, so we removed it from the lineup.

We first focused on the shapes and their interaction with the music, with frequency dictating color and amplitude determining size. Emma tackled the circles, quadrilaterals, and triangles, while I handled the spirals, waves, and dotted lines.

Code for the drums:

void drawDrumEffect (float amplitudeDrum) {
  for (int iDrum = 0; iDrum < numCircles; iDrum++) {
    float xDrum = random(width);
    float yDrum = random(height);
    //float amplitudeDrum = player.mix.level();  // Get amplitude
    float circleSize = map(amplitudeDrum, 0, 1, 0, 200);  // Map amplitude to circle size

    float drumFreq = map(drum.left.get(iDrum % drum.left.size()), -1, 1, 20, 20000); // Map a raw sample (-1 to 1) onto the audible range as a rough frequency estimate

    float redDrum = map(drumFreq, 20, 20000, 0, 255); // Map frequency to color
    float greenDrum = 0;
    float blueDrum = 0;

    fill(redDrum, greenDrum, blueDrum);
    ellipse(xDrum, yDrum, circleSize, circleSize);
  }
}

Code for bass:

void drawBassEffect(float amplitudeBass) {
  for (int iBass = 0; iBass < numTriangles; iBass++) {
    float xBass1 = random(width);
    float yBass1 = random(height);
    float xBass2 = xBass1 + amplitudeBass*100;
    float yBass2 = yBass1 + amplitudeBass*100;
    float xBass3 = xBass1 - amplitudeBass*100; // offset the third vertex the other way so the triangle is not degenerate
    float yBass3 = yBass1 + amplitudeBass*100;

    float bassFreq = map(bass.left.get(iBass % bass.left.size()), -1, 1, 20, 20000); // Map a raw sample (-1 to 1) onto the audible range as a rough frequency estimate

    float redBass = 0;
    float greenBass = map(bassFreq, 20, 20000, 0, 255); // Map frequency to color
    float blueBass = 0;
    
    fill(redBass, greenBass, blueBass);

    triangle(xBass1, yBass1, xBass2, yBass2, xBass3, yBass3);
  }
}

Code for brass:

void drawBrassEffect(float amplitudeBrass) {
  for (int iBrass = 0; iBrass < numQuad; iBrass++) {
    float xBrass1 = random(width);
    float yBrass1 = random(height);
    float xBrass2 = xBrass1 + random(-50, 50); // Adjust the coordinates of the other vertices
    float yBrass2 = yBrass1 + random(-50, 50); // Adjust the coordinates of the other vertices
    float xBrass3 = xBrass2 + random(-50, 50); // Adjust the coordinates of the other vertices
    float yBrass3 = yBrass2 + random(-50, 50); // Adjust the coordinates of the other vertices
    float xBrass4 = xBrass1 + random(-50, 50); // Adjust the coordinates of the other vertices
    float yBrass4 = yBrass1 + random(-50, 50); // Adjust the coordinates of the other vertices

    float brassFreq = map(brass.left.get(iBrass % brass.left.size()), -1, 1, 20, 20000); // Map a raw sample (-1 to 1) onto the audible range as a rough frequency estimate

    float redBrass = 0;
    float greenBrass = 0;
    float blueBrass = map(brassFreq, 20, 20000, 50, 255); // Map frequency to color
    
    fill(redBrass, greenBrass, blueBrass);

   quad(xBrass1, yBrass1, xBrass2, yBrass2, xBrass3, yBrass3, xBrass4, yBrass4); // Draw quadrilateral
  }
}

Code for piano:

void drawLines(float Pianoamplitude) {
  // Set line attributes
  float PianodotSize = map(Pianoamplitude, 0, 1, 1, 50);
  stroke(random(255), random(255), random(255));
  strokeWeight(PianodotSize); // Adjust the multiplier to make the lines thicker

  // Random coordinates for starting point
  float Pianox1 = random(width);
  float Pianoy1 = random(height);
  
  // Random direction for ending point
  float Pianoangle = random(TWO_PI); // Random angle in radians
  float PianosegmentLength = map(Pianoamplitude, 0, 1, 10, 100); // Adjust the range (10, 100) for longer lines
  float Pianox2 = Pianox1 + cos(Pianoangle) * PianosegmentLength; // Calculate x-coordinate based on angle and segment length
  float Pianoy2 = Pianoy1 + sin(Pianoangle) * PianosegmentLength; // Calculate y-coordinate based on angle and segment length
  
  // Draw the line with larger gaps between dots
  float Pianogap = 20; // Adjust gap between dots
  float Pianodistance = dist(Pianox1, Pianoy1, Pianox2, Pianoy2);
  float Pianodx = (Pianox2 - Pianox1) / Pianodistance;
  float Pianody = (Pianoy2 - Pianoy1) / Pianodistance;
  
  // Decrease the number of lines drawn
  int PianonumLines = 10; // Adjust the number of lines
  float PianocurrentLength = 0;
  for (int i = 0; i < PianonumLines && PianocurrentLength < Pianodistance; i++) {
    float PianoxEnd = Pianox1 + Pianodx * min(Pianogap, Pianodistance - PianocurrentLength);
    float PianoyEnd = Pianoy1 + Pianody * min(Pianogap, Pianodistance - PianocurrentLength);
    line(Pianox1, Pianoy1, PianoxEnd, PianoyEnd);
    Pianox1 = PianoxEnd + Pianodx * PianodotSize;
    Pianoy1 = PianoyEnd + Pianody * PianodotSize;
   PianocurrentLength += Pianogap + PianodotSize;
  }
}

Code for electric guitar:

void drawSpiral(float Electricamplitude) {
  // If drawNewSpiral is true, draw a new spiral
  if (drawNewSpiral) {
    // Analyze the song

    float Electricfrequency = electric.left.level(); // Left-channel level of the electric part (rough stand-in for frequency)
    // Draw spirals based on amplitude
    float angleStep = map(Electricamplitude, 0, 1, 0.01, 0.1); // Adjust angle step based on amplitude
    electricRadius += 0.1; // Increment radius

    float x = electricRadius * cos(electricAngle);
    float y = electricRadius * sin(electricAngle);

    // Ensure the spiral stays within the screen boundaries
    if (abs(x) > width/2 || abs(y) > height/2) {
      // If the spiral goes beyond the screen, stop drawing
      drawNewSpiral = false;
    }

    // Smooth the frequency
    float smoothing = 0.9; // Smoothing factor
    float smoothedFrequency = 0;
    smoothedFrequency = smoothing * smoothedFrequency + (1 - smoothing) * Electricfrequency;

    strokeWeight(4); // Adjust the thickness of the drawing
    stroke(spiralColor); // Set the color for the entire spiral

    // Add the current point to the points list
    points.add(new PVector(x + center.x, y + center.y)); // Add the offset

    // Draw the spiral
    for (int i = 1; i < points.size(); i++) {
      PVector prev = points.get(i-1);
      PVector current = points.get(i);
      line(prev.x, prev.y, current.x, current.y);
    }
    // Update angle for next frame
    electricAngle += angleStep;
  }
}

Code for acoustic guitar:

void AcousticanalyzeAmplitude() {
  if (!wavesPaused) {
    // Average the absolute sample values of both channels to estimate amplitude
    float[] AcousticleftChannel = acoustic.left.toArray();
    float[] AcousticrightChannel = acoustic.right.toArray();
    float AcoustictotalAmplitude = 0;
    for (int i = 0; i < AcousticleftChannel.length; i++) {
      AcoustictotalAmplitude += abs((AcousticleftChannel[i] + AcousticrightChannel[i]) / 2);
    }
    AcoustictotalAmplitude /= AcousticleftChannel.length;
    mappedAmplitude = map(AcoustictotalAmplitude, 0, 1, 20, 200);
    // Move each wave's leading point and keep it on screen
    for (int i = 0; i < circleXList.size(); i++) {
      float circleX = circleXList.get(i), circleY = circleYList.get(i);
      float directionX = directionXList.get(i), directionY = directionYList.get(i);
      circleX += directionX * 2;
      circleY += directionY * sin(frameCount * 0.1) * mappedAmplitude;
      circleX = constrain(circleX, 0, width);
      circleY = constrain(circleY, 0, height);
      circleXList.set(i, circleX);
      circleYList.set(i, circleY);
    }
  }
}

void drawWaves() {
  if (!wavesPaused) {
    for (int i = 0; i < pathList.size(); i++) {
      ArrayList<PVector> path = pathList.get(i);
      int waveColor = waveColorList.get(i);
      float circleX = circleXList.get(i), circleY = circleYList.get(i);
      path.add(new PVector(circleX, circleY));
      noFill();
      stroke(waveColor);
      strokeWeight(2);
      beginShape();
      for (int j = 0; j < path.size(); j++) {
        PVector p = path.get(j);
        float xAcoustic = p.x;
        float yAcoustic = p.y + sin(frameCount * 0.1 + j * 0.1) * 50;
        curveVertex(xAcoustic, yAcoustic);
      }
      endShape();
    }
  }
}


void createWave(float startX) {
  float initialCircleY = random(height), initialDirectionX = startX == 0 ? 1 : -1, initialDirectionY = random(-1, 1);
  circleXList.add(startX);
  circleYList.add(initialCircleY);
  directionXList.add(initialDirectionX);
  directionYList.add(initialDirectionY);
  pathList.add(new ArrayList<PVector>());
  waveColorList.add(color(random(255), random(255), random(255)));
}

Crafting the shapes turned out to be the most time-consuming part for me. Since Processing has no built-in functions for drawing spirals, waves, or dotted lines, I had to build each shape separately before incorporating it into the drawings, which took longer than we initially anticipated. Once this stage was complete, we were able to shift our focus to the construction and mechanics of Orchestrate!

Then, for the global variables, setup(), and draw() functions on the Processing side, we wrote:

import processing.serial.*;
import ddf.minim.*;
import ddf.minim.analysis.*;
import java.util.ArrayList;

//electric guitar
float electricAngle = 0;
float electricRadius = 20;
float smoothing = 0.9; // Smoothing factor
int currentWaveIndex = 0; // Keep track of the current wave being drawn
int ElectricbuttonState;
color spiralColor; // Define color variable
PVector center; // Define the center position for the spiral
boolean drawNewSpiral = true; // Flag to indicate whether to draw a new spiral
boolean prevButtonStateElectric = false;
boolean startNewSpiral = false;
ArrayList<PVector> points = new ArrayList<PVector>();

//piano
boolean musicStarted = false; // Flag to track whether music has started
boolean drawingPaused = false; // Flag to indicate whether drawing is paused
boolean buttonPressed = false; // Flag to track button state
int lastButtonPressTime; // Variable to store the time of the last button press
int currentLineIndex = 0; // Keep track of the current line being drawn
int prevButtonState = 0; // Previous button state
//int LinefadeDuration = 5000; // Fade duration in milliseconds (5 seconds)
int initialTime; // Initial time to track the start of the program
ArrayList<DottedLine> lines = new ArrayList<DottedLine>();
ArrayList<Integer> LinesCreationTime = new ArrayList<>(); // List to store the creation time of each wave

// acoustic guitar
int AcousticnewButtonState;
int WavefadeTimer = 0; // Timer to control fading
int WavefadeDuration = 30 * 60; // 30 seconds * 60 frames per second (adjust as needed)
boolean wavesPaused = true;
boolean waveCreated = false;
boolean AcousticprevButtonState = false; // Moved outside of draw() function
boolean firstButtonPress = true; // Track if it's the first button press
boolean previousWavesPaused = true;
boolean buttonProcessedThisFrame = false; // Flag to track if a button press has been processed in the current frame
float mappedAmplitude = 0;
float AcoustictotalAmplitude = 0;
ArrayList<ArrayList<PVector>> pathList = new ArrayList<>();
ArrayList<Integer> waveColorList = new ArrayList<>();
ArrayList<Float> circleXList = new ArrayList<>(), circleYList = new ArrayList<>(), directionXList = new ArrayList<>(), directionYList = new ArrayList<>();
ArrayList<Integer> waveCreationTime = new ArrayList<>(); // List to store the creation time of each wave


// Shape counts used by drawDrumEffect(), drawBassEffect(), and drawBrassEffect()
// (the exact values were not listed in the post; these are assumed)
int numCircles = 10;
int numTriangles = 10;
int numQuad = 10;

BeatDetect beatBass, beatDrum, beatBrass;
Serial serialPort;
Minim minim;
AudioPlayer drum;
AudioPlayer bass;
AudioPlayer brass;
AudioPlayer electric;
AudioPlayer piano;
AudioPlayer acoustic;

float sensorValueBass = 0;
float sensorValueDrum = 0;
float sensorValueBrass = 0;
float sensorValueAcoustic = 0;
float sensorValueElectric = 0;
float sensorValuePiano = 0;

boolean timerActive = false;
int startTime = 0; // Time when sensors first went below 30
float fadeLevel = 0; // Opacity level of the white overlay

void setup() {
  //fullScreen();
  //size(2000, 1600);
  size(1550, 1600);
  background(255);
  minim = new Minim(this);

  // Initialize the serial communication
  serialPort = new Serial(this, Serial.list()[0], 9600);
  serialPort.bufferUntil('\n');

  // Load audio files
  drum = minim.loadFile("violin_solo_1.mp3", 2048);
  bass = minim.loadFile("violin_solo_2.mp3", 2048);
  brass = minim.loadFile("Violin_I.mp3", 2048);
  electric = minim.loadFile("Violin_II.mp3", 2048);
  piano = minim.loadFile("Viola.mp3", 2048);
  acoustic = minim.loadFile("Violoncello.mp3", 2048);


  // Initialize BeatDetectors
  beatBass = new BeatDetect(bass.bufferSize(), bass.sampleRate());
  beatDrum = new BeatDetect(drum.bufferSize(), drum.sampleRate());
  beatBrass = new BeatDetect(brass.bufferSize(), brass.sampleRate());

  lastButtonPressTime = millis();
  initialTime = millis();
  center = new PVector(random(width), random(height));

  // Start looping the audio files if not playing
  if (!electric.isPlaying()) {
    electric.loop();
  }
  if (!piano.isPlaying()) {
    piano.loop();
  }
  if (!acoustic.isPlaying()) {
    acoustic.loop();
  }

  // Start looping the audio files if not playing
  if (!bass.isPlaying()) {
    bass.loop();
  }
  if (!drum.isPlaying()) {
    drum.loop();
  }
  if (!brass.isPlaying()) {
    brass.loop();
  }
}

void draw() {
  //debugging statements
  println("Draw function called");
  println("Bass playing: " + bass.isPlaying() + ", Sensor Value: " + sensorValueBass);
  println("Drum playing: " + drum.isPlaying() + ", Sensor Value: " + sensorValueDrum);
  println("Brass playing: " + brass.isPlaying() + ", Sensor Value: " + sensorValueBrass);
  println("Electric playing: " + electric.isPlaying() + ", Sensor Value: " + sensorValueElectric);
  println("Acoustic playing: " + acoustic.isPlaying() + ", Sensor Value: " + sensorValueAcoustic);
  println("Piano playing: " + piano.isPlaying() + ", Sensor Value: " + sensorValuePiano);


  // Check whether all sensor values are below their idle thresholds
  if (sensorValueBass < 70 && sensorValueDrum < 180 && sensorValueBrass < 180 && sensorValueAcoustic < 70 && sensorValueElectric < 70 && sensorValuePiano < 70) {
    if (!timerActive) {
      startTime = millis(); // Start the timer
      timerActive = true;
    }
    if (millis() - startTime > 5000) { // Check if 5 seconds have passed
      fadeLevel = min(fadeLevel + 10, 255); // Increase fade level gradually
    }
  } else {
    timerActive = false; // Reset the timer if any sensor is above its threshold
    fadeLevel = 0; // Reset fade level
  }

  //detect beat to adjust drawing
  beatBass.detect(bass.mix);
  beatDrum.detect(drum.mix);
  beatBrass.detect(brass.mix);

  // ".isOnset" is related to beat detection, if not detecting beat and only using freq and amp, no need to use
  if (beatBass.isOnset()) {
    drawBassEffect(sensorValueBass);
  }
  if (beatDrum.isOnset()) {
    drawDrumEffect(sensorValueDrum);
  }
  if (beatBrass.isOnset()) {
    drawBrassEffect(sensorValueBrass);
  }

  //draw effects
  if (bass.isPlaying() && sensorValueBass > 40 && bass.mix.level() > 0) {
    drawBassEffect(sensorValueBass);
  }
  if (drum.isPlaying() && sensorValueDrum > 40 && drum.mix.level() > 0) {
    drawDrumEffect(sensorValueDrum);
  }
  if (brass.isPlaying() && sensorValueBrass > 40 && brass.mix.level() > 0) {
    drawBrassEffect(sensorValueBrass);
  }

  if (acoustic.isPlaying() && sensorValueAcoustic > 50 && acoustic.mix.level() > 0) {
    wavesPaused = false;
    if (!waveCreated) {
      createWave(random(0, width));
      waveCreated = true;
    }
    for (int i = 0; i < circleXList.size(); i++) {
      //drawWaves(circleXList.get(i), circleYList.get(i), waveColorList.get(i));
       drawWaves();
    }
    AcousticanalyzeAmplitude(); // Analyze audio amplitude
  } else if (!acoustic.isPlaying() && sensorValueAcoustic < 50) {
    waveCreated = false;
    acoustic.pause();
  }

  if (electric.isPlaying() && sensorValueElectric > 50 && electric.mix.level() > 0) {
    if (!drawNewSpiral) {
      // Start a fresh spiral when the stand is lifted again
      drawNewSpiral = true;
      points.clear();
      center = new PVector(width/2, height/2);
      electricRadius = random(0, 50);
    }
    if (drawNewSpiral) {
      float Electricamplitude = electric.mix.level();
      if (Electricamplitude > 0.01) {
        drawSpiral(Electricamplitude);
      }
    }
  } else if (sensorValueElectric < 50) {
    // Stop and clear the spiral when the stand is lowered
    drawNewSpiral = false;
    points.clear();
  }

  if (piano.isPlaying() && sensorValuePiano > 50 && piano.mix.level() > 0) {
    if (!drawingPaused) {
      lines.add(new DottedLine(piano.mix.level()));
      drawingPaused = true;
    }
  } else {
    drawingPaused = false;
    lines.clear(); // Clear the lines list when the sensor value is below 50
  }

  for (DottedLine line : lines) {
    line.update();
    line.display();
  }

  for (int i = lines.size() - 1; i >= 0; i--) {
    if (lines.get(i).finished) {
      lines.remove(i);
    }
  }

  // Apply fading effect
  if (fadeLevel > 0) {
    fill(255, fadeLevel); // White with increasing opacity
    noStroke();
    rect(0, 0, width, height); // Cover the entire screen
  }
}
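
One piece of the Processing sketch is not reproduced in this post: the DottedLine class that draw() creates for the piano part. As a rough reconstruction only (the lifespan value and the overall behavior below are our assumptions, not the original class), a minimal version exposing the update(), display(), and finished members that draw() expects could look like this:

// Hypothetical reconstruction of the DottedLine class referenced in draw();
// the original is not shown in this post, so the details here are assumptions.
class DottedLine {
  float amplitude;
  int creationTime;
  int lifespan = 5000;      // assumed lifetime in milliseconds
  boolean finished = false; // draw() removes the line once this becomes true

  DottedLine(float amplitude) {
    this.amplitude = amplitude;
    creationTime = millis();
  }

  void update() {
    // Mark the line as finished once its assumed lifespan has passed
    if (millis() - creationTime > lifespan) {
      finished = true;
    }
  }

  void display() {
    // Reuse the dotted-line routine shown earlier for the piano
    drawLines(amplitude);
  }
}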

From the Arduino side, we kept it simple, reading each sensor and printing the values over serial in a format Processing could easily read:

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensorValueBass = analogRead(A0);
  int sensorValueDrum = analogRead(A1);
  int sensorValueBrass = analogRead(A2);
  int sensorValueAcoustic = analogRead(A3);
  int sensorValueElectric = analogRead(A4);
  int sensorValuePiano = analogRead(A5);

  // Send all readings as one line, e.g. "B123,D456,R789,A12,E34,P56";
  // the commas make the string easier for Processing to parse.
  Serial.print("B");
  Serial.print(sensorValueBass);
  Serial.print(",");
  Serial.print("D");
  Serial.print(sensorValueDrum);
  Serial.print(",");
  Serial.print("R");
  Serial.print(sensorValueBrass);
  Serial.print(",");
  Serial.print("A");
  Serial.print(sensorValueAcoustic);
  Serial.print(",");
  Serial.print("E");
  Serial.print(sensorValueElectric);
  Serial.print(",");
  Serial.print("P");
  Serial.println(sensorValuePiano);  // println only after the last sensor so each message ends with a newline

  delay(800);  // adjust as needed
}
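
On the Processing side, the serialPort.bufferUntil('\n') call in setup() means a serialEvent() callback receives each of these lines. That callback was not shown above; a minimal sketch of how the "B…,D…,R…,A…,E…,P…" string could be parsed into the six sensor variables (an assumption about the exact implementation, not a copy of our code) would be:

void serialEvent(Serial port) {
  // Read one full line, e.g. "B123,D456,R789,A12,E34,P56"
  String data = port.readStringUntil('\n');
  if (data == null) return;
  data = trim(data);
  // Split on commas, then use the leading letter to identify each sensor
  String[] tokens = split(data, ',');
  for (String token : tokens) {
    if (token.length() < 2) continue;
    char tag = token.charAt(0);
    float value = float(token.substring(1));
    if (tag == 'B') sensorValueBass = value;
    else if (tag == 'D') sensorValueDrum = value;
    else if (tag == 'R') sensorValueBrass = value;
    else if (tag == 'A') sensorValueAcoustic = value;
    else if (tag == 'E') sensorValueElectric = value;
    else if (tag == 'P') sensorValuePiano = value;
  }
}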

We also initially integrated potentiometers for each instrument, intending to allow users to manipulate three tracks. However, after user testing, we opted for a cleaner aesthetic and removed the potentiometers. Moreover, sourcing individual tracks for each instrument that harmonized well together, especially when looped, proved challenging, so we settled on a single orchestral piece from MuseScore that already contained a score for each instrument/part.

To separate each instrument in MuseScore, we opened the full score we wanted to use and accessed the Parts window via “File → Parts…”. There, we created a new part definition by clicking “New” and filled in details such as the file name and part title. Next, we selected the instruments to include in that part by checking the relevant boxes, and repeated these steps for each instrument we wanted to separate. Once all parts were defined, we closed the Parts window. Finally, we exported the audio file of each instrument and used them in our project.

As for sensors, we initially considered various options such as vibration sensors, pressure sensors, tilt sensors, and microphones. However, since we wanted a cleaner aesthetic reminiscent of an art exhibit, we chose infrared distance sensors instead and used stands to represent each instrument for a more streamlined look.

For the stands, Emma found free SVG vector files for the cello, viola, and violin, which perfectly matched our decision to focus on a violin orchestra piece. We chose to apply a contrabass design to the cello to align with the design of the other instruments. Similarly, for the viola, we used the violin design but enlarged it slightly. She then customized the files in Cuttle, positioning each instrument in the center of a rectangular piece of wood with a tab extending from the bottom to fit into a wooden circle, creating a stable, laser-cut stand.

While Emma focused on creating the stands, I took charge of designing the display box for our project. We decided to use one of the podiums and position our box on top of it. To begin, I used MakerCase to generate a closed 45 cm by 45 cm by 8 cm box with finger-edge joints, then downloaded the SVG file and imported it into Cuttle. Within one of the two 45 cm by 45 cm squares, I placed six circles equidistant from the edge of the square and centered them; these circles were to be rastered. Inside each circle, I included a rectangle measuring 3 cm by 0.9 cm, precisely sized to hold the infrared sensor snugly without exposing too much of it. Afterward, I assembled all the pieces of the display box and glued them together, and to maintain a polished appearance consistent with the podium, I painted the sides white. With that, the display box was complete!

Although time constraints during the transition to the orchestral piece meant we kept the original drawings and colors, we made sure the pastel rainbow palette stayed vibrant. This choice helped users easily differentiate between instruments and their associated drawings while maintaining a cohesive visual aesthetic.

Here are our final products:

Orchestrate! Presentation from the Top View:

 

Screen Recording of Orchestrate!’s Drawing Side:


CONCLUSIONS

Revisiting the aims of our project: Orchestrate! set out to empower users to transform music into visual art, fostering an immersive experience in which individuals act as conductors, shaping each instrument to craft their visual masterpiece.

Upon reflection, Orchestrate! unequivocally achieved its intended objectives. We successfully crafted a platform where users could intuitively control both drawing gestures and music volume, resulting in a seamless fusion of auditory and visual expression. The evolution from our initial concept, which leaned towards mimicking instrument-playing actions, to a more artistically focused approach underscores our dedication to adapting to user preferences while upholding principles of simplicity and elegance.

Throughout development, user interaction remained paramount. Feedback from user testing provided invaluable guidance, steering our decisions and ultimately enhancing the project’s overall user experience. Through the integration of a complete orchestra score sourced from MuseScore, Orchestrate! became a virtual orchestra conducting session, enriching its immersive qualities.

From challenges emerged profound insights. Our shift from complex wiring to an intuitive interface emphasized the importance of user-centered design and iterative development. The difficulty of finding appropriate audio recordings similarly influenced the project’s trajectory, leading us to integrate recordings from MuseScore and underscoring the critical role of sound quality in the project’s ambiance and engagement. Lastly, coding each drawing posed its own difficulties at first, requiring perseverance to achieve our desired outcome. Though not flawless, our efforts ultimately enabled users to experience the vision we aimed to convey.

Looking ahead, given additional time, our focus would be on fine-tuning the user interface and interaction design. A potential enhancement involves enabling users to draw different shapes corresponding to the stands they lift. For instance, each stand could be associated with a unique drawing, and lifting multiple stands simultaneously could generate new, distinct shapes or drawings. This feature would add depth and creativity to the user experience, fostering exploration and artistic expression.

Furthermore, broadening the range of available musical scores is another avenue for improvement. By diversifying the repertoire, users would have access to a wider array of compositions, offering greater variety and customization options. This expansion would cater to diverse preferences and enrich the overall Orchestrate! experience.

In essence, Orchestrate! serves as a testament to the transformative potential of collaboration, adaptability, and user feedback in the iterative design process. As we celebrate our achievements, we embrace the invaluable lessons gleaned from setbacks and failures, propelling us toward continued growth and innovation in future endeavors.

DISASSEMBLY
