Morgan's Website


Study Buddy – Morgan Somchanhmavong – Professor Rudi

December 19, 2022 by Morgan Somchanhmavong

Conception and Design:

I’m someone who likes studying in intervals, which keeps my productivity high for longer; studying for hours on end makes it decline much faster. Below is a graph that shows the difference between studying with breaks and without.


This graph shows the productivity of a study session with and without breaks. As you can see, after 5 hours, the person who took breaks stayed productive for longer. That’s why I always study in intervals: I do 30 minutes of focused work and then take a 5–10 minute break after each interval. The problem is that I use my phone as the timer, so after each 30 minutes I have to pick it up to turn off the alarm. Oftentimes, since the phone is already in my hands, I get distracted by it, and my breaks stretch from 5 minutes to 10 to 20, and so on. This ultimately defeats the point of studying in intervals, as my productivity declines anyway.

The goal of my Project:

My project proposes a study environment that encourages focused work. It eliminates the risk of getting distracted by a cell phone, since it is a self-sufficient circuit that only needs the user’s interaction to work, and it promotes the idea of “interval studying.” The circuit uses LED lights around the user’s study area to make the session more enjoyable and add to the ambiance. When I study I often listen to music, so the circuit lets you upload music that the lights react to: the brightness changes with the amplitude of the song, and the color changes every 10 seconds. This makes studying more enjoyable by creating an environment that interacts with the music being used to study.

When the study session is over, the alarm goes off and a new Neopixel sequence takes over. In the first mode, where the lights react to the music, all the LEDs show one color at a time, and when the color changes, all of the LEDs change together. In the alarm mode, the Neopixel lights turn random RGB colors and react to the amplitude of the alarm. To turn the alarm off, the user places a hand less than 5 cm from the ultrasonic sensor on the study buddy’s left side. Once the system is off, placing a hand between 10 cm and 30 cm from the sensor resets it back to the first mode, the study mode.

This project is meant to promote interval studying while limiting external distractions as much as possible. The way my project is set up is a scaled-down version of how the full system would work: the study buddy creates a study-conducive environment, so the lights would most likely be hung up in the room where the user is studying rather than in the box I fabricated. The user could also keep the box format if they like, since I know some people feel the “cubicle-esque” feeling adds to their focus by essentially enclosing them in a new space. My design decisions are justified by what I would want in a study buddy: I like to study with music, so I incorporated that; I think lights interacting with music is super cool, so I included that; and I thought using the ultrasonic sensor was a fresher way of adding interactivity than pushing a button, so hand motion, rather than a button, triggers the functions.
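To make the mode logic concrete, here is a minimal sketch of the behavior described above. It is only an illustration of the thresholds, not the project’s actual code: in the real project the distance value arrives from the Arduino over serial, so mouseX stands in for the sensor reading here, and the 30-minute study length is just an assumed placeholder.

// Minimal Processing sketch of the mode logic described above.
// Assumption: the distance normally comes from the Arduino over serial;
// here mouseX stands in for it so the logic can run on its own.
int MODE_STUDY = 0;
int MODE_ALARM = 1;
int MODE_OFF   = 2;
int mode = MODE_STUDY;
int studyLength = 30 * 60 * 1000;  // assumed 30-minute study interval
int modeStart;

void setup() {
  size(400, 200);
  modeStart = millis();
}

void draw() {
  background(0);
  // stand-in for the ultrasonic reading in cm (real code reads it from serial)
  int distance = int(map(mouseX, 0, width, 0, 40));

  if (mode == MODE_STUDY && millis() - modeStart > studyLength) {
    mode = MODE_ALARM;                               // interval over: alarm lights take over
  } else if (mode == MODE_ALARM && distance < 5) {
    mode = MODE_OFF;                                 // hand close to the sensor turns everything off
  } else if (mode == MODE_OFF && distance >= 10 && distance < 30) {
    mode = MODE_STUDY;                               // hand at 10-30 cm restarts the study session
    modeStart = millis();
  }

  text("mode: " + mode + "  distance: " + distance, 20, height/2);
}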

Fabrication and Production: 

The most significant step in my production process was looking back at Recitation 7 and its code, which opened my eyes to new possibilities. At first I was just going to make the LED light system, but Rudi said that was too dull and needed more interactivity. Looking back at Recitation 7, I realized I could make a lighting mode for when the user is studying that isn’t too distracting. Rudi also suggested I use an infrared sensor to control the LEDs. I thought this was a great idea but possibly too much, since there are many different hand gestures I would have to account for, and I had trouble setting up the infrared sensors. I planned to put them in a housing unit like a watch that the user would wear, but I decided that would be annoying, since the user would be tethered by wires and might find wearing the sensor troublesome. Because I couldn’t get the infrared sensors to work, I switched to an ultrasonic sensor and used different distance values to trigger different functions. I think the ultrasonic sensor was the best choice for my project, as it accurately measured how far the user’s hand was from the sensor, which was all the data my circuit needed. I wanted a simple circuit that didn’t require the user to do too much; studying is already complicated, so there’s no need to overcomplicate the helper.

A significant failure of the project was getting everything to work together. I essentially had three working modes, but the problem was merging them so they could run with one another. I think part of the problem was that my Arduino sketch wasn’t able to run multiple modes at the same time, because even after I merged the code, the circuit didn’t work well as a whole. Individually, though, all three aspects of the project were fully functional. The study-music lighting sequence worked and reacted to the music. The alarm sequence also worked well and reacted to the alarm’s amplitude, differing from the study lighting sequence in that it used random RGB colors. Lastly, the ultrasonic sensor worked well and read the distance values of my hand when I interacted with it. The problem lay in getting everything to work with one another.


 

Conclusion:

All in all, my project was meant to create a focused, study-conducive environment by letting the user be immersed in and interact with the study buddy while simultaneously limiting external distractions. To some extent, my project achieved those goals. I think my audience interacted with my project well, despite not all of the working parts being able to cooperate with one another, and they were able to see what I was trying to accomplish and to interact with what was working. I define interaction as an exchange of actions between participants, such as listening, thinking, and speaking; during this encounter, they react to one another and then act based on their reactions. My project aligns with this definition, as the user interacts with the ultrasonic sensor to either turn the circuit off or restart it. Furthermore, the music being played causes different lighting sequences based on what song is playing, so there is an exchange of actions between the lights and the music. If I had more time, I would use another Arduino so the three modes could operate together, or find another way to make my three components work with one another. I would also make the study area bigger for those who prefer the cubicle-style study environment; that would give them more space to do their work, and I could widen the ultrasonic sensor’s distance intervals, making it easier to turn the circuit off or reactivate it. I learned from setbacks that formatting my code is essential, as it lets me read it better and follow my own work more easily. Other setbacks required me to reach out to professors and fellows for help, from which I benefitted greatly; I learned to reach out more and not be afraid to ask for help. Some things I take away from my accomplishments are the rewarding feeling of getting something to work after hours and hours of problem-solving, as well as time-management skills. It took a lot of time management to finish this project, since I needed to plan how much work I would do at certain times so I wouldn’t get burnt out. I think people should care because time management applies to much more than this project: jobs, other schoolwork, and social relationships. It is an essential skill to have. I like the basis of my project and its goals; it poses a unique solution to a common problem that many people, including myself, struggle with. Unfortunately, it doesn’t work as well as I had hoped, but I learned a lot from this project and had a lot of fun doing it!

Annex:

This is the final wiring of my project.


This was when I tried to use an infrared sensor as the trigger for my other lighting sequences. I then had trouble with it as the values were not picked up in the serial monitor, so I changed it to the ultrasonic sensor. But the plan was to use infrared sensors to read the user’s hand gestures accurately. Unfortunately, I didn’t even get one to work, so two were out of the question. 

I laser cut the “study box” I used for my presentation to present my project.
I then laser cut a smaller box to house the ultrasonic sensor and placed that in the cutout on the left of the study box. 

Above is a video of the acrylic box I used to house my ultrasonic sensor being laser cut.  

Here I am struggling with the alarm code.

3 components:

Study Music

Alarm Sequence:

Ultrasonic Sensor:

 

My Code:

Processing

Study Music Code:

import processing.sound.*;

SoundFile sample;
Amplitude analysis;

void setup() {
  size(640, 480);

  // load and play a sound file in a loop
  sample = new SoundFile(this, "e.mp3");
  sample.loop();

  // create the Amplitude analysis object
  analysis = new Amplitude(this);
  // analyze the playing sound file
  analysis.input(sample);
}

void draw() {
  println(analysis.analyze());
  background(125, 255, 125);
  noStroke();
  fill(255, 0, 150);

  // analyze the audio for its volume level
  float volume = analysis.analyze();

  // volume is a number between 0.0 and 1.0
  // map the volume value to a useful scale
  float diameter = map(volume, 0, 1, 0, width);
  // draw a circle based on the microphone amplitude (volume)
  circle(width/2, height/2, diameter);
}

Final code that incorporated all elements:

import processing.sound.*;
import processing.serial.*;
import osteele.processing.SerialRecord.*;

PFont font;
String time = "010";
int t;
int interval = 10;
int former;

Serial serialPort;
SerialRecord serialRecord;
SoundFile sample1;
SoundFile sample2;
Amplitude analysis;
FFT fft;

int bands = 1024;
float smoothingFactor = 0.1;
float[] sum = new float[bands];
int scale = 10;
float barWidth;

int W; //width of the tiles
int NUM = 60; //amount of pixels
int[] r = new int[NUM]; //red of each tile
int[] g = new int[NUM]; //red of each tile
int[] b = new int[NUM]; //red of each tile

void setup() {

size(1450, 800);
background(0);

barWidth = 10*width/float(bands);

// load and play a sound file in a loop
sample1 = new SoundFile(this, "e.mp3");
sample2 = new SoundFile(this, "final.mp3");
sample1.loop();

fft = new FFT(this, bands);
fft.input(sample1);

// create the Amplitude analysis object
analysis = new Amplitude(this);
// analyze the playing sound file
analysis.input(sample1);

W = width/NUM;

// You can use this syntax and change COM3 for your serial port
// printArray(Serial.list());
// serialPort = new Serial(this, "COM3", 9600);
// in MacOS it looks like "/dev/cu.usbmodem1101"
//or you can try to use this instead:

String serialPortName = SerialUtils.findArduinoPort();
serialPort = new Serial(this, serialPortName, 9600);
serialRecord = new SerialRecord(this, serialPort, 6);
serialRecord.logToCanvas(false);
rectMode(CENTER);

}

void draw() {

//background(0);

// if (mousePressed == true) {
// int n = floor(constrain(mouseX/W , 0, NUM-1));

// r[n] = floor(random(255));
// g[n] = floor(random(255));
// b[n] = floor(random(255));

// serialRecord.values[0] = n; // which pixel we change (0-59)
// serialRecord.values[1] = r[n]; // how much red (0-255)
// serialRecord.values[2] = g[n]; // how much green (0-255)
// serialRecord.values[3] = b[n]; // how much blue (0-255)
// serialRecord.send(); // send it!
// }

// println(analysis.analyze());
// background(125, 255, 125);
// noStroke();
// fill(255, 0, 150);

// // analyze the audio for its volume level
// float volume = analysis.analyze();

// // volume is a number between 0.0 and 1.0
// // map the volume value to a useful scale
// int diameter = floor(map(volume, 0, 0.6, 0, 60));

// for(int n = 0; n < diameter; n++) {
// r[n] = floor(random(255));
// g[n] = floor(random(255));
// b[n] = floor(random(255));

// serialRecord.values[0] = n; // which pixel we change (0-59)
// serialRecord.values[1] = r[n]; // how much red (0-255)
// serialRecord.values[2] = g[n]; // how much green (0-255)
// serialRecord.values[3] = b[n]; // how much blue (0-255)
// serialRecord.send();
// }

// if(former > diameter) {
// for(int m = diameter; m < former; m ++) {
// r[m] = 0;
// g[m] = 0;
// b[m] = 0;

// serialRecord.values[0] = m; // which pixel we change (0-59)
// serialRecord.values[1] = r[m]; // how much red (0-255)
// serialRecord.values[2] = g[m]; // how much green (0-255)
// serialRecord.values[3] = b[m]; // how much blue (0-255)
// serialRecord.send();
// }
// }

// former = diameter;
println(analysis.analyze());
background(0);
long t = millis();
fft.analyze();

float volume = analysis.analyze();

// volume is a number between 0.0 and 1.0
// map the volume value to a useful scale

float diameter1 = floor(map(volume, 0, 1, 0, width));

if (t < 18000) {
for (int i = 0; i < bands; i++) {
// Smooth the FFT spectrum data by smoothing factor
sum[i] += (fft.spectrum[i] - sum[i]) * smoothingFactor;

// Draw the rectangles, adjust their height using the scale factor
fill(0, 0, (diameter1)/3);
rect(i*barWidth, height, barWidth, -sum[i]*height*scale);
}
int blue = floor(map(volume, 0, 1, 0, 255));
serialRecord.values[0] = 0;
serialRecord.values[1] = 0;
serialRecord.values[2] = blue;
serialRecord.values[3] = 0;
serialRecord.values[4] = 0;
serialRecord.send();
} else if ((t >= 18000) && t < 28000) {
for (int i = 0; i < bands; i++) {
// Smooth the FFT spectrum data by smoothing factor
sum[i] += (fft.spectrum[i] - sum[i]) * smoothingFactor;
// Draw the rectangles, adjust their height using the scale factor
fill((diameter1)/3, 0, (diameter1)/3);
rect(i*barWidth, height, barWidth, -sum[i]*height*scale);
}
int purple = floor(map(volume, 0, 1, 0, 255));
serialRecord.values[0] = 0;
serialRecord.values[1] = 0;
serialRecord.values[2] = 0;
serialRecord.values[3] = purple;
serialRecord.values[4] = 0;
serialRecord.send();
} else if ((t >= 28000) && t < 38000) {
for (int i = 0; i < bands; i++) {
// Smooth the FFT spectrum data by smoothing factor
sum[i] += (fft.spectrum[i] - sum[i]) * smoothingFactor;
// Draw the rectangles, adjust their height using the scale factor
fill(0, (diameter1)/3, 0);
rect(i*barWidth, height, barWidth, -sum[i]*height*scale);
}
int green = floor(map(volume, 0, 1, 0, 255));
serialRecord.values[0] = 0;
serialRecord.values[1] = green;
serialRecord.values[2] = 0;
serialRecord.values[3] = 0;
serialRecord.values[4] = 0;
serialRecord.send();
} else if ((t >= 38000) && t < 48000) {
for (int i = 0; i < bands; i++) {
// Smooth the FFT spectrum data by smoothing factor
sum[i] += (fft.spectrum[i] - sum[i]) * smoothingFactor;
// Draw the rectangles, adjust their height using the scale factor
fill((diameter1)/3, 0, 0);
rect(i*barWidth, height, barWidth, -sum[i]*height*scale);
}
int red = floor(map(volume, 0, 1, 0, 255));
serialRecord.values[0] = red;
serialRecord.values[1] = 0;
serialRecord.values[2] = 0;
serialRecord.values[3] = 0;
serialRecord.values[4] = 0;
serialRecord.send();
} else if ((t >= 48000) && t < 58000) {
for (int i = 0; i < bands; i++) {
// Smooth the FFT spectrum data by smoothing factor
sum[i] += (fft.spectrum[i] - sum[i]) * smoothingFactor;
// Draw the rectangles, adjust their height using the scale factor
fill(0, (diameter1)/3, (diameter1)/3);
rect(i*barWidth, height, barWidth, -sum[i]*height*scale);
}
int skyblue = floor(map(volume, 0, 1, 0, 255));
serialRecord.values[0] = 0;
serialRecord.values[1] = 0;
serialRecord.values[2] = 0;
serialRecord.values[3] = 0;
serialRecord.values[4] = skyblue;
serialRecord.send();
} else if ((t >= 58000) && t < 68000) {
for (int i = 0; i < bands; i++) {
// Smooth the FFT spectrum data by smoothing factor
sum[i] += (fft.spectrum[i] - sum[i]) * smoothingFactor;
// Draw the rectangles, adjust their height using the scale factor
fill(0, 0, (diameter1)/3);
rect(i*barWidth, height, barWidth, -sum[i]*height*scale);
}
int blue = floor(map(volume, 0, 1, 0, 255));
serialRecord.values[0] = 0;
serialRecord.values[1] = 0;
serialRecord.values[2] = blue;
serialRecord.values[3] = 0;
serialRecord.values[4] = 0;
serialRecord.send();

} else {
sample1.stop();

background(0);
// load and play a sound file in a loop
sample2 = new SoundFile(this, "final.mp3");
sample2.loop();

// create the Amplitude analysis object
analysis = new Amplitude(this);
// analyze the playing sound file
analysis.input(sample2);

//if (mousePressed == true) {
// int n = floor(constrain(mouseX/W , 0, NUM-1));

// r[n] = floor(random(255));
// g[n] = floor(random(255));
// b[n] = floor(random(255));

// serialRecord.values[0] = n; // which pixel we change (0-59)
// serialRecord.values[1] = r[n]; // how much red (0-255)
// serialRecord.values[2] = g[n]; // how much green (0-255)
// serialRecord.values[3] = b[n]; // how much blue (0-255)
// serialRecord.send(); // send it!
// }

println(analysis.analyze());
// analyze the audio for its volume level
serialRecord.read();
int distance = serialRecord.values[5];

if ( distance < 5 ) {
sample2.stop();
sample1.stop();
//for(int n = 0; n < diameter; n++) {
//r[n] = floor(random(255));
//g[n] = floor(random(255));
//b[n] = floor(random(255));

serialRecord.values[0] = 0;
serialRecord.values[1] = 0;
serialRecord.values[2] = 0;
serialRecord.values[3] = 0;
serialRecord.send();
//serialRecord.values[0] = n; // which pixel we change (0-59)
//serialRecord.values[1] = r[n]; // how much red (0-255)
//serialRecord.values[2] = g[n]; // how much green (0-255)
//serialRecord.values[3] = b[n]; // how much blue (0-255)
//serialRecord.send();
}
if((distance >= 10) && distance < 25) {
sample1.loop();
sample2.stop();
for (int i = 0; i < bands; i++) {
// Smooth the FFT spectrum data by smoothing factor
sum[i] += (fft.spectrum[i] - sum[i]) * smoothingFactor;

// Draw the rectangles, adjust their height using the scale factor
fill(0, 0, (diameter1)/3);
rect(i*barWidth, height, barWidth, -sum[i]*height*scale);
}
int blue = floor(map(volume, 0, 1, 0, 255));
serialRecord.values[0] = 0;
serialRecord.values[1] = 0;
serialRecord.values[2] = blue;
serialRecord.values[3] = 0;
serialRecord.send();
// for(int m = diameter; m < former; m ++) {
// r[m] = 0;
// g[m] = 0;
// b[m] = 0;

// serialRecord.values[0] = m; // which pixel we change (0-59)
// serialRecord.values[1] = r[m]; // how much red (0-255)
// serialRecord.values[2] = g[m]; // how much green (0-255)
// serialRecord.values[3] = b[m]; // how much blue (0-255)
// serialRecord.send();
// }
//}
// former = diameter;
}

//r[diameter] = floor(random(255));
//g[diameter] = floor(random(255));
//b[diameter] = floor(random(255));

//serialRecord.values[0] = diameter; // which pixel we change (0-59)
//serialRecord.values[1] = r[diameter]; // how much red (0-255)
//serialRecord.values[2] = g[diameter]; // how much green (0-255)
//serialRecord.values[3] = b[diameter]; // how much blue (0-255)
//serialRecord.send();
//draw a circle based on the microphone amplitude (volume)
//circle(width/2, height/2, diameter);
}
}

Arduino

Alarm system code:

#include "SerialRecord.h"
#include <FastLED.h>
#define NUM_LEDS 60 // How many leds in your strip?
#define DATA_PIN 3 // Which pin are you connecting Arduino to Data In?
CRGB leds[NUM_LEDS];
// Change this number to the number of values you want to receive
SerialRecord reader(4);
void setup() {
Serial.begin(9600);
FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS); // Initialize
FastLED.setBrightness(10); // BEWARE: external power for full (255)
//further info at https://learn.adafruit.com/adafruit-neopixel-uberguide/powering-neopixels
}
void loop() {
if (reader.read()) {
int n = reader[0];
int r = reader[1];
int g = reader[2];
int b = reader[3];
leds[reader[0]] = CRGB(reader[1], reader[2], reader[3]); // Prepare the color information using CRGB( Red, Green, Blue
FastLED.show(); // Pass the information of color to the LED
}
}
 

Final Arduino code

#include "SerialRecord.h"
#include <FastLED.h>
#include <NewPing.h>
// #define echoPin 11 // attach pin D2 Arduino to pin Echo of HC-SR04
// #define trigPin 12 //attach pin D3 Arduino to pin Trig of HC-SR04
#define PING_PIN1 11
#define MAX_DISTANCE 400
#define NUM_LEDS 60 // How many leds in your strip?
#define DATA_PIN 3 // Which pin are you connecting Arduino to Data In?
CRGB leds[NUM_LEDS];
// Change this number to the number of values you want to receive
SerialRecord reader(5);
SerialRecord writer(1);
NewPing sonar1(PING_PIN1, PING_PIN1, MAX_DISTANCE);
long duration; // variable for the duration of sound wave travel
int distance; // variable for the distance measurement
void setup() {
Serial.begin(9600);
FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS); // Initialize
FastLED.setBrightness(30); // BEWARE: external power for full (255)
//further info at https://learn.adafruit.com/adafruit-neopixel-uberguide/powering-neopixels
// pinMode(trigPin, OUTPUT); // Sets the trigPin as an OUTPUT
// pinMode(echoPin, INPUT); // Sets the echoPin as an INPUT
Serial.begin(9600); // // Serial Communication is starting with 9600 of baudrate speed
Serial.println("Ultrasonic Sensor HC-SR04 Test"); // print some text in Serial Monitor
Serial.println("with Arduino UNO R3");
}
void loop() {
if (reader.read()) {
int r = reader[0];
int g = reader[1];
int b = reader[2];
int p = reader[3];
int s = reader[4];
 
if (r != 0) {
for (int i = 0; i < NUM_LEDS; i++) {
leds[i] = CRGB(r, 0, 0);
}
FastLED.show();
} if (g != 0) {
for (int i = 0; i < NUM_LEDS; i++) {
leds[i] = CRGB(0, g, 0);
}
FastLED.show();
} if (b != 0) {
for (int i = 0; i < NUM_LEDS; i++) {
leds[i] = CRGB(0, 0, b);
}
FastLED.show();
} if (p != 0) {
for (int i = 0; i < NUM_LEDS; i++) {
leds[i] = CRGB(p, 0, p);
}
FastLED.show();
} if (s != 0) {
for (int i = 0; i < NUM_LEDS; i++) {
leds[i] = CRGB(0, s, s);
}
FastLED.show();
leds[reader[0]] = CRGB(reader[1], reader[2], reader[3]); // Prepare the color information using CRGB( Red, Green, Blue
FastLED.show();
}
}
// Clears the trigPin condition
// digitalWrite(trigPin, LOW);
// delayMicroseconds(2);
// // Sets the trigPin HIGH (ACTIVE) for 2 microseconds
// digitalWrite(trigPin, HIGH);
// delayMicroseconds(2);
// digitalWrite(trigPin, LOW);
// // Reads the echoPin, returns the sound wave travel time in microseconds
// duration = pulseIn(echoPin, HIGH);
// // Calculating the distance
// distance = duration * 0.034 / 2; // Speed of sound wave divided by 2 (go and back)
// // Displays the distance on the Serial Monitor
// Serial.print("Distance: ");
// Serial.print(distance);
// Serial.println("cm");
delay(50);
int distance = sonar1.ping_cm();
Serial.print("Ping1: ");
Serial.print(distance);
Serial.println("cm");
writer[5] = distance;
writer.send();
delay(50);
}


Image and Video

December 4, 2022 by Morgan Somchanhmavong

In this recitation, we explored using physical controllers to modify the way media is shown. The type of media I chose was video. I built a circuit with Arduino using a potentiometer, which would theoretically change the camera feed and apply a filter when a specific value was input. The filter I chose was called time displacement, which warped the camera image and made the movements lag behind, creating a weird visual.

After creating the circuit, I used example code in Processing to activate my webcam.

I then found the time displacement code in the example filters and merged it with the receive-single-value code and the webcam code. The Arduino code was basic and just sent a single value from the potentiometer. The send-single-value code was modified to print the potentiometer’s sensor value, so Processing knew which specific value it had to read in order to activate the filter.

Arduino Code:

#include "SerialRecord.h"
// Change this number to send a different number of values
SerialRecord writer(1);
void setup() {
Serial.begin(9600);
}
void loop() {
int sensorValue = analogRead(0);
 
writer[0] = sensorValue;
writer.send();
Serial.print(sensorValue);
// This delay slows down the loop, so that it runs less frequently. This
// prevents it from sending data faster than a Processing sketch that runs at
// 60 frames per second will process it. It also makes it easier to debug the
// sketch, because values are received at a slower rate.
delay(20);
}
The time displacement filter worked once I merged it with the received single value, so you can see the input from the potentiometer on the bottom left.
 
The last step required creating an if/else statement in Processing. If the potentiometer value was higher than 500000, it would apply the displacement filter to the video; if it was lower, the webcam feed would stay normal.
 
The Code:

import processing.video.*;
import processing.serial.*;
import osteele.processing.SerialRecord.*;
String[] cameras = Capture.list();
Capture cam;
Serial serialPort;
SerialRecord serialRecord;

Capture video;
int signal = 0;

//the buffer for storing video frames
ArrayList frames = new ArrayList();

void setup() {
size(640, 480);
printArray(cameras);
cam = new Capture(this, 640, 480, cameras[0]); // use if camera trouble: cam = new Capture(this, 640, 480, cameras[0],30);
cam.start();
// This the default video input, see the GettingStartedCapture
// example if it creates an error
video = new Capture(this, width, height);

// Start capturing the images from the camera
video.start();
String serialPortName = SerialUtils.findArduinoPort();
serialPort = new Serial(this, serialPortName, 9600);
serialRecord = new SerialRecord(this, serialPort, 1);
}

void captureEvent(Capture camera) {
camera.read();

// Copy the current video frame into an image, so it can be stored in the buffer
PImage img = createImage(width, height, RGB);
video.loadPixels();
arrayCopy(video.pixels, img.pixels);

frames.add(img);

// Once there are enough frames, remove the oldest one when adding a new one
if (frames.size() > height/4) {
frames.remove(0);
}
}

void draw() {
background(0);

serialRecord.read();
int value = serialRecord.get();
if (value > 500000) {
int currentImage = 0;

loadPixels();

// Begin a loop for displaying pixel rows of 4 pixels height
for (int y = 0; y < video.height; y+=4) {
// Go through the frame buffer and pick an image, starting with the oldest one
if (currentImage < frames.size()) {
PImage img = (PImage)frames.get(currentImage);

if (img != null) {
img.loadPixels();

// Put 4 rows of pixels on the screen
for (int x = 0; x < video.width; x++) {
pixels[x + y * width] = img.pixels[x + y * video.width];
pixels[x + (y + 1) * width] = img.pixels[x + (y + 1) * video.width];
pixels[x + (y + 2) * width] = img.pixels[x + (y + 2) * video.width];
pixels[x + (y + 3) * width] = img.pixels[x + (y + 3) * video.width];
}
}

// Increase the image counter
currentImage++;
} else {
break;
}
}

updatePixels();
} else {
if (cam.available()) {
cam.read();
}

image(cam, 0, 0);
}

float a = map(value, 0, 1024, 0, height);
line(a, 0, a, height);
// Set the image counter to 0

// For recording an image sequence
//saveFrame("frame-####.jpg");
}

This recitation was a success as I was able to use a physical controller, the potentiometer, to control my video. 


 


Digital Fabrication

December 2, 2022 by Morgan Somchanhmavong

In this recitation, we made our own kinetic sculptures using digital fabrication methods such as laser cutting. I worked alone, so only one set of designs made up my sculpture. To start, we used Cuttle to model our designs for cutting. I began by designing a base plate for my sculpture following the recitation instructions. Once the base plate was completed, I built my design around a star shape, modifying it with a rotational mirror. This made multiple copies of my original shape, which I then overlapped to create an attractive design. Afterward, I added a stroke to make the overlapping lines thicker.



After finishing my designs, I went to the fabrication lab to cut them out. Once there, it was a reasonably straightforward process: I uploaded my designs to the computer connected to the laser cutter, chose the material I wanted to use, selected it in the settings, and started the cut.

I thought the laser cutting was incredibly fascinating! It’s so accurate and makes such straight lines. Watching the machine blew my mind, and I was captivated by its movements.

 

After cutting out my designs, I put them together. I placed the base plate on the bottom and connected a servo motor to my acrylic design so the motor would spin it. I then connected it to the Arduino and ran the Sweep code from Arduino’s examples library. Overall, I had a lot of fun during this recitation; having the freedom to design anything I wanted for my sculpture and then laser-cut it was quite enjoyable.
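For reference, the servo code I ran is along the lines of the Sweep example that ships with Arduino’s Servo library; a minimal version looks roughly like this, assuming the servo’s signal wire is on pin 9.

// Roughly the Servo "Sweep" example from Arduino's built-in examples,
// assuming the servo signal wire is attached to pin 9.
#include <Servo.h>

Servo myservo;
int pos = 0;

void setup() {
  myservo.attach(9);  // attach the servo on pin 9
}

void loop() {
  for (pos = 0; pos <= 180; pos += 1) {  // sweep from 0 to 180 degrees
    myservo.write(pos);
    delay(15);
  }
  for (pos = 180; pos >= 0; pos -= 1) {  // and back again
    myservo.write(pos);
    delay(15);
  }
}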


Final Project Essay

November 23, 2022 by Morgan Somchanhmavong

Title: Good Morning Sunshine

Project Statement: 

Being responsible for oneself is essential as a college student. Those responsibilities include waking up in time for class, from a nap, or for an activity. The basic built-in phone alarms are practical to some degree, but boring; the problem is that they’re incredibly obnoxious and make it too easy to hit snooze and fall back asleep. My alternative combines the old alarm system with LEDs to create a visually appealing and informative light-up sequence when the alarm goes off. This project was inspired by our recent recitation work with Arduino, Processing, and the LED strip, where we made it react to music. The project aims to create a more effective and enjoyable way of waking up so users will never be late again.

Project Proposal: 

My project aims to assist college and high school students, or anyone for that matter, in waking up. Even though people can wake up to the old alarms, they wake up annoyed and groggy because of the gloomy mood of being forced out of bed. My project uses LEDs to create a more cheerful environment when waking up, making the user feel less depressed. I can empathize with my users very well, since I am a college student who often has trouble waking up. I will analyze the bugs I see throughout the process and adjust my project accordingly.

Plan:

  1. I will start my project by looking back at the process we did in the recitation on making Arduino and Processing communicate with the LEDs, to understand it thoroughly.
  2. I will then sketch out the circuit I am going to make.
  3. Construct the physical circuit and work out the Arduino and Processing code to make sure they are communicating properly (a minimal test of this is sketched below the list).
  4. Get a file of an alarm sound, or some other sound to wake up to, add it to the Processing data folder, and make sure the LEDs react to the Processing code.
  5. Code a wake-up sequence, and one for when snooze is hit.
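For step 3, a minimal communication check could look something like the sketch below. This is only a rough sketch that assumes the same SerialRecord library we used in the recitations, with the Arduino sending a single value that Processing simply prints; if the number keeps updating, the two sides are talking.

// Minimal link test, assuming the SerialRecord library from the recitations
// and an Arduino sketch that sends one value per record.
import processing.serial.*;
import osteele.processing.SerialRecord.*;

Serial serialPort;
SerialRecord serialRecord;

void setup() {
  size(200, 200);
  String serialPortName = SerialUtils.findArduinoPort();
  serialPort = new Serial(this, serialPortName, 9600);
  serialRecord = new SerialRecord(this, serialPort, 1);  // expect one value
}

void draw() {
  background(0);
  serialRecord.read();
  println(serialRecord.values[0]);  // if this keeps updating, the link works
}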

Context and Significance:

Preparatory research helped me develop my idea: since I knew I was interested in working with LEDs, I narrowed my research to LED-based interactive projects and used those projects to shape one that solves a problem I have. I define interaction as the exchange of actions and reactions in the art system among participants. My project is interactive because the LED light sequence activates when the alarm triggers it, reacting to a participant in the circuit, and then responds to the alarm’s action by starting the “good morning” sequence. My project is based on the recitation we did in class, but it is unique in that it converts that work to the real world to solve a problem people struggle with: waking up. Subsequent projects could build on this by creating a more wireless-friendly circuit. My circuit will be built using a breadboard and an Arduino, which puts wires around the bed area and makes it a little dangerous and troublesome. As a real-world product, it would make more sense for it to be controlled via a cellphone rather than by Arduino and Processing on my laptop. If it could be made into an app, with the circuit built into the LEDs, the product would be finished; however, I cannot do this due to time constraints and a lack of app-coding knowledge. This is something to consider for subsequent projects.

Overall, I like the idea of my project as it addresses the real-world problem that the average human deals with, waking up. My project will wake the user up effectively in a more enjoyable way via a visually appealing “good morning” light sequence. This is a project I could see being put on the market as a real-world product, as its applications are vast. 


Serial Communication

November 22, 2022 by Morgan Somchanhmavong

For this recitation, the first few steps included building a circuit with two potentiometers and writing an Arduino sketch that reads their values and sends them serially. I didn’t have trouble building the circuit as it was a similar process to what we’ve done in previous recitations many times. I tested the connections by opening the serial monitor in Arduino and turning the potentiometers to see if the values were reactive to the potentiometers’ motion, meaning that the potentiometers were working. 

Once I knew I had a working circuit, I wrote a Processing sketch that draws a circle and reads the two analog values from Arduino. The sketch modifies the circle’s x and y positions based on the values sent from the potentiometers connected to the circuit.


The circle moved around based on the potentiometer values, but I had trouble creating lines from its movement. It was supposed to emulate an Etch A Sketch and create a drawing from the movement of plastic knobs, which require a similar motion to potentiometers.
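For reference, one way the etch-a-sketch effect could be achieved is sketched below. It assumes the same two potentiometer values arriving through SerialRecord as in my code further down: instead of clearing the background every frame, the sketch remembers the previous position and draws a short line segment from it to the new one, so the trail stays on screen.

// A sketch of one possible etch-a-sketch approach, assuming the same
// two-value SerialRecord setup as the code below.
import processing.serial.*;
import osteele.processing.SerialRecord.*;

Serial serialPort;
SerialRecord serialRecord;
float prevX, prevY;

void setup() {
  size(500, 500);
  background(0);
  stroke(255);
  prevX = width/2;
  prevY = height/2;
  String serialPortName = SerialUtils.findArduinoPort();
  serialPort = new Serial(this, serialPortName, 9600);
  serialRecord = new SerialRecord(this, serialPort, 2);
}

void draw() {
  serialRecord.read();
  float x = map(serialRecord.values[0], 0, 1024, 0, width);
  float y = map(serialRecord.values[1], 0, 1024, 0, height);
  line(prevX, prevY, x, y);  // the trail stays because background() is never redrawn
  prevX = x;
  prevY = y;
}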

Code:

Arduino Code:

#include "SerialRecord.h"
// Change this number to send a different number of values
SerialRecord writer(2);
void setup() {
Serial.begin(9600);
}
void loop() {
int sensorValue = analogRead(A0);
writer[0] = millis() % 1024;
writer[1] = sensorValue;
writer.send();
// This delay slows down the loop, so that it runs less frequently. This can
// make it easier to debug the sketch, because new values are printed at a
// slower rate.
delay(10);
}
 
Processing Code:

import processing.serial.*;
import osteele.processing.SerialRecord.*;

Serial serialPort;
SerialRecord serialRecord;

void setup() {
  size(500, 500);

  String serialPortName = SerialUtils.findArduinoPort();
  serialPort = new Serial(this, serialPortName, 9600);

  // If the Arduino sketch sends a different number of values, modify the number
  // `2` on the next line to match the number of values that it sends.
  serialRecord = new SerialRecord(this, serialPort, 2);
}

void draw() {
  background(0);

  serialRecord.read();
  int value1 = serialRecord.values[0];
  int value2 = serialRecord.values[1];

  float x = map(value1, 0, 1024, 0, width);
  float y = map(value2, 0, 1024, 0, height);
  circle(x, y, 20);
}

The recitation was fantastic in using the physical circuits to create visual drawings on our computers. I never thought of the parallel between the etch-a-sketch and the potentiometers’ motion. 
 
Task #2
My partner’s name is Shelly, and we worked together on Task #2, for which we had to make a ball bounce from the left to the right side of the screen. We also tried to use two servo motors to create a hand-slapping-like motion for a more interactive visual; however, we were only able to get one servo to work. We built the new circuit off the old one, connecting the servos to the existing breadboard.

 
We worked well together by dividing up the work. I focused more on the physical building of the circuit while Shelly did most of the coding. We would help each other out when we ran into problems. 
 
Arduino Code:
#include "SerialRecord.h"
#include <Servo.h>
Servo myservo;
Servo myservo2;
int step;
int pos;
// Change this number to the number of values you want to receive
SerialRecord reader(2);
void setup() {
// Servos
myservo.attach(9);
myservo2.attach(8);
myservo.write(0);
myservo2.write(0);
Serial.begin(9600);
}
void loop() {
if (reader.read()) { //Rudi helped and suggested
if (reader[0] == 1) {
for (pos = 0; pos < 120; pos += 3) {
myservo.write(pos);
delay(3);
}
for (pos = 120; pos > 0; pos -= 3) {
myservo.write(pos);
delay(10);
}
}
if (reader[1] == 1) {
for (pos = 0; pos < 120; pos += 3) {
myservo2.write(pos);
delay(3);
}
for (pos = 120; pos > 0; pos -= 3) {
myservo2.write(pos);
delay(10);
}
}
}
}
 
Processing Code:

import processing.serial.*;
import osteele.processing.SerialRecord.*;

Serial serialPort;
SerialRecord serialRecord;
int step=2;
int x=0;
void setup() {
fullScreen();
size(500, 500);
background(0);
noCursor();

String serialPortName = SerialUtils.findArduinoPort(2);
serialPort = new Serial(this, serialPortName, 9600);
serialRecord = new SerialRecord(this, serialPort, 2);
serialRecord.values[0] = 0;
serialRecord.values[1] = 0;

serialRecord.send();
}

void draw() {
background(0);

//float x = map(value1, 0, 1024, 0, width);
x+=step;
//if (x>=90|| x<=0){
//step=-1*step;
//}
float r=radians(x);
float move=sin(r)*width/2+width/2;

circle(move, height/2, 80);
serialRecord.values[0] = 0;
serialRecord.values[1] = 0;
if (x%360==180+step) {
serialRecord.values[0] = 1;
//serialRecord.send();
} else if (move==0) {
serialRecord.values[1] = 1;
//serialRecord.send();
}
serialRecord.send();
}

We found this task quite challenging. Mainly the coding aspect of getting the servos to do precisely what we wanted, such as range of motion and timing, was challenging. The initial task also gave me coding difficulties. I’ll need to practice and improve my coding knowledge to be more proficient during the recitations. 


Three Project Proposal

November 21, 2022 by Morgan Somchanhmavong

  • The Swipe Light

I enjoy working at my desk, whether I’m doing schoolwork, watching shows, or playing video games. However, my setup is a little dull and could use some upgrades. I suggest putting LED lights around my setup to make it look more appealing, but not just any LEDs: these will be “interactive LEDs” that use a motion sensor on the right side of the desk, near the mouse. The sensor will read my hand movements on the desk, such as swiping up and down, swiping left and right, or double-tapping the desk, and make the LEDs react differently to each. Specifically, swiping right will change the overall color of the LEDs, swiping left will bring back the previous color, and swiping up and down will increase or decrease the brightness. I think this project is aimed at younger people who enjoy having LEDs around their workspace, which I know many people do nowadays.

  • Visual Alarm

Oftentimes I have trouble getting out of bed in the morning or after a nap. I would like to use LEDs to create a visual that assists the user in waking up. The LEDs will work together with the alarm on the user’s phone to start a “good morning” visual: when the alarm goes off, the LEDs turn on, making the area brighter and harder to sleep in, which should help with waking up. The LEDs will also have other functions, such as a snooze mode. When you hit snooze on the alarm, the LEDs will stop the “good morning” visual and turn a soft color so the user knows the alarm is on snooze.

  • Talking Clock

The Talking Clock uses LEDs to display the time and other messages visually. The other messages could be what has to be done later in the day or the weather. The messages will slide across the screen, similar to the displays in banks and train stations, but this will be different in that it will interact with your iPhone. It will include an app with tasks, weather, and other things that need to be displayed: essentially, taking the virtual information and displaying it visually via LEDs.

Sketches:


 


Neopixel Music Visualization

November 15, 2022 by Morgan Somchanhmavong

The first step to the Neopixel Music Visualization recitation was gathering all the supplies. As shown in the picture, I needed my Arduino board, jumpers, power cable, laptop, and LED strip. 


I then used the jumper wires to connect the LED strip to the Arduino. I connected them accordingly: 5V to power, ground to ground (etc.)

 

After connecting the LED strip, I tested it by running the example code for FastLED. It worked, so I proceeded with the following steps. As shown in the picture below, the light strip functioned adequately.
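The test I ran was along the lines of FastLED’s basic Blink example; a minimal version looks roughly like this, assuming 60 LEDs with the data line on pin 3 as in the recitation code.

// A minimal strip test in the spirit of FastLED's Blink example,
// assuming 60 LEDs with the data line on pin 3.
#include <FastLED.h>

#define NUM_LEDS 60
#define DATA_PIN 3

CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS);
  FastLED.setBrightness(10);  // keep brightness low without external power
}

void loop() {
  leds[0] = CRGB::Red;    // light the first pixel
  FastLED.show();
  delay(500);
  leds[0] = CRGB::Black;  // turn it off again
  FastLED.show();
  delay(500);
}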

Afterwards, I worked on lighting up more LEDs on the strip. I did this by using the code we used in class. 

The Arduino Code:

/* This is a code example for Arduino, to be used on Recitation 7
  You need to have installed SerialRecord and FastLED libraries.
  It requires NeoPixel WS2812 at pin 3
  Interaction Lab
  IMA NYU Shanghai
  2022 Fall
  */
   
  #include "SerialRecord.h"
  #include <FastLED.h>
   
  #define NUM_LEDS 60 // How many leds in your strip?
  #define DATA_PIN 3 // Which pin are you connecting Arduino to Data In?
  CRGB leds[NUM_LEDS];
   
  // Change this number to the number of values you want to receive
  SerialRecord reader(4);
   
  void setup() {
  Serial.begin(9600);
  FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS); // Initialize
  FastLED.setBrightness(10); // BEWARE: external power for full (255)
  //further info at https://learn.adafruit.com/adafruit-neopixel-uberguide/powering-neopixels
  }
   
   
  void loop() {
  if (reader.read()) {
  int n = reader[0];
  int r = reader[1];
  int g = reader[2];
  int b = reader[3];
   
  leds[reader[0]] = CRGB(reader[1], reader[2], reader[3]); // Prepare the color information using CRGB( Red, Green, Blue
  FastLED.show(); // Pass the information of color to the LED
  }
  }
 


 

The Processing Code:

/* This is a code example for Processing, to be used on Recitation 7
  You need to have installed the SerialRecord library.
   
  Interaction Lab
  IMA NYU Shanghai
  2022 Fall
  */
   
  import processing.serial.*;
  import osteele.processing.SerialRecord.*;
   
  Serial serialPort;
  SerialRecord serialRecord;
   
  int W; //width of the tiles
  int NUM = 60; //amount of pixels
  int[] r = new int[NUM]; //red of each tile
  int[] g = new int[NUM]; //red of each tile
  int[] b = new int[NUM]; //red of each tile
   
  void setup() {
  size(600, 200);
  W = width/NUM;
   
   
  // You can use this syntax and change COM3 for your serial port
  // printArray(Serial.list());
  // serialPort = new Serial(this, "COM3", 9600);
  // in MacOS it looks like "/dev/cu.usbmodem1101"
  //or you can try to use this instead:
   
  String serialPortName = SerialUtils.findArduinoPort();
  serialPort = new Serial(this, serialPortName, 9600);
  serialRecord = new SerialRecord(this, serialPort, 4);
  serialRecord.logToCanvas(false);
  rectMode(CENTER);
  }
   
  void draw() {
  background(0);
  for (int i=0; i<NUM; i ++) {
  fill(r[i], g[i], b[i]);
  rect(i * W + W/2, height/2, 10, 10);
  }
   
  if (mousePressed == true) {
  int n = floor(constrain(mouseX/W , 0, NUM-1));
   
  r[n] = floor(random(255));
  g[n] = floor(random(255));
  b[n] = floor(random(255));
   
  serialRecord.values[0] = n; // which pixel we change (0-59)
  serialRecord.values[1] = r[n]; // how much red (0-255)
  serialRecord.values[2] = g[n]; // how much green (0-255)
  serialRecord.values[3] = b[n]; // how much blue (0-255)
  serialRecord.send(); // send it!
  }
   
  }

The code connected Processing with Arduino so that the LEDs reacted to how the user interacted in Processing. It allowed the user to control how many LEDs to turn on, and which ones to change color, via the Processing canvas. The next step was to get Processing to interact visually with music so that we could merge the Arduino and Processing code and make the LEDs react to music. To get the song I wanted, I took the link from YouTube, put it into an mp3 converter, and downloaded the song. I then dragged the file into the Processing “data” folder so it was available to the sketch.

My code:

#include "SerialRecord.h"
#include <FastLED.h>
#define NUM_LEDS 60 // How many leds in your strip?
#define DATA_PIN 3 // Which pin are you connecting Arduino to Data In?
CRGB leds[NUM_LEDS];
// Change this number to the number of values you want to receive
SerialRecord reader(4);
void setup() {
Serial.begin(9600);
FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS); // Initialize
FastLED.setBrightness(10); // BEWARE: external power for full (255)
//further info at https://learn.adafruit.com/adafruit-neopixel-uberguide/powering-neopixels
}
void loop() {
println(analysis.analyze());
background(125, 255, 125);
noStroke();
fill(255, 0, 150);
// analyze the audio for its volume level
float volume = analysis.analyze();
// volume is a number between 0.0 and 1.0
// map the volume value to a useful scale
float diameter = map(volume, 0, 1, 0, width);
// draw a circle based on the microphone amplitude (volume)
circle(width/2, height/2, diameter);
}
if (reader.read()) {
int n = reader[0];
int r = reader[1];
int g = reader[2];
int b = reader[3];
leds[reader[0]] = CRGB(reader[1], reader[2], reader[3]); // Prepare the color information using CRGB( Red, Green, Blue
FastLED.show(); // Pass the information of color to the LED
}
}
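The amplitude-analysis lines I had originally pasted into the Arduino sketch belong on the Processing side. Roughly, that side would look like the sketch below; this is only an outline of the idea, assuming “e.mp3” sits in the sketch’s data folder and the same SerialRecord setup as above, with louder moments lighting pixels further along the strip.

// Rough sketch of the Processing side of the merge, assuming "e.mp3" is in
// the data folder and the Arduino sketch above is listening for 4 values.
import processing.sound.*;
import processing.serial.*;
import osteele.processing.SerialRecord.*;

SoundFile sample;
Amplitude analysis;
Serial serialPort;
SerialRecord serialRecord;

void setup() {
  size(600, 200);
  sample = new SoundFile(this, "e.mp3");
  sample.loop();
  analysis = new Amplitude(this);
  analysis.input(sample);

  String serialPortName = SerialUtils.findArduinoPort();
  serialPort = new Serial(this, serialPortName, 9600);
  serialRecord = new SerialRecord(this, serialPort, 4);
}

void draw() {
  background(0);
  float volume = analysis.analyze();            // 0.0 - 1.0
  int n = floor(map(volume, 0, 0.6, 0, 59));    // louder music reaches pixels further along
  serialRecord.values[0] = constrain(n, 0, 59); // which pixel
  serialRecord.values[1] = 0;                   // red
  serialRecord.values[2] = 0;                   // green
  serialRecord.values[3] = 255;                 // blue
  serialRecord.send();
}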

Overall, the project wasn’t terrible. We did a lot of good practice during the lectures, though I got lost most of the time; putting it into practice during the recitation helped a lot. I still have trouble getting Processing to communicate properly with Arduino, and merging the code at the end was the most difficult step for me.
 


Preparatory Research and Analysis

November 15, 2022 by Morgan Somchanhmavong

My previous definition of interaction: When there is an exchange of actions, such as listening, thinking, and speaking, between participants. During this encounter, they will react to one another and then act based on their reactions. Based on the reading, interaction is defined as “communication between people through art systems.” This definition aligns with mine as there is an exchange of actions between participants to allow something to occur. 

An interactive LED beer pong table is a fascinating project related to mine. The table has LEDs all around it, but the main visual is in the middle, between the cups; it lights up, displaying various exciting patterns and game-related visuals, such as the score.

Another project is “Music Reactive Floor Lamps”: these are long LED poles that stand on the floor and can be placed anywhere in the room. They react to music, similar to what we did during the recent recitation when we made our LED strip react to the music we played. I think these are both very successful interactive experiences, as they use LEDs as the visual result of the interaction between devices and users. These projects have influenced my work in that I will be doing something similar, where interactions occur and LEDs show the result of that interaction.

Citations

Blake, Alan. “11 Stunning Led DIY Projects to Light up Your next Party.” MUO, 29 June 2022, https://www.makeuseof.com/stunning-led-diy-projects/.

Beer Pong Table:
https://www.youtube.com/watch?v=_OS11MW9bxU&t=10s
 
Floor Lamp:
https://www.youtube.com/watch?v=yninmUrl4C0&t=1033s


Animated Poster

November 7, 2022 by Morgan Somchanhmavong

For this exercise, I made a poster using Processing! I used shapes such as circles and ellipses to create the main image in the center. It’s supposed to represent an eye looking around. The text on the top left then explains all of the information about the IMA Fall End-of-Semester Event.

For the homework assignment, the first task was to create a fixed pattern. I made a series of circles evenly spread out and distributed across my canvas. The background was painted green using background(), and the circles were colored with random() values in fill(). Since the color of the circles was set to random, every time the code was rerun, the color of the circles would change, as shown in the video below.

 

I then took this pattern a step further and made the circles flash different colors in rapid succession. The color switching was very fast and emulated a disco ball. I had a lot of difficulty getting the colors of the circles to switch rather than the background; I learned later that the order of the code matters a lot, especially where the fill() function goes.

 

My code:

void setup() {
  size(1024, 768);
}

void draw() {
  fill(random(255), random(255), random(255));
  background(255);
  background(20, 131, 34);

  for (int rowY = 75; rowY <= height; rowY += 75) {
    drawCircleRow(rowY);
  }
}

void drawCircleRow(int rowY) {
  for (int circleX = 75; circleX <= width; circleX += 75) {
    ellipse(circleX, rowY, 50, 50);
  }
}

Finally, I had to take the same code and make it interactive, meaning that a key press, a mouse click, or anything like that would affect the canvas. For mine, I took the same code for the flickering circles and made it interactive by using keyPressed to change the color of the circles. This was challenging because the randomization of colors ended up limited to a single circle once I added the keyPressed check.

My Code:

void setup() {
  size(1024, 768);
}

void draw() {
  fill(random(255), random(255), random(255));
  background(255);
  background(20, 131, 34);

  for (int rowY = 75; rowY <= height; rowY += 75) {
    drawCircleRow(rowY);
  }
}

void drawCircleRow(int rowY) {
  for (int circleX = 75; circleX <= width; circleX += 75) {
    ellipse(circleX, rowY, 50, 50);

    if (keyPressed == true) {
      fill(0);
    } else {
      fill(255);
    }
    circle(25, 50, 50);
  }
}
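For comparison, one possible way to let the key press affect every circle, rather than a single one, is to choose the fill before each circle is drawn inside the loop. This is just a sketch of that idea, not the code I used:

// One possible fix: pick the fill before drawing each circle in the loop,
// so a held key recolors every circle rather than a single one.
void setup() {
  size(1024, 768);
}

void draw() {
  background(20, 131, 34);
  for (int rowY = 75; rowY <= height; rowY += 75) {
    drawCircleRow(rowY);
  }
}

void drawCircleRow(int rowY) {
  for (int circleX = 75; circleX <= width; circleX += 75) {
    if (keyPressed) {
      fill(0);                                      // any held key turns the circles black
    } else {
      fill(random(255), random(255), random(255));  // otherwise keep the flashing colors
    }
    ellipse(circleX, rowY, 50, 50);
  }
}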

Processing is a good program if you know how to use it properly. We used it for designing and animating a poster, which was a fun experience. However, I had a lot of trouble getting the code to do exactly what I wanted. Each number value has its purpose, whether it’s an x or y coordinate or a shape’s size; they’re all essential to understand when working in Processing.


Processing Basics

November 1, 2022 by Morgan Somchanhmavong

Above is the image I used for my inspiration. I decided to focus on the central, large Garfield for simplicity. I chose this image because it seemed the most doable out of all the pictures in my camera roll, and I wanted to draw the whole Garfield rather than focusing on a specific body part. I wanted to draw from top to bottom, starting with the head. I began by drawing two ellipses for the head and body, then added four more ellipses for the eyes: two per eye, because I wanted to add smaller white ellipses inside them for detail. I used a red circle for the mouth/nose, since I wasn’t sure how to create the upward shape shown on the actual Garfield. I then decided to use rectangles with curved edges for the limbs. I planned on adding the ears last, using the triangle shape, because they were the hardest thing to add; I didn’t have time to figure out how, so my drawing doesn’t have ears.

I feel like the body’s composition of shapes and colors is similar; unfortunately, everything else is different. It didn’t turn out how I had hoped, but it gave me more insight into how to use Processing.

The sketch I made was freehand and more similar to the actual picture; after realizing some shapes would be challenging to make in Processing, I modified the sketch into more simplified shapes.

I’m indifferent about whether drawing in Processing was an excellent means of realizing my design: I think it’s a good program for drawing, but it’s kind of complicated to use, and I had lots of trouble when drawing my Garfield.

My code:

void setup() {
  size(600, 600);
}

void draw() {
  fill(#E3941E);
  ellipse(300, 200, 200, 225);
  ellipse(300, 350, 225, 230);
  fill(255);

  fill(#050100);
  ellipse(260, 150, 25, 35);
  ellipse(340, 150, 25, 35);
  fill(255);
  ellipse(260, 139, 20, 10);
  ellipse(340, 139, 20, 10);
  fill(#E0230E);
  ellipse(300, 200, 30, 30);
  fill(255);
  fill(#E3941E);
  rect(350, 300, 80, 220, 40);
  rect(160, 300, 80, 220, 40);
  fill(225);

  fill(#E3941E);
  rect(200, 400, 80, 220, 40);
  rect(280, 400, 80, 220, 40);
  fill(225);
}

My sketch:


 

Overall, I had a decently good experience using processing at this recitation, and I feel like I learned many of the basics. The drawing doesn’t look like my picture or sketch because of my coding abilities and not the program. 


