Errors: I lost all my materials and shaders after migrating the final portal into the class's shared project. The overall shape of the portal had also changed. I started going through the blueprints and reconnecting the materials that had errors, to no avail.
Update: I came to class early to discuss the issue with Todd. He had me delete my asset from the class level and re-upload the portal as its own level. The portal now works much better, but there are still some connection issues. I will report back after the shaders finish building.
OSC
I had Mac problems this week, so I worked with Philip to get OSC up and running. We connected a bang message from Max to trigger the shockwave. One thing we noticed was a delay in the connection between Max and Unreal; it turned out a setting that conserves CPU power was throttling it. Once we shut that off, the connection was live and working well.
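For reference, an OSC "bang" is just a tiny binary UDP message. Here is a minimal sketch of how one could be packed by hand in Python, per the OSC 1.0 format; the `/bang` address and port 8000 are assumptions for illustration, not what our Max patch actually used:

```python
import socket
import struct

def osc_pad(s: bytes) -> bytes:
    """Null-terminate and pad a string to a 4-byte boundary, per OSC 1.0."""
    s += b"\x00"
    while len(s) % 4 != 0:
        s += b"\x00"
    return s

def osc_message(address: str, value: int) -> bytes:
    """Pack an OSC message with a single int32 argument."""
    return (osc_pad(address.encode()) +
            osc_pad(b",i") +               # type-tag string: one int32
            struct.pack(">i", value))      # big-endian int32 payload

# Fire a "bang" at a hypothetical OSC listener on localhost:8000.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/bang", 1), ("127.0.0.1", 8000))
```

In practice a library like python-osc (or Max's own udpsend) does this packing for you; the point is just that each bang is a single small datagram, so any lag we saw was the machine, not the protocol.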
This week Unreal tested me, and I kept running into errors that led to hours of YouTube tutorials, documentation, office hours, and bothering classmates. Things finally started to come together at the end of the week, and I’m getting to a place where I’m happy.
Problems I’m still having:
When I press Play, the camera gets shot into space, and I can’t move.
My laptop keeps crashing – I ended up working on the production computers on campus.
I don’t fully understand the error “Texture streaming pool over budget.”
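From what I've read, "Texture streaming pool over budget" means the textures currently in view need more memory than the streaming pool allows, so Unreal starts showing blurry mips. A common remedy (the 2000 MB figure below is just an example value, not a recommendation for my scene) is raising the pool size, either with the console command `r.Streaming.PoolSize 2000` or in the project's config:

```
; DefaultEngine.ini
[/Script/Engine.RendererSettings]
r.Streaming.PoolSize=2000
```

Reducing texture resolutions would be the more permanent fix, since this just trades the warning for more memory use.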
Successes:
I isolated my world to a small portion of the map – I was getting overwhelmed trying to fill the map.
I was able to import a height map of Adirondack Park to give my world shape.
I love how my new world is shaping up and am excited to continue working on it.
Week 1 – We’re diving into Unreal tutorials and cleaning up our computers. I am still in the process of building my PC, so I’ve borrowed a laptop from the ER for the time being. After some tidying up, I have gotten the computer up to speed.
This semester-long project was in collaboration with Philip Cadoux, Pauline Ceraulo, and Sean Zhu. We were tasked with using actual research to create an interactive and educational museum installation.
The project is based on the research entitled “Centennial response of Greenland’s three largest outlet glaciers” by David and Denise Holland of the Courant Institute at NYU and the Center for Global Sea Level Change at NYU Abu Dhabi.
Our installation’s focus was to introduce the viewer to how this complex research is conducted, touching on climate models, chaos theory, and tipping points.
Activation 1: Double Pendulum – Demonstrating the fundamentals of chaos theory.
Chaos Theory – What do viewers need to know?
One action can have a large-scale, snowballing effect on the larger system.
There are patterns in the chaos that we can use to estimate weather changes/events.
Glacial melting is a phenomenon closely tied to weather and chaos theory.
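To make "one action can have a large-scale, snowballing effect" concrete: two double pendulums whose starting angles differ by a millionth of a radian end up in completely different states within seconds. Below is a rough numerical sketch of that sensitivity; the unit masses/lengths and fixed-step RK4 integrator are my own assumptions for illustration, not our installation's code:

```python
import math

def derivs(s, g=9.81, m1=1.0, m2=1.0, l1=1.0, l2=1.0):
    """Equations of motion for a double pendulum; s = (th1, w1, th2, w2)."""
    th1, w1, th2, w2 = s
    d = th1 - th2
    den = 2*m1 + m2 - m2*math.cos(2*d)
    a1 = (-g*(2*m1 + m2)*math.sin(th1) - m2*g*math.sin(th1 - 2*th2)
          - 2*math.sin(d)*m2*(w2*w2*l2 + w1*w1*l1*math.cos(d))) / (l1*den)
    a2 = (2*math.sin(d)*(w1*w1*l1*(m1 + m2) + g*(m1 + m2)*math.cos(th1)
          + w2*w2*l2*m2*math.cos(d))) / (l2*den)
    return (w1, a1, w2, a2)

def rk4_step(s, dt):
    """One fixed-step Runge-Kutta 4 integration step."""
    k1 = derivs(s)
    k2 = derivs(tuple(x + 0.5*dt*k for x, k in zip(s, k1)))
    k3 = derivs(tuple(x + 0.5*dt*k for x, k in zip(s, k2)))
    k4 = derivs(tuple(x + dt*k for x, k in zip(s, k3)))
    return tuple(x + dt/6*(a + 2*b + 2*c + d)
                 for x, a, b, c, d in zip(s, k1, k2, k3, k4))

def separation(delta=1e-6, t=15.0, dt=0.001):
    """Angle difference after t seconds for two nearly identical starts."""
    a = (math.pi/2, 0.0, math.pi/2, 0.0)           # both arms horizontal
    b = (math.pi/2 + delta, 0.0, math.pi/2, 0.0)   # nudged by a millionth of a radian
    for _ in range(int(t / dt)):
        a, b = rk4_step(a, dt), rk4_step(b, dt)
    return abs(a[0] - b[0])
```

Running `separation()` shows the millionth-of-a-radian nudge growing by orders of magnitude, which is exactly the behavior viewers see when two spins of the physical pendulum never repeat.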
Production of Double Pendulum:
Original CAD drawing
Playtesting
After building a full-scale finished piece, we discovered flaws in our design. We needed more bearings to reduce friction, a tighter fit for the bearings so they wouldn’t shift while rotating, and, finally, shoulder bolts instead of regular bolts to further reduce friction.
The second round of construction added more bearings and a more precise fit.
Finishing work.
Finishing work II.
Final Piece:
The final piece has a battery-powered Arduino attached to each arm. Each Arduino uses an accelerometer to detect orientation and sends the data via Bluetooth to our site, presenting the viewer with their own chaotic-system drawing. The viewer can capture the drawing on their phone via a QR code that pops up.
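The accelerometer-to-orientation step boils down to trigonometry: with gravity as a fixed reference, an arm's tilt can be recovered from two acceleration axes. A minimal sketch of that conversion (the axis naming is my assumption here; the real device ran this on the Arduino before sending over Bluetooth):

```python
import math

def tilt_degrees(ax: float, ay: float) -> float:
    """Arm angle in degrees from two accelerometer axes, using gravity as reference."""
    return math.degrees(math.atan2(ay, ax))

# With gravity reading entirely along +y, the arm measures 90 degrees.
```

The site then just accumulates these angle pairs over time and traces them out as the viewer's drawing.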
Drawing Examples:
Activation 2: Tipping Point
Our goal is to demonstrate the seriousness of tipping points. Once a system reaches any of these thresholds, it can be pushed into an entirely new and potentially irreversible state.
This activation is a carnival-influenced web game similar to High Striker, where the player uses frantic button pressing against a difficult system to score points. The game can be played here.
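The "difficult system" is essentially a meter that fights back: each press adds energy, the meter constantly decays, and only frantic pressing crosses the threshold. Here is a toy version of that loop; all of the numbers and names are made up for illustration and are not our game's actual tuning:

```python
def simulate(presses_per_second: float, seconds: float = 10.0,
             gain: float = 1.0, decay: float = 0.9, threshold: float = 30.0,
             dt: float = 0.1) -> bool:
    """Return True if button mashing pushes the meter past the tipping threshold."""
    meter = 0.0
    for _ in range(int(seconds / dt)):
        meter += presses_per_second * dt * gain  # energy added by pressing
        meter *= decay ** dt                     # the system constantly pushes back
        if meter >= threshold:
            return True                          # tipping point crossed
    return False
```

The decay term is what makes it feel like a tipping point rather than a plain score counter: a casual press rate settles into a stable equilibrium below the threshold, and only a sustained frantic rate escapes it.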
For this project, we were tasked with creating a live sculpture using AR or VR. I chose to wrestle with Unity and Vuforia to produce an AR experience revolving around two of my favorite things in NYC: natural spring flowers and convenience-store flower stands.
As much as I don’t like single-use plastic, there is something beautiful about the flowers wrapped in plastic and the plastic curtains draped in front of the stores. These dying plants are being preserved as much as possible. I don’t know; I’ve always been drawn to them.
I wanted to use the idea of plastic wrap and preservation to “protect” the flowers blooming on the trees around Brooklyn. I also wanted to experiment with the flowers at the convenience store.
First, I set out to photograph the flowers.
Back home, I took these images and made collages in Photoshop. To complete the final layout, I added a layer of “plastic” wrap.
Next, I built my scene in Unity. The goal for the AR experience was to have dead flowers trigger the sculpture. Luckily, I had dead flowers in my apartment. Once triggered, the screen fills with the sculpture. I wanted the user to have to rotate the phone and look around to experience the whole piece.
Problems faced:
Getting the Unity build onto my phone. For some reason, every time I exported the build, it would create a folder, but the folder was empty.
Lighting on one of the faces. The sculpture works, but the lighting on one of the planes is always dark. I was unable to figure out that problem.
I thought the trigger would still activate on the actual live flowers if they were lined up the same way as the image. That experiment failed.
In the future:
Now that I’ve built this, I think it needs a sound element. I should record the ambient sound of the neighborhoods. I’d also like the trigger to be the actual dead flowers, not an image of them.
For this week’s assignment, I decided to explore image deterioration, coupling it with the topic of rainforest destruction, a slightly heavier topic than I’ve been exploring the last couple of weeks.
The original image of a healthy forest is easily exploded and destroyed by any movement of the mouse; the user must take time and focus to guide the cursor back to 0,0 to put the whole image back together again. Beneath the exploded image is another image of the forest post-destruction. The user can blend the two images by getting the mouse close to 0,0.
Code:
int R, G, B, A; // you must have these global variables to use PxPGetPixel()
PImage ourImage;
PImage ourImage2;
void setup() {
size(1500, 800);
frameRate(120);
ourImage = loadImage("https://images.takeshape.io/852126ce-462c-40d9-ae71-1a2c96e82d8f/dev/2925f6d3-c6c3-42e5-89d9-8cf91ed3d28f/Kluet-peat-swamp-forest%2C-Leuser-Ecosystem.-Photo-by-Paul-Hilton%2C-RAN.jpg?auto=compress%2Cformat");
ourImage.resize(width, height);
ourImage.loadPixels(); // load the pixels array of the image
ourImage2 = loadImage("https://media.wired.com/photos/59372bbfd80dd005b42b626f/master/w_2560%2Cc_limit/AP4997094644081.jpg");
ourImage2.resize(width, height);
ourImage2.loadPixels();
}
void draw() {
background (139,0,0);
image(ourImage2,0,0);
loadPixels(); // load the pixels array of the window
for (int x = 0; x<width; x++) {
for (int y = 0; y<height; y++) {
PxPGetPixel(x, y, ourImage.pixels, width); // get the RGB of our pixel and place in RGB globals
int destinationX = (x * mouseX) % width; // wrap so we never write outside the window's pixels[]
int destinationY = (y * mouseY) % height;
PxPSetPixel(destinationX, destinationY, R, G, B, 255, pixels, width); // sets the R,G,B values into the window
}
}
updatePixels(); // must call updatePixels once we're done messing with pixels[]
println (frameRate);
}
// our function for getting color components; it requires that you have global variables
// R,G,B (not elegant, but the simplest way to go; see the PxP methods-in-object example
// for a more elegant solution)
void PxPGetPixel(int x, int y, int[] pixelArray, int pixelsWidth) {
int thisPixel=pixelArray[x+y*pixelsWidth]; // getting the colors as an int from the pixels[]
A = (thisPixel >> 24) & 0xFF; // we need to shift and mask to get each component alone
R = (thisPixel >> 16) & 0xFF; // this is faster than calling red(), green() , blue()
G = (thisPixel >> 8) & 0xFF;
B = thisPixel & 0xFF;
}
// our function for setting RGB color components into the pixels[]; we need to define the XY of where
// to set the pixel, the RGB values we want, and the pixels[] array we want to use and its width
void PxPSetPixel(int x, int y, int r, int g, int b, int a, int[] pixelArray, int pixelsWidth) {
a = a << 24;
r = r << 16; // we are packing all 4 components into one int
g = g << 8;  // so we need to shift them to their places
color argb = a | r | g | b; // binary "or" operation combines them all into one int
pixelArray[x+y*pixelsWidth] = argb; // finally we set the int with the colors into the pixels[]
}
For our final assignment, we’re working in Node-Red to create a workflow with sensor data collected over the semester.
Earlier this semester, I set up a simple Arduino-based device in my apartment that collects temperature and humidity throughout the day. This week in Node-Red, I created a workflow that sends a text-message alert over MQTT whenever the temperature gets over 70 degrees. Below is a screenshot of my workflow and the text message I received notifying me.
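The core of the flow is a threshold check with a latch, so one warm afternoon doesn't fire dozens of texts. Here is a sketch of that logic in Python; in Node-Red the equivalent lives in a function node between the MQTT-in and notification nodes, and all names and numbers here are mine for illustration:

```python
class TempAlert:
    """Fire once when a reading crosses the threshold; re-arm after it cools down."""

    def __init__(self, threshold: float = 70.0, rearm_below: float = 68.0):
        self.threshold = threshold
        self.rearm_below = rearm_below   # hysteresis gap so we don't spam alerts
        self.armed = True

    def check(self, temp_f: float) -> bool:
        if self.armed and temp_f >= self.threshold:
            self.armed = False           # latch until the temperature drops again
            return True                  # caller sends the MQTT/SMS alert here
        if temp_f <= self.rearm_below:
            self.armed = True            # cooled off; ready to alert again
        return False
```

The two-degree gap between the alert threshold and the re-arm point is what keeps a reading hovering right at 70 from triggering on every sample.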
I’m really excited to have gotten this system up and running. I can see the power of these kinds of data pipelines and have started brainstorming new applications to build on them.
I had a great time designing this poster for the ITP Winter Show. My original sketches morphed significantly, but I embraced the shift and came up with something I’m really excited about. I also learned that stepping back from the design process and relaxing can open you up to new ideas. I struggled with what to put in the center of the poster, playing with images and shapes, but nothing worked for me. It wasn’t until I took a break to watch an episode of The Great British Bake Off that the globe-and-wire idea came to mind. Without finishing the show, I headed to my computer to finish the design.
Design Breakdown:
The background consists of random patterns that I drew and then pixelized. The primary purple color is the official NYU purple that I pulled from the school’s style guide. The other purples are variants on the NYU hue.
The globe in the center is wrapped in a thin wire. I was thinking about our program’s diversity and how we all can’t be together to share our work. The work is literally coming from all corners of the world. Technology (represented by wire) is bringing and holding us together.
I searched the Adobe Fonts site for a futuristic-style font, which I think works really well with the background pixelation.
After going over the week’s tutorials, I felt confused. I understood the fundamentals but couldn’t think of a project to do. I re-watched the tutorials and searched the web for some inspiration but was blocked. Finally, I just started sketching in my notebook, reminding myself to come up with a simple program I could use to test things out. It’s not the most exciting script, but I came up with a thatch-type grid of endlessly repeating colorful ellipses.
Here is my initial sketch.
After a great deal of trial and error, I was able to hobble my way to a completed sketch based on my drawing with a few modifications.
I wanted half of the ellipses to go up and half to go down.
I wanted the lines to turn around once they’ve reached the opposite side.
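The "turn around at the opposite side" behavior is a classic bounce: flip the direction whenever the position passes an edge. A minimal sketch of that per-frame step (my own simplified version, not the sketch's actual variables):

```python
def step(pos: float, vel: float, low: float, high: float):
    """Advance one frame; reverse direction when the position passes an edge."""
    pos += vel
    if pos > high:
        pos, vel = high, -vel    # turn around at the far side
    elif pos < low:
        pos, vel = low, -vel     # turn around at the near side
    return pos, vel
```

In the Processing sketch the same idea applies per line: keep a velocity for each one, add it every `draw()`, and negate it at the edges.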
Because I tried so many different approaches and saved them separately, my final code became really confusing, with oddly named variables and dead code.
It was important for me to work on this alone to figure out what I really need to work on. This proved to be a very frustrating challenge.
Below are the sketches I saved throughout this process.