Week 3: VR Experience Review – Alex Wang

The 3 VR titles that I experienced are Imagine Dragons – Whatever It Takes (a VR music video), street performers in Washington Square Park, and BBC Home – a VR spacewalk.

VR Music Video:

The VR music video for the Imagine Dragons song was very interesting. The visuals were computer generated, which is exactly the kind of imagery I wanted to experience in virtual reality. Computer-generated graphics are compelling in VR because, even though they look nothing like the real world, adjusting the two images to match the disparity of our eyes, as if we were viewing real objects, makes them feel real: they fool the eyes into perceiving depth even though each image is really only two-dimensional. It is a really cool feeling, as if the user is inside a cartoon world.
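The depth trick can be sketched with a bit of geometry. This is my own simplification, not something from the video: the `screenDisparity` function and its similar-triangles formula are assumptions, just to show how the horizontal offset between the two eyes' images depends on a point's depth.

```javascript
// Approximate horizontal disparity on the display plane for a point at
// depth z, given interpupillary distance ipd and viewer-to-screen
// distance d (all in the same units). A point exactly on the screen
// plane needs zero disparity; farther points need larger offsets,
// approaching the full ipd at infinity.
function screenDisparity(ipd, d, z) {
  // similar triangles between the eyes' baseline and the screen plane
  return ipd * (z - d) / z;
}

// A point twice as far away as the screen needs half the ipd as offset.
console.log(screenDisparity(0.064, 1, 2)); // 0.032
console.log(screenDisparity(0.064, 1, 1)); // 0
```

A stereo renderer effectively does this per object: it shifts the left and right images by the disparity that a real object at that depth would produce, and the brain reconstructs the depth.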

Street Performers in Washington Square Park:

Just like the Imagine Dragons video, this is also a music-based VR experience, but unlike the previous title it is not computer generated; it was filmed with a 360° camera. I really like the idea of these kinds of videos because they use VR technology to take you somewhere else and let you experience something as if you were really there. However, I did not really enjoy this video because it was poorly filmed. It seems to be an amateur video: the camera and musicians are positioned in a way that is uncomfortable for the audience. The video and audio quality is also bad, though that might be because the video was streamed rather than downloaded.

BBC Home – VR Spacewalk:

This was the best VR experience I had out of the three. It was very well made; the graphics looked quite real for something modeled on a computer. But the best part about this experience is that the theme of the project made perfect sense for the VR medium, because VR is the only way for the viewer to feel like a real astronaut. If this were an ordinary game or video, there would have been no way for the audience to be as engaged or to have such an immersive space experience. Near the end of the game, a small explosion sends the player spinning out into space. The visuals alone were able to disorient the player and induce nausea.

Reflections:

I think all the VR titles I experienced are interesting in their own ways; the power of VR to give the user an experience that cannot be achieved through any other form of media is very impressive. I also realized how much work goes into the filming of VR videos, and that they can be really hard to get right without a proper understanding of the fundamentals.

MLNI Week 3: P5 OOP Generative Art – Alex Wang

Task:

Create a piece of generative art that utilizes object-oriented programming concepts and interactivity with the mouse and/or keyboard.

Final Product:

Progress:

I started by creating a class called Spark, basically an object that displays as an ellipse. I created a simple physics system where each ball's location is updated according to its velocity and acceleration, which are variables unique to every object.

Initial code:

var array = [];

function setup() {
  createCanvas(400, 400);
  array[0] = new Spark(1, 200, 0, 1, 0);
}

function draw() {
  background(220);
  for (var i = 0; i < array.length; i += 1) {
    array[i].gravity();
    array[i].motion();
    array[i].show();
  }
}

class Spark {
  constructor(iteration, posx, posy, vx, vy) {
    this.iteration = iteration;
    this.posx = posx;
    this.posy = posy;
    this.vx = vx;
    this.vy = vy;
  }

  gravity() {
    this.vy += 1;
  }

  motion() {
    this.posx += this.vx;
    this.posy += this.vy;
    if (this.posx <= 0) {
      this.vx *= -1;
    }
    if (this.posx >= 400) {
      this.vx *= -1;
    }
    if (this.posy <= 0) {
      this.vy *= -1;
    }
    if (this.posy >= 400) {
      this.vy *= -1;
    }
  }

  show() {
    fill(255, 0, 0);
    ellipse(this.posx, this.posy, 40 / this.iteration);
  }
}

Then I wrote a function called burst(), so that the ball would split into two upon collision with a wall.

The code instantly crashed, and I lost all my work because I hadn't saved in the web editor. At first I thought the splitting required too much computation for the computer, but it turned out I was creating a new Spark object from within the Spark class, which is problematic. So I created a function called burstt() outside of the class, and used the inner burst() method to pass information out for the outer function to perform the actual bursting operation.
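A minimal sketch of that workaround, in plain JavaScript with hypothetical names (Particle, split, splitParticle here stand in for my actual Spark, burst, and burstt): the class method only records where the split should happen, and a function outside the class performs the actual spawning, iterating over a snapshot of the array length so the newly added objects are not split again in the same pass.

```javascript
var particles = [];
var lastLoc = []; // scratch space the method writes into

class Particle {
  constructor(generation, x, y) {
    this.generation = generation;
    this.x = x;
    this.y = y;
  }
  // The method does NOT construct new Particles itself;
  // it only records where the split should happen.
  split() {
    lastLoc = [this.x, this.y];
    this.generation += 1;
  }
}

// The actual spawning happens outside the class.
function splitParticle(p) {
  p.split();
  particles.push(new Particle(p.generation, lastLoc[0], lastLoc[1]));
}

particles.push(new Particle(1, 200, 0));
// Snapshot the current length so particles spawned during the
// loop are not themselves split in this pass.
var n = particles.length;
for (var i = 0; i < n; i += 1) {
  splitParticle(particles[i]);
}
console.log(particles.length); // 2
```

The key design choice is the same as in my final code: mutation of the shared array happens in one place, outside the class, so a split can never recursively trigger more splits mid-iteration.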

After getting the splitting to work, I added parameters to the Spark class, with an iteration variable indicating which generation a spark belongs to. I had the sparks change color and shrink in size according to their current generation, and also set up a condition for sparks beyond a certain iteration to stop updating.

Finally, I decided to add more interactivity to the sketch by switching the burst function from triggering on wall collision to triggering on mouse click. I also changed the direction of gravity from pointing down to pointing toward the mouse, which ended up looking very visually pleasing and fun to interact with.

Video:

Final Code:

var lastLoc = [];
var array = [];
var maxN = 100000;
var counter = 0;

function setup() {
  createCanvas(500, 500);
  array[0] = new Spark(1, 200, 0, 1, 0);
}

function draw() {
  background(20);
  for (var i = 0; i < array.length; i += 1) {
    if (array[i].alive()) {
      array[i].gravity();
      array[i].motion();
      array[i].show();
    }
  }
}

function mousePressed() {
  // split every spark that is still alive
  var temp = array.length;
  for (var i = 0; i < temp; i += 1) {
    if (array[i].alive()) {
      burstt(array[i]);
    }
  }
}

function burstt(obj) {
  if (counter < maxN) {
    obj.burst();
    append(array, new Spark(counter + 1, lastLoc[0], lastLoc[1], random(-60, 60), random(-60, 60)));
    counter += 1;
  }
}

class Spark {
  constructor(iteration, posx, posy, vx, vy) {
    this.iteration = iteration;
    this.posx = posx;
    this.posy = posy;
    this.vx = vx;
    this.vy = vy;
    this.color = [random(250) / iteration, random(255), random(50) * iteration];
  }

  burst() {
    lastLoc = [this.posx, this.posy, this.vx, this.vy];
    this.iteration += 1;
    this.color = [random(239), random(255), 10 + random(8) * this.iteration];
    this.vx = random(-4, 4);
  }

  gravity() {
    this.vx *= 0.99;
    this.vy *= 0.99;
    // gravity points toward the mouse
    if (this.posy < mouseY) {
      this.vy += 2;
    } else if (this.posy > mouseY) {
      this.vy -= 2;
    }
    if (this.posx < mouseX) {
      this.vx += 2;
    } else if (this.posx > mouseX) {
      this.vx -= 2;
    }
  }

  motion() {
    this.posx += this.vx;
    this.posy += this.vy;
    if (this.posx <= 0) {
      this.vx *= -1;
      this.posx = 1;
    }
    if (this.posx >= width) {
      this.vx *= -1;
      this.posx = width - 1;
    }
    if (this.posy <= 0) {
      this.vy *= -1;
      this.posy = 1;
    }
    if (this.posy >= height) {
      this.vy *= -1;
      this.posy = height - 1;
    }
  }

  show() {
    if (this.iteration < 23) {
      fill(this.color[0], this.color[1], this.color[2]);
      ellipse(this.posx, this.posy, 100 - 4 * this.iteration);
    }
  }

  alive() {
    return this.iteration <= 22;
  }
}

Reflection:

I really like the concept of OOP; it is a great way to organize code and is really convenient to work with when creating things such as generative art or video games.

The biggest challenge I had during this project was the program crashing every time I tried to split the object. The program does not treat it as a syntax error; it just keeps running until it crashes.

I think changing the direction of gravity according to the mouse position was a very nice change. It was really easy to implement since I had already created the physics system, and it looks very good visually: it somewhat follows the physics of real life, yet it is at the same time impossible in real life, because nothing has gravity as strong as what I gave the mouse. So the interaction feels natural and new at the same time.

MLNI Week 2: ML/AI Case Study – Alex Wang

AI Technology in music production

Intro:

Lately I have gotten really interested in music production, and I realized that many of the new technologies available for music production are actually powered by machine learning!

Phases of music production:

There are multiple phases in the production of a song. A composer writes the song, a producer then creates it in a digital audio workstation (DAW), and an audio engineer then mixes and masters the track to perfect its dynamics and clarity.

The role of an audio engineer:

The audio engineer actually plays a really big role in the making of a song. They are not required to have any knowledge of song structure, music theory, or instrumental skills, yet they are the ones who make the song sound perfect without creating anything themselves. An audio engineer can spend a whole lifetime perfecting the craft, as it is a very complicated job.

AI as audio engineer:

AI is very good at replacing humans in areas that require skill but not creativity, so music mastering is exactly what people have been working to hand over to AI. You do not need to create anything, but you need very good ears and experience to perfect a song.

I was at a talk by Kai-Fu Lee here at NYUSH a year ago, where he discussed the types of jobs that are easily taken over by AI versus the ones that need a more human element. Audio engineers, even though they are highly respected and take years to hone their skills, hold a job that does not require creativity or human compassion, which is why they are currently a target for replacement by technology.

Landr/Ozone:

Existing services like Landr and Ozone are already using AI to master tracks for you. With these convenient services, along with the advancements making DAWs and samples more accessible, it is actually really easy to get started making professional-sounding music nowadays.

https://www.youtube.com/watch?v=43Uad9C6LeQ

Articles:

WILL AI REVOLUTIONIZE THE WAY ARTISTS MAKE MUSIC AND FIND SAMPLES? LANDR SAYS YES…

Landr raises $26 million for AI-powered music creation tools

“Original” work with python Neural composer + Izotope Ozone mastering:

I made a song using melodies composed by a Python neural network: I did the producer part of the creation process, while Ozone's AI did the mastering of the track.

Reflection:

Over the past few years I have realized that AI is capable of not only replacing humans in physical labor, but also being an amazing resource in the world of the arts. Even though it might not yet be capable of replacing artists, it is now a very strong tool for assisting in the making of quality products. It is closing the gap between a professional musician, with an expensive studio, professional gear, and a whole team of producers and audio engineers, and any ordinary person with a laptop.

Week 1: 16 Lessons for a VR-First Future – Alex Wang

After reading the 16 Lessons for a VR-First Future, I gained a lot of insight into what VR technology will be capable of in the future. I agree with most of the points made in the article, but if I were to pick my favorite/most agreed-upon point, it would be that VR will play a bigger role than AR in the future, which I believe many people would actually disagree with.

Most Agree: 2. VR may play a bigger role in our future lives than AR

I believe that AR is a very promising technology and will play a bigger role than VR in the near future, as soon as the technology matures. However, I believe VR will ultimately surpass AR, because AR technology has many limitations. As discussed in class, it is very complicated to achieve realistic AR due to how our eyes perceive light. Another reason I believe VR will play a bigger role in the future is the possibility of brain-reading technology. As humans learn more about neuroscience, there is already research into reading the brain using techniques such as tracking blood flow. Once these kinds of technologies are refined, VR could reach a whole new level of immersion.

Most Disagree: 5. Virtual Schools will democratize high quality education to the world

The author argues that VR can make learning resources more accessible, and I do not think this is true. I believe the spread of knowledge is already very accessible through current sources such as online video platforms like YouTube, which some would even argue are a better source than traditional school education. However, VR can help with some aspects of education in terms of experience. Being able to witness certain things with a high level of immersion can enhance the student's engagement with the subject. Examples include watching historical events, scientific experiments (there are already VR applications that simulate the creation of chemical molecules), and even experiencing live music. These experiences are all very valuable, and they are all made accessible through VR technology.

Thoughts on mirror world:

I think the mirror world is a very interesting concept, and I believe it does a great job of capturing the qualities of augmented reality. I am also very excited about the endless potential of a refined AR technology, and the amount of information and utility it could bring to our lives.

Three catalogs I experienced:

Music video by Imagine Dragons

Live music in Washington Square Park

NASA space experience

Thoughts on the VR experience at IT:

I do not personally own a VR headset, so this experience was very exciting for me. I was able to feel what it is like to be an astronaut in space, swim in the ocean with sea creatures, and even watch a live performance in Washington Square Park. This session was very helpful in giving me an idea of what VR is capable of, and also what types of projects succeed in this medium.

MLNI Week 2: P5 Drawing Functions – Alex Wang

Task:

Develop a simple sketch utilizing display and transformation functions in p5. This exercise is to practice the functions we have learned.

Final Product:

Progress:

I started this assignment by creating a basic rectangle that rotates around the center. I then improved it by creating two more rectangles, each with transparency and different RGB values for their strokes and fills.

Initial code:

stroke(0, 10, 200, 90);
fill(0, 0, 255, 90);
translate(width / 2, height / 2);
rotate(radians(frameCount));
rect(-50, -40, 100, 80);

translate(-width / 2, -height / 2);
stroke(200, 200, 10, 70);
fill(255, 255, 0, 70);
rect(width / 2, height / 2, 100, 80);

stroke(200, 0, 0, 70);
fill(255, 0, 0, 70);
rect((width / 2) - 100, (height / 2) - 80, 100, 80);

I then wrapped the whole block in push() and pop() calls and made it its own function called spin().

After creating the spin function, I used a for loop to call it multiple times, increasing the rotation speed with every iteration of the loop.

I can run the for loop for as many iterations as I want, so I can also make a very complicated flower. This is what it looks like with 20 loops.

The colors are random, but they keep updating and flashing, since the random values are generated inside p5's draw() loop. I wanted the colors to be random but then stay fixed, so I created an array at the top of the code to store the random values. I also limited the range of randomness so the flower would be variations of the same color.

Code:

var alist = [];

function setup() {
  for (let num = 0; num < 100; num += 1) {
    r = random(205);
    g = random(30);
    b = random(255);
    append(alist, [r, g, b]);
  }
}

This is a blue flower. I decided to make it more visually appealing by expanding the range of the red RGB value to add a hint of purple.

Final Code:

var hi = [];

function setup() {
  // store one random color per layer so the colors stay fixed
  for (let num = 0; num < 100; num += 1) {
    r = random(205);
    g = random(30);
    b = random(255);
    append(hi, [r, g, b]);
  }

  createCanvas(400, 400);
  background(0);
  stroke(0, 10, 200);
  strokeWeight(20);
  fill(0, 0, 255);
  ellipse(width / 2, height / 2, 100, 100);
}

function draw() {
  background(10);

  push();
  stroke(0, 10, 200, 90);
  fill(0, 0, 255, 90);
  translate(width / 2, height / 2);
  rotate(radians(frameCount));
  rect(-50, -40, 100, 80);

  translate(-width / 2, -height / 2);
  stroke(200, 200, 10, 70);
  fill(255, 255, 0, 70);
  rect(width / 2, height / 2, 100, 80);

  stroke(200, 0, 0, 70);
  fill(255, 0, 0, 70);
  rect((width / 2) - 100, (height / 2) - 80, 100, 80);

  // each layer spins at a different speed with its stored color
  for (let num = 0; num < 20; num += 1) {
    spin(num, hi[num][0], hi[num][1], hi[num][2]);
  }
}

function spin(x, r, g, b) {
  push();
  stroke(r, g, b, 90);
  fill(r, g, b, 90);
  translate(width / 2, height / 2);
  rotate(radians(frameCount * x));
  rect(-50, -40, 100, 80);

  translate(-width / 2, -height / 2);
  stroke(r, g, b, 70);
  fill(r, g, b, 70);
  rect(width / 2, height / 2, 100, 80);

  stroke(r, g, b, 70);
  fill(r, g, b, 70);
  rect((width / 2) - 100, (height / 2) - 80, 100, 80);

  pop();
}

Reflection:

I think it is cool how simple it is to generate good-looking visuals through code. The random element generated by the computer looks very natural to me, even more natural than choices made by an artist. I had some trouble with p5 syntax, but I was able to find really convenient documentation on the p5 website. I am looking forward to learning more and creating more with p5 and ml5.