RAPS Final Project: Wild (Tina)

Title

Wild (失控, “Out of Control”)

Project Description

Our project’s name is Wild. We got the inspiration from the movie Blade Runner, in which artificial beings are designed to be nearly indistinguishable from humans. They are built by humans, yet they are stronger than humans, and when they fight with humans over limited resources, humans are hurt by the very creatures we created. This echoes the first project I did in the Interaction Lab: a short scene play about a future “memory tree” that people can upload memory and knowledge to and download from; when one man is too greedy and takes too much from the memory tree, it hurts him. In this project, we want to present humans’ over-dependence on technology. Technology is a double-edged sword: it makes our lives more convenient, but when we depend on it too much, it hurts us in return. “Wild” is named after the scene in which all the objects, as well as the music, lose control and go wild.

Perspective and Context

Our project consists of three parts. The first part is the music, which Joyce made in GarageBand. Because we wanted the whole project to contrast how people rely on technology with what happens when we lose control of it, we decided to make the first part of the music peaceful and mild; at the turning point, the music crashes and turns crazy and wild. In this way, we create a connection between the music and the visual scene. The second part is the Kinect. I got the inspiration for using the Kinect to create interaction between the performer and the projection from this contemporary ballet dance:

https://www.youtube.com/watch?v=NVIorQT-bus&feature=youtu.be 

I really love how the dancer’s movements influence the objects and projection next to him while, at the same time, the objects influence him as well. Rather than simply controlling the piece, the performer becomes part of it; he and the projection become a whole. Therefore, I decided to put myself into the performance. I originally planned to use the Kinect’s depth function to control the movement of the object in Max, but because my computer is a Mac, many functions, including depth detection, were unavailable. So in the end I just used shape capture to capture the outline of my body. In the performance, I use my hands to receive the 0s and 1s, which represents humans receiving digital information through technology (the technologies we have developed so far are all based on the binary system). The digits disappear where my body is, but later, with an explosive sound in the background music, the digits lose control and move randomly across the screen.

The third part is the Max patch. In the beginning, we wanted to design an object that changes from regular motion to crazy motion, to represent the loss of control. But we found it too boring and dull to use only a single object, so we decided to use more videos to enrich the content. We open with a scene of a bustling city to indicate urban development; then green 0s and 1s drop from the sky, putting the whole city under the control of digits. With this, we want to show that the city runs with the help of technology. Next come several short clips of how we depend on technology in our daily lives. Then the main object appears, changing and moving regularly with the music. Later we add the scene from the Kinect, and with a sudden glass-breaking sound, we quickly cover the whole screen with a white canvas, which represents the breaking point. After this, our core object goes wild: it turns from smooth to jagged, and the whole scene goes crazy. We add many different videos to show what happens when we rely on technology too much and it crashes.

Development & Technical Implementation

patch code

In our group, Joyce finished the music while I was in charge of the Kinect and Processing parts, and then we developed the Max part together. Through this process, I learned to connect different ideas to create a more diverse project, and I really enjoyed cooperating with her.

While dealing with the Kinect, the biggest issue we met was data transmission from the Kinect to the computer. The amount of data is too large for most USB-to-USB-C adapters, so when we tried to connect the Kinect through an adapter, nothing appeared on the screen. We tried several different adapters, and only one of them could transfer the data from the Kinect to my computer. So in the end, during the performance we switched that adapter from the MIDIMIX to the Kinect, and after the Kinect part, we switched back to the MIDIMIX.
Developing the Max patch was the most challenging part. According to Ana Carvalho and Cornelia Lund (eds.), the VJ mostly plays videos and music that already exist. We wanted to avoid using too many video shots from the real world, so I created the smog and random-number videos in Processing.

The code I use for connecting Kinect with processing and Max
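The patch itself is only shown as a screenshot, so as a hedged illustration: a common way to get tracking data from a Processing sketch into Max is to send OSC messages over UDP (in Processing this is usually done with the oscP5 library; Max receives them with [udpreceive]). The Python sketch below only models how such an OSC packet is laid out on the wire; the address `/kinect/hand` and port 7400 are my assumptions, not the actual values from our patch.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """OSC strings are NUL-terminated and padded to a 4-byte boundary."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32 (big-endian)."""
    packet = osc_pad(address.encode("ascii"))          # address pattern
    packet += osc_pad(("," + "f" * len(args)).encode("ascii"))  # type tags
    for value in args:
        packet += struct.pack(">f", value)             # 4-byte big-endian float
    return packet

def send_to_max(address: str, x: float, y: float,
                host: str = "127.0.0.1", port: int = 7400) -> None:
    """Fire one UDP packet at Max's [udpreceive 7400]."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message(address, x, y), (host, port))
    sock.close()

# Example: a normalized hand position taken from the Kinect silhouette.
packet = osc_message("/kinect/hand", 0.5, 0.25)
```

In the real setup, the sketch would call something like `send_to_max(...)` once per frame with the tracked coordinates, and a [udpreceive] + [route] pair in Max would unpack them.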

The code I use to create random numbers and smog scenes
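Since the Processing sketch is only shown as a screenshot, here is a minimal Python model of the random-number logic: each column carries one falling 0/1 digit, and a `wild` flag switches the steady fall into random jumps, mirroring the moment when the digits “lose control.” All names and grid sizes are my assumptions, not the original code.

```python
import random

class DigitRain:
    """Columns of 0/1 digits falling at a steady speed; in 'wild'
    mode every digit teleports to a random row each frame."""

    def __init__(self, cols=40, rows=30, speed=1, seed=None):
        self.cols, self.rows, self.speed = cols, rows, speed
        self.rng = random.Random(seed)
        # one [row, digit] pair per column
        self.drops = [[self.rng.randrange(rows), self.rng.randint(0, 1)]
                      for _ in range(cols)]
        self.wild = False

    def step(self):
        for drop in self.drops:
            if self.wild:
                drop[0] = self.rng.randrange(self.rows)      # lose control
            else:
                drop[0] = (drop[0] + self.speed) % self.rows  # steady fall
            drop[1] = self.rng.randint(0, 1)                  # re-roll the 0/1

    def cells(self):
        """(col, row, digit) triples, e.g. for Processing's text() call."""
        return [(c, row, d) for c, (row, d) in enumerate(self.drops)]

rain = DigitRain(cols=8, rows=10, seed=42)
rain.step()        # every digit falls one row
rain.wild = True
rain.step()        # digits scatter randomly
```

In the actual sketch, the drawing loop would render each `(col, row, digit)` cell as green text and flip `wild` at the turning point in the music.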

To create an adjustable object, we decided to use jit.gl to generate a 3D object in real time. We learned from a tutorial on YouTube and then improved on it.

Performance

It was the first time we went to the club to perform. We were a bit nervous at first, but we found that was normal and everyone gets nervous. During the performance, I was in charge of the movement of the center cube while Joyce switched the videos and the music. Basically everything went smoothly, but before we started, Processing would not work normally and kept reporting an error. I guessed it was because we had run both Processing and Max for too long, so we restarted them immediately. Luckily, that worked. In the middle of the performance, the music also became a little too loud. Such unexpected things happen a lot during live shows, so learning to deal with them calmly is necessary for every live performer.

Another important thing is to organize the patch clearly. It is dark in the club, so when we wanted to use the MIDIMIX to control the patches, it was hard to read the labels we had written. Assigning the knobs and buttons on the MIDIMIX according to the arrangement of the objects in the patch is therefore important.

At the end of the performance, when we adjusted the position and shape of the cube in the center, it did not change as we had practiced. We planned to have only the circle rotating, but a small yellow cube appeared in the center, which actually made the piece look even better. So, although we have no idea why it appeared, we are still satisfied with it.

Conclusion

I think this is an interesting project that broadened my horizons: I learned to produce music with GarageBand and explored the use of the Kinect. But there is still a lot we can do to make the project better.

First, about the Kinect: I want to try using p5 or Runway to capture the movement of the body, because I found the control stage was not as dark as I thought. Or I could try the depth function and other Kinect features on a Windows laptop. I also want to turn the Kinect toward the audience so that there will be more interaction. Maybe an underground club is not the most suitable place for this kind of interaction, but it is a fun direction to develop the project from.

Second, about the Max part: we still use too many real-scene videos in our project, and I think there is more we can do in the patch itself. For now, our work on the objects in Max relies solely on Max’s built-in functions, and it is hard for us to make a controllable, beautiful object in Max directly. When I asked my classmates how they use Max, I found that both of the two projects I love most used js (JavaScript inside Max). Therefore, I want to try more ways of combining Max with other resources to create a more diverse project.

The biggest issue with our project is that we tried to convey something “meaningful” to the audience, when actually there was no need to do so. As we learned at the beginning of the semester, synaesthesia varies because everyone has different life experiences and ways of thinking, so people’s feelings toward the same piece of music may differ. The meaning of art works the same way: an art piece can mean different things to different people, and it does not need to be educational. We can jump out of traditional thinking and create something crazier and more abstract. So, in the future, rather than just putting street views on the screen, I want to discover more forms of presenting my ideas. Also, while designing the music, we should consider that the venue is a club, where music with stronger beats and rhythm is more suitable.

All in all, I think the connection between our music and visual art is strong, and the audience’s feedback tells us that we made an exciting live performance.

Work Cited

Gabriel Menotti, “Live Cinema,” and Ana Carvalho, “Live Audiovisual Performance,” in The Audiovisual Breakthrough, ed. Ana Carvalho and Cornelia Lund (Fluctuating Images, 2015).

Reading Response 8 – Live Cinema (Tina)

Live audiovisual performance is a much broader concept than VJing and live cinema; it includes concepts like “expanded cinema” as well. It is a popular form of art now, and people can see it at events and ceremonies everywhere. Live cinema is unlike ordinary cinema, which conveys information to the audience unidirectionally; it takes the audience’s feedback at the same time. Both live audiovisual performance and live cinema share the characteristic of being “live,” which means they are unpredictable, bringing more excitement and surprise to the performance.

From the reading, I feel that the author considers VJing inferior and less creative, since he writes: “We imagine that VJs would be more like commercial directors for hire, paid by the hour, dealing with second-hand material, willing to follow popular trends. Meanwhile, live cinema creators would occupy a place equivalent to that of film auteurs, whose goals ‘appear to be more personal and artistic.’”

VJing creates a relationship between seeing and hearing, while live cinema creates a conversation between the live cinema performers and the audience.

As for me, I feel that VJing also involves a lot of creativity: the prevailing ways of relating video and audio change all the time, so VJing brings uncertainty as well.


Assignment 5: Multi 3D Objects (Tina)

GitHub link

3D resource

I originally wanted to make a scene with different types of desserts, but after adding the first cake model, I found it really hard to get the color I wanted to apply. I tried mappr and other color-controlling patches, but the results were not very satisfying.

Then I found the color-changing “cake” looked more like something else instead. Therefore, I added EASEMAPPR as the background and made the “cake,” along with some randomly changing lines, move on top. I will keep exploring how to adjust the color and movement the way I plan.


The video:

The reason this assignment is so challenging is that it is really hard to understand the specific function of each statement. I went through the patch we made in class and also looked at some sample code, and through this I came to understand some of the functions. In the following weeks, I really should spend time exploring more, and I think this part will be very useful in the final project.

Assignment 4 – Granular Synthesis Audio (Tina)

My GitHub link: https://gist.github.com/tz1201/337908f1b39d4ee023d893ea7d78e5c8

I wanted to create a video that looks like “arms” driving the other two videos.

As for adding effects to the videos: because these videos are very dynamic, I first ruled out using DELAY, and then considered using zoom and rotatr to blend the videos together better.

As for the music: since only very short parts of the videos are played, I wanted to create a strongly rhythmic sound. I used a drum sequencer to play the sample music, with one drum beat as the base layer and other electronic sounds added as decoration.
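The drum pattern itself was built with Max’s sequencer, but the underlying timing math is simple: a step pattern plus a BPM gives the onset time of every hit. A minimal sketch (the function name and the pattern are hypothetical, not taken from the patch):

```python
def step_times(pattern, bpm=120, steps_per_beat=4):
    """Return the onset times (in seconds) of the hits in a step
    pattern, where 1 = hit and 0 = rest. With 4 steps per beat,
    each step is a sixteenth note."""
    step_dur = 60.0 / bpm / steps_per_beat   # seconds per step
    return [i * step_dur for i, hit in enumerate(pattern) if hit]

# A 16-step kick pattern with hits on every beat (steps 0, 4, 8, 12).
kick = [1, 0, 0, 0] * 4
times = step_times(kick, bpm=120)   # at 120 BPM a step lasts 0.125 s
```

A sequencer just walks this grid in a loop; layering a second pattern (e.g. hats on every step) on top of the kick gives the base-plus-decoration structure described above.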

the video: Granular Synthesis Audio

Things to be improved:

I originally wanted to use the “arms” video as the base layer filling the whole screen, and then add a second, clearer layer of video on top of it. But when I added the fourth video to Max, my computer crashed and the color of the video turned red. I restarted several times, but it stayed red.

Reading Response(Tina)

The practice of VJing is the real-time mixing of content from a “library of media,” but the VJ is not just playing videos and music; the VJ has to play to the score. According to Tordino, “Improvisation is the basis of VJing performances” (108). Because the environment and conditions vary from time to time, no two VJing performances are the same. This is the liveness, uniqueness, and improvisation of VJing: it is not reproducible. It is the same with drama and musicals; every time, the actors give us a slightly different performance, and the songs, positions, and emotions are unique. Liveness is very important in attracting audiences. This summer I went to see an immersive light performance by Cocolab; the light was gorgeous, really beautiful and impressive. But all the sound and light were recorded and played back by machines, so when I saw it a second time, I felt a little bored and wanted to leave. On the contrary, I am willing to pay to watch the same musical several times, because I can expect a different performance each time.
VJing is a way of describing the performative use of generative software. It gives concrete form to music and other formats of art, immersing people in the performance. The “visual wallpaper” reminds me of my experience the last time I went to a club. I did remember the VJing performance I saw, but the problem was that I could not recall what it actually looked like. My attention was drawn to the dancer on the stage and the glaring lights above me, and the VJing performance just served as “visual wallpaper.” With too many things distracting my attention, I could not focus on what was happening on the screen.
One interesting thing is that, in the video, almost all the VJs emphasized the significance of having “knobs.” Artists need the process of adjusting the knobs rather than just clicking on the computer keyboard. At first, I thought there was no big difference between using the MIDIMIX and just clicking on the keyboard, but in fact the knob is like a distinct line drawn between VJing as art and simply playing a video.