My initial idea was to make a virtual live show and try to bring it to life. Here is what I have achieved and what I haven't.
1. Virtual live show: audio interaction environment
2. I want to add a performer and some VJ effects, but I haven't figured out how to make a real live show in Unreal. I use Max to control the lights, but I don't know how to show the Max patch window in Unreal. I recorded it in the greenscreen studio, and thanks to Bowei for the live-coded music and shaders.
The effect I want should be like this. But I cannot run all of them at the same time; it makes my PC super slow.
3. VJ effect exercise
Max patch
4. I tried to bring it to life and made a small Pepper's ghost installation.
In teams of 2-4, create a design to show on Zoom that utilizes tactics of illusion to artistically engage with a concept rooted in the cultural, sociological, or philosophical dilemmas of representation.
No specific technological tools are prescribed or required; however, it must be able to be shown in class on Zoom. Suggested topics include: underrepresented stories, designs that expose the worship of the written word, histories or cultures at risk of deletion, the power structures behind our systems of representation...
I am not concerned with sexual assault itself; I think there are social workers and related professionals far more qualified than me to talk about this matter.
What I am concerned about is that some victims of sexual assault develop illusions, beautifying the abuser into someone they love in order to rationalize what they have experienced. This most often happens when their worldview is not yet fully formed.
I want to use body projection or mixed reality projection in a physical space to express that violence is violence, not love.
I don't want to discuss this idea in a very serious or heavy way.
I am not sure whether I can find an artistic and relaxed way to express it.
3. Algorithms trap us in information that we agree with.
Algorithms always recommend things based on our preferences, and gradually we no longer hear opposing views.
There are many academic theories about this (echo chambers, filter bubbles...), but I think interactive visuals can give people a more intuitive feeling; a minimal simulation of what I mean is sketched below.
I did some research, and there are still many controversies in academia on this matter.
But personally, I do think that my background limits my understanding of many things.
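As a first sketch of the kind of interactive, intuitive thing I mean, here is a minimal toy simulation in plain JavaScript. The two "topics" and the update rule are my assumptions for illustration only, not an actual recommender:

```javascript
// Toy filter-bubble simulation: the feed shows more of whatever you
// already prefer, and each thing you see reinforces that preference.
let preference = 0.5; // 0 = only topic A, 1 = only topic B

function recommend() {
  // the algorithm picks the topic you currently prefer more often
  return Math.random() < preference ? "B" : "A";
}

function consume(item) {
  // each item you see nudges your preference further toward it
  preference += item === "B" ? 0.02 : -0.02;
  preference = Math.min(1, Math.max(0, preference));
}

for (let step = 0; step < 300; step++) {
  consume(recommend());
}

// Starting from a balanced 0.5, the preference often drifts all the way
// to 0 or 1: the feed becomes one-sided even though nobody chose that.
console.log(preference);
```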
For the technical part:
I want to combine the virtual world and the physical world.
I hope I can try some mixed reality techniques to project in a confined space.
Or body projection.
Or Pepper's ghost.
2nd: The relationship between the personal body and public topics
Visual references:
Male gaze
What people say, and the hidden meanings beneath their words
Last semester, I made a paragliding virtual experience, connecting Unreal and Arduino. This time, for our spring show in Shanghai, I hurriedly moved it to VR mode. To my surprise, our users said they could feel a strong sense of weightlessness. I did nothing special for that; the sense of weightlessness was caused entirely by their visual senses in VR.
I LOVE the feeling of flying in the real world. This kind of VR sensory experience that resembles the real world arouses my interest.
2. (25s-40s: FPV flying experience)
I bought a DJI FPV drone. It can take me flying in the real world from a first-person point of view. But the problem is that most cities in China ban the use of drones, so every time I want to fly it, I have to drive to a remote field. I tried DJI's simulator app, but it's kind of basic.
I am thinking I could make a more interesting flying-practice map, like a factory or something like that.
3. Since Mario Kart 8 is my favorite racing game, I hope I can fly in that kind of scene.
I am going to make a sci-fi factory, and the main goal is to set up obstacles for practicing flying.
Reference:
Timeline:
w9 (4.1-4.7): research, racing-track scene sketch and structure
w10 (4.8-4.15): materials and details, lighting
w11 (4.16-4.22): handle control and interaction
w12 (4.23-4.29): interaction and music
w13 (4.30-5.5): polish
Problems: I hope to reduce the user's dizziness. Are there any ways to do that?
The big LED video wall used in virtual production.
A cylindrical LED screen with a roof: 270 degrees, 75 feet in circumference, and 20 feet high.
Function:
to show the virtual scene and lighting in real time;
to let the director, photographer, and producer see the real-time effect of the filmmaking.
Properties and qualities:
It can show very detailed light and shadow changes without scattered light spots.
(I am wondering how that is guaranteed technically.)
I checked out the video wall company's website (https://www.unilumin.com/) and there are lots of products. I am confused about how to choose a suitable one and how to read the technical parameters; a rough calculation sketch is below.
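To get a feel for the numbers, here is a back-of-the-envelope helper in plain JavaScript. Pixel pitch is the distance between LED centers, and a common rule of thumb is that the minimum comfortable viewing distance in meters is roughly the pixel pitch in millimeters; the 2.6 mm pitch below is an assumed example, not the spec of this particular wall:

```javascript
// Rough spec-sheet math for an LED wall, using the dimensions above
// (75 ft circumference, 20 ft high) and an assumed 2.6 mm pixel pitch.
const FT_TO_MM = 304.8;

function wallPixels(circumferenceFt, heightFt, pitchMm) {
  const width = Math.floor((circumferenceFt * FT_TO_MM) / pitchMm);
  const height = Math.floor((heightFt * FT_TO_MM) / pitchMm);
  return { width, height, total: width * height };
}

// Rule of thumb: minimum comfortable viewing distance (m) ~= pitch (mm).
function minViewingDistanceM(pitchMm) {
  return pitchMm;
}

console.log(wallPixels(75, 20, 2.6));  // ~8792 x 2344 pixels
console.log(minViewingDistanceM(2.6)); // keep camera/actors ~2.6 m away or more
```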
I was inspired by Text Rain (1999), which Danny showed us in PComp class, and I decided to try interacting with text.
I want to use images of people to interact with the words that make them feel frustrated.
This time I focused on the 20-30 year old Chinese women around me, and I collected the sentences on social media that frustrate them most. What they mentioned most was "How old are you? Why are you not married yet?" In China, many elders think young women should get married as soon as possible, otherwise it will cause many problems.
So I want to use the text "你都多大了？怎么还没结婚？", which means "How old are you? Why are you not married yet?"
My initial idea was to let the texts interact with each other in a physics engine, but unfortunately I couldn't import them into it.
So I could only complete the two parts separately.
1st, Dirty Words Mirror: inspired by Text Rain (1999) and Jason's Chinese character mirror.
This is my final version, Dirty Words Mirror Final. I decided not to preset the content and instead let users input their own.
What I want to express is that the words that made you depressed also made you who you are today. But too many upsetting words will make you ugly, so don't take them too much to heart. A minimal sketch of the falling-text mechanic is below.
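Below is a minimal p5.js sketch of the Text Rain-style mechanic behind the mirror. It is my reconstruction under assumptions, not my actual project code: a character keeps falling while the webcam pixel beneath it is bright, and rests when it reaches something dark, like a viewer's body.

```javascript
// Text Rain-style mechanic: a character falls through the webcam image
// and lands wherever the pixel below it is dark (e.g. a person).
let cam;
let drop = { char: "婚", x: 200, y: 0 }; // example character ("marriage")

function setup() {
  createCanvas(640, 480);
  cam = createCapture(VIDEO);
  cam.size(640, 480);
  cam.hide();
  textSize(32);
}

function draw() {
  image(cam, 0, 0);
  cam.loadPixels();
  if (cam.pixels.length === 0) return; // camera not ready yet

  // brightness of the pixel just below the character
  const row = Math.min(Math.floor(drop.y) + 1, cam.height - 1);
  const i = 4 * (row * cam.width + Math.floor(drop.x));
  const brightness = (cam.pixels[i] + cam.pixels[i + 1] + cam.pixels[i + 2]) / 3;

  if (brightness > 80) drop.y += 2; // bright background below: keep falling
  if (drop.y > height) drop.y = 0;  // fell off the bottom: wrap to the top

  fill(255, 80, 80);
  text(drop.char, drop.x, drop.y);
}
```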
I followed Shiffman's tutorial to make a rectangle sketch in a physics engine, and I decided to develop it into a kaleidoscope. I also added the orange circles to the sketch because I think they look like hula hoops.
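Here is a minimal p5.js sketch of the kaleidoscope idea; the segment count and the falling rectangles are stand-ins rather than my exact code. Whatever is drawn in one wedge is repeated around the center, and the orange circle plays the "hula hoop" role:

```javascript
// Kaleidoscope by rotational repetition: draw the same content
// SEGMENTS times, rotated around the canvas center.
const SEGMENTS = 8;
let rects = [];

function setup() {
  createCanvas(600, 600);
  rectMode(CENTER);
}

function draw() {
  background(0);
  if (frameCount % 20 === 0) {
    rects.push({ x: random(50, 250), y: 0, speed: random(1, 3) });
  }
  translate(width / 2, height / 2);
  for (let i = 0; i < SEGMENTS; i++) {
    push();
    rotate((TWO_PI / SEGMENTS) * i);
    noFill();
    stroke(255, 150, 0);
    circle(120, 0, 60); // the orange "hula hoop"
    fill(255);
    noStroke();
    for (const r of rects) rect(r.x, r.y - 200, 20, 20);
    pop();
  }
  for (const r of rects) r.y += r.speed;  // fall
  rects = rects.filter((r) => r.y < 400); // drop off-screen ones
}
```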
This is the preliminary version of my final project; I completed two steps this week.
1. I was inspired by Text Rain (1999) from Danny's PComp class, and I really want to make a Mulan Poem text rain.
The Mulan Poem is so popular in China that everyone can recite it.
1st. I made this version by mistake, treating every line of the poem as an individual part.
They looked too dense and also kind of dull, so I changed them into the format I originally wanted.
2nd. This is the format I wanted at first. But I quickly realized that a single Chinese character doesn't make sense on its own; the Mulan Poem is so familiar to us that any word in it should make us think of the whole sentence.
3rd. So I manually separated some words and sentences with delimiters (a minimal sketch of this is below). I think interacting with it makes more sense now; people can choose what they want to interact with.
I wonder if this is why computers cannot completely replace humans in dealing with text.
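Here is a minimal p5.js sketch of the delimiter idea, using the famous opening lines of the poem; the "/" cut points are just an example of the manual separation. Each delimited chunk falls as one unit instead of one character:

```javascript
// Delimiter-based text rain: split the poem by "/" so each chunk
// (a word or phrase I chose by hand) falls as a single unit.
const source = "唧唧复唧唧/木兰当户织/不闻机杼声/唯闻女叹息";
let drops = [];

function setup() {
  createCanvas(600, 400);
  textSize(24);
  textAlign(CENTER, CENTER);
  for (const chunk of source.split("/")) {
    drops.push({
      t: chunk,
      x: random(50, width - 50),
      y: random(-400, 0),
      v: random(0.5, 2),
    });
  }
}

function draw() {
  background(0);
  fill(255);
  for (const d of drops) {
    text(d.t, d.x, d.y);
    d.y += d.v;
    if (d.y > height + 20) d.y = -20; // wrap back to the top
  }
}
```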
2. I am very interested in physics engines. I followed The Coding Train and wrote some matter.js code.
I am wondering whether text can be used in matter.js, because I hope my Mulan Poem can work inside it.
It looks like matter.js can only work with shapes. But in my p5 sketch this week, I made a file "box.js" and used it in sketch.js, so I am wondering: can I put my text in another ".js" file and use it in the physics engine? A sketch of this idea follows.
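I think the answer is probably yes, with one trick: matter.js only simulates the shape, but p5 can draw the character at the body's position every frame. Here is a sketch of the idea in a separate file, Coding Train style; the class name TextBox and the details are my assumed approach, not tested project code:

```javascript
// box.js - a physics body that carries a Chinese character.
// matter.js simulates an invisible rectangle; p5 draws the character
// at the body's position and angle each frame.
class TextBox {
  constructor(world, char, x, y, size) {
    this.char = char;
    this.size = size;
    this.body = Matter.Bodies.rectangle(x, y, size, size);
    Matter.World.add(world, this.body);
  }

  show() {
    const pos = this.body.position;
    push();
    translate(pos.x, pos.y);
    rotate(this.body.angle);
    textAlign(CENTER, CENTER);
    textSize(this.size);
    text(this.char, 0, 0);
    pop();
  }
}

// sketch.js usage (assumed):
//   const engine = Matter.Engine.create();
//   const b = new TextBox(engine.world, "木", 100, 0, 32);
//   function draw() { Matter.Engine.update(engine); b.show(); }
```

The nice part of this design is that the physics engine never needs to know about the text at all; it just pushes rectangles around, and the characters ride along.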
I collaborated with Clover over these 4 weeks to make an AE animation.
Our story: a squirrel crosses mountains and rivers to bring food to a bear, and finally finds that the bear is holding its favorite nut.
I was in charge of the scenes, and I tried to make them show the passage of time.
Clover was responsible for the character animation. We found that to make animal movement look real, we needed to paint more details than we had imagined.
I think AE is not suitable for co-editing a project; we had many difficulties transferring data between us. It would be great if we could edit in a shared folder.