Final Project Documentation

Title: StyleGAN Particle System

Project Video (UE Real-Time Gameplay): [embedded video]

StyleGAN Latent Space Walk Video from Runway: [embedded video]

One-sentence description: A Niagara particle system in Unreal Engine driven by a StyleGAN latent space walk video.

Project summary: 

My project is a way to generate digital art with the help of StyleGAN and machine learning algorithms. I use the Niagara particle system in Unreal Engine, with the particles mapping the position and color of a StyleGAN latent space walk video I generated in Runway. I believe each picture generated by StyleGAN, or by any machine learning algorithm, could be a unique artwork or texture. I built three scenes to demonstrate how these machine-generated images, with their distinctive styles and subject matter, can be transformed into powerful and dramatic artworks with a few simple touches. My original plan was to control the video input in UE, as well as the color and position of the particles, in real time (using a Google Drive link streaming Blueprint in UE), so that viewers could generate their own StyleGAN video and their own unique particle system directly. A sketch of that streaming idea follows below.
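The plan uses a Blueprint, but here is a minimal C++ equivalent of the idea, as a sketch rather than the actual implementation. The function name, the public-file assumption, and the `<FILE_ID>` placeholder are all mine; Google Drive only streams from the direct-download URL form, not from the usual share link.

```cpp
// Minimal sketch (assumption, not the project's final implementation): open a
// Google Drive direct-download link as a streaming media source, so the
// particle scene always plays whatever video currently sits at that link.
// Requires the Media Assets module.
#include "MediaPlayer.h"
#include "StreamMediaSource.h"

void OpenLatentWalkStream(UMediaPlayer* Player, UStreamMediaSource* Source)
{
    // The usual "file/d/<id>/view" share link will not stream; the file must
    // be public and referenced through the direct-download form below.
    Source->StreamUrl = TEXT("https://drive.google.com/uc?export=download&id=<FILE_ID>");

    Player->SetLooping(true);   // the latent space walk video plays in a loop
    Player->OpenSource(Source); // starts buffering; plays on open by default
}
```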

Inspiration: 

I was definitely inspired by Refik Anadol, a pioneer in using A.I. to create mesmerizing and dynamic installation art; many of his works use particle systems. I'm also greatly inspired by the digital artists Yuma Yanagisawa and AvantContra. Their work made me believe it was possible to create the effect I wanted with Unreal Engine, which is why I started learning UE.

Process: How did you make this? What did you struggle with? What were you able to implement easily and what was difficult?

I use the Niagara particle system in Unreal Engine, with the particles mapping the position and color of the StyleGAN latent space walk video (playing in a loop) that I generated in Runway. The viewer can walk around the scene, explore the world, and view the particle systems from different angles. To make the system more controllable, I added key-press functions to play, fast-forward, and rewind the latent space walk; a sketch of this is shown below. If I had more time, I planned to implement Unreal's texture streaming method to stream the video texture directly from a Google Drive link. I also need to investigate how to upload a video directly from either Glitch or Runway to a Google Drive link with the same file name.
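Here is a rough C++ sketch of the key-press control and of handing the video texture to Niagara. In the actual project this lives in Blueprints; the class name `AStyleGanCharacter`, the action mappings, and the `VideoTexture` user parameter name are my assumptions, and the emitter itself must be set up to sample the texture per particle.

```cpp
// Condensed, hypothetical C++ version of the Blueprint logic: a character
// that owns the media player and binds keys to play/fast-forward/rewind.
#include "GameFramework/Character.h"
#include "Components/InputComponent.h"
#include "MediaPlayer.h"
#include "MediaTexture.h"
#include "NiagaraComponent.h"

// Members assumed on AStyleGanCharacter (declared as UPROPERTYs in the header):
//   UMediaPlayer* MediaPlayer;  UMediaTexture* MediaTexture;
//   UNiagaraComponent* NiagaraComp;

void AStyleGanCharacter::BeginPlay()
{
    Super::BeginPlay();
    // Hand the looping video texture to the Niagara system. "VideoTexture"
    // must match a Texture user parameter on the system; the emitter samples
    // it per particle to set grid position and color from the pixels.
    NiagaraComp->SetVariableTexture(TEXT("VideoTexture"), MediaTexture);
}

void AStyleGanCharacter::SetupPlayerInputComponent(UInputComponent* Input)
{
    Super::SetupPlayerInputComponent(Input);
    // Action names are made up; they would be defined under
    // Project Settings > Input > Action Mappings.
    Input->BindAction("PlayPause",   IE_Pressed, this, &AStyleGanCharacter::TogglePlay);
    Input->BindAction("FastForward", IE_Pressed, this, &AStyleGanCharacter::FastForward);
    Input->BindAction("Rewind",      IE_Pressed, this, &AStyleGanCharacter::Rewind);
}

void AStyleGanCharacter::TogglePlay()
{
    if (MediaPlayer->IsPlaying()) { MediaPlayer->Pause(); }
    else                          { MediaPlayer->Play(); }
}

// Scrub through the latent space walk by changing the playback rate.
void AStyleGanCharacter::FastForward() { MediaPlayer->SetRate( 2.0f); }
void AStyleGanCharacter::Rewind()      { MediaPlayer->SetRate(-1.0f); } // if the source supports reverse
```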

Audience/Context:

The audience is anyone who is interested in this kind of artwork. I believe my three scenes evoke completely different feelings in the audience. If I had more time, I would try to build the scenes in VR or AR for a more immersive experience.

User testing: 

I should also drive the behavior of the particles with StyleGAN. I can imagine the curl noise force or the depth position being driven by the latent space walk videos, but I definitely need more time to figure that out; one possible approach is sketched below.
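One way I can imagine wiring this up (an untested sketch, not something I've built): bake a per-frame statistic of the latent space walk video, such as mean brightness, into a float curve offline, then sample that curve at the video's current playback time every tick and push it into the emitter's curl noise strength. `BrightnessCurve`, the scaling factor, and the `CurlNoiseStrength` user parameter are all assumptions.

```cpp
// Hypothetical: drive the curl noise force from the video itself.
// BrightnessCurve would be a UCurveFloat baked offline (e.g. per-frame mean
// luminance of the Runway video); "CurlNoiseStrength" is a float user
// parameter that the emitter's Curl Noise Force module reads.
#include "Curves/CurveFloat.h"
#include "MediaPlayer.h"
#include "NiagaraComponent.h"

void AStyleGanCharacter::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Sample the baked curve at the video's current playback time...
    const float VideoTime  = static_cast<float>(MediaPlayer->GetTime().GetTotalSeconds());
    const float Brightness = BrightnessCurve->GetFloatValue(VideoTime);

    // ...and scale the curl noise strength by it, so brighter latent frames
    // agitate the particles more.
    NiagaraComp->SetVariableFloat(TEXT("CurlNoiseStrength"), Brightness * 100.0f);
}
```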

Code references: 

I referred to Yuma's and Sem Schreuder's tutorials on Niagara, and I made some practice trials for my 100 Days of Making project.
