This week, my project partners Fanni, Simi, & Tian and I worked on our final project proposal for Bodies in Motion – Mirrored Self.
Mirrored Self is an exploration of the self versus society.
In this piece, mirrors represent the window between physical freedom and virtual constriction.
By facing a mirror, the performer must face themselves. The audience is given agency over when & how this happens.
We want to create a visceral experience for our audience around the moral limitations & ethical concerns of violence in the virtual world.
Mirror Mograph Deformer, made with C4D
This week in Lab, we learned how to use a new Motion Capture System: Perception Neuron.
The benefits of this system are that it is much cheaper, faster, & uses fewer nodes with no fear of occlusion! However, the main drawback is that the captured motion fidelity is much lower.
We also added some new materials to our scene from last week! I had some trouble baking my old animation, so I edited the skeleton with some primitives to create the following:
This week, we recorded 3 scenes in the Blackbox studio – climbing, swimming, & floating.
Fanni the Swimmer
Using that MoCap data + Cinema4D, here’s some documentation of the animation that I made:
However, when I tried importing this into Unreal, I ran into many problems. First, I tried the C4D–Unreal plugin, but it imported my character as a static mesh.
Then I tried baking my animation & exporting it as an FBX file for Unreal import. I received the following errors:
I changed some of the hierarchies & parenting in C4D & tried reimporting. Success! Kind of… Unfortunately, the animation & skeletal mesh are not completely paired.
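In my case, the pairing problem came down to the joint hierarchy in the baked animation not lining up with the skeletal mesh. As an illustration (all joint names here are hypothetical, not the actual names from my rig), a quick sanity check for this kind of mismatch could look like:

```python
# Illustrative check: which joints does the animation drive that the
# skeletal mesh doesn't actually have? Joint names are made up.

def unmatched_joints(anim_joints, mesh_joints):
    """Return joints present in the animation but missing from the mesh."""
    return sorted(set(anim_joints) - set(mesh_joints))

anim = ["root", "pelvis", "spine_01", "hand_l"]
mesh = ["root", "pelvis", "spine_01"]

print(unmatched_joints(anim, mesh))  # any joint listed here won't animate
```

Any joint that shows up in that list is a place where the FBX hierarchy needs to be fixed before Unreal can pair the animation to the mesh.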
C4D Build an Animated Dynamic Character
Worlds on a Wire Week 6
UE4 How to Import Skeletal Meshes
This week, we took MoCap data & retargeted it onto an avatar to be used in Unreal. Here is the final result:
First, I made an avatar with Adobe Fuse. Then, the avatar was imported into Autodesk Maya for skeleton rigging.
Using the bridge between Maya & Motion Builder, I then attached the live mocap data from this week’s lab onto the avatar.
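Conceptually, the retargeting step boils down to a mapping from the mocap skeleton's joint names onto the avatar's joint names, which Motion Builder handles for you. A minimal sketch of that idea, with entirely hypothetical joint names (real names depend on the two rigs):

```python
# Hypothetical retarget map: mocap skeleton joints -> avatar rig joints.
RETARGET_MAP = {
    "Hips": "avatar:Hips",
    "Spine": "avatar:Spine",
    "LeftArm": "avatar:LeftArm",
    "RightArm": "avatar:RightArm",
}

def retarget(frame):
    """Rename each joint's transform in one mocap frame to the avatar's joints."""
    return {RETARGET_MAP[j]: xform for j, xform in frame.items() if j in RETARGET_MAP}

mocap_frame = {"Hips": (0.0, 1.0, 0.0), "LeftArm": (0.3, 1.4, 0.1)}
print(retarget(mocap_frame))  # same transforms, keyed by the avatar's joint names
```

Motion Builder's characterization step is doing a much smarter version of this (solving rotations between different proportions), but the name mapping is the part you set up by hand.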
Here’s some documentation of our data capture & cleaning process from Lab!
For this week, we used the Starter Content to create a basic scene from our inspiration board. Character inspiration can also be found there.
Here is the main image that I used as a reference.
Here is my attempt at recreating it. The key things missing are a better material + lighting setup.
Also this week, we learned how to do live Motion Capture in the Black Box Studio!
For our week 1 lab, we calibrated the Optitrack motion capture system at NYU. Here is the general procedure we followed:
- Before Calibrating: Make sure no reflective materials are in the room. If any reflections are seen by the cameras (represented by red dots), select “Mask Visible.”
- Start Wanding: With the reflective wand, make lawn-mowing & spiral motions to collect data points. Make sure that every camera collects a sample of 10,000 data points.
- Click “Calculate” & apply the results! The result should say “Exceptional.” The cameras now know where they are in relation to each other.
- Ground plane: The ground plane may be tilted. Set it using the reflective ground-plane tool, then click “Set Ground Plane.” Then put the tool away:
- Rigid Bodies: Now we can use the unique reflective rigid-body structures to record some basic mocap movement! Make sure to select your points on the screen & assign them as rigid bodies.
- Now we can record a take & import these movements into Unreal Engine! Make sure that the server address, connection type, & naming conventions are matched.
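The last step above is just a settings-agreement check between the streaming side (Motive) and the receiving side (Unreal). A minimal sketch of that idea, with illustrative field names and a made-up address:

```python
# Minimal sketch: the streaming settings on the server side must agree
# field-by-field with what is entered in Unreal. Values are illustrative.

SERVER = {"address": "192.168.1.2", "connection": "multicast", "naming": "FBX"}
CLIENT = {"address": "192.168.1.2", "connection": "multicast", "naming": "FBX"}

def settings_match(server, client):
    """True only if every field agrees between server & client."""
    return all(client.get(k) == v for k, v in server.items())

print(settings_match(SERVER, CLIENT))  # True
```

If any field disagrees (a typo in the IP address is the classic one), Unreal simply won't see the take, so this is worth checking before assuming the capture itself failed.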
Here is my inspiration board for environments & effects that I would like to implement: