Week 5 – VR/AR homework about the Oculus Connect 6 developer conference

For the Oculus Connect 6 developer conference, I chose to watch the 12-minute summary and two videos: Creating Spatialized Music for VR/AR and A New Architecture: Unity XR Platform.

It is amazing to me that Oculus Quest is going to introduce hand tracking without hand controllers. This is great progress toward making the VR experience more real, since our hands in VR will no longer be driven by pressing buttons on a controller or swiping its touchpad. Hand tracking can imitate the detailed movement of our hands, such as how far each joint has bent and how much strength we are applying, which makes more complicated movements and effects possible in VR applications (a small code sketch of what this exposes to developers follows at the end of these notes).

As for Passthrough+, I think it is also important and convenient, because you do not need to take off the headset to see what is happening around you, which makes using a VR headset much safer whatever the circumstances.

Bringing Facebook's social features into Oculus makes it more interactive, since users can share their experiences and hold events in real time. It is also more convenient than in reality to invite someone to play with you: everyone just needs to put on a headset and join the others online, which feels like being together.

The idea of Facebook Horizon is amazing as well. It is a highly interactive world in which, as Zuckerberg describes it, users can do anything they want. I believe this must be a great achievement, because any open world, not only in VR but on other platforms too, is extremely hard to realize; there are always limits on how users can manipulate it. For example, if users want to create something by cutting an object into a customized shape, that places high demands on the physics engine, not to mention in a VR open world. Making it real but highly controllable really amazes me.

CTRL-labs, which is working on a neural control system, fascinates me the most. It is incredible to control movement just by thinking, with nothing but a wristband on. I have no idea how this works in detail, but achieving it with a wristband instead of implanting devices inside the body is far beyond my expectations. Combining this with the hand tracking system, it is possible to foresee that we will not even need a menu when using a VR headset: we can make the movements with our hands and send commands just by thinking. All of this makes VR a second world to live in.

For the machine perception part, how real the VR scene looks and how precisely the human face and body can be captured and displayed are both amazing. I think this is progress toward making the VR world more real, instead of only using cartoon figures to communicate.
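Since the keynote only showed hand tracking from the user's side, here is a minimal sketch of what it might look like from the developer's side, assuming the Oculus Integration package for Unity. The PinchWatcher component and its setup are my own illustration; OVRHand, IsTracked, GetFingerPinchStrength, and GetFingerIsPinching are that package's hand-tracking API.

```csharp
using UnityEngine;

// Minimal sketch: read Quest hand tracking through the Oculus Integration
// package's OVRHand component instead of controller buttons.
// Assumes an OVRHand component sits on the same GameObject.
public class PinchWatcher : MonoBehaviour
{
    OVRHand hand;

    void Start()
    {
        hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        // Hand tracking can drop out when the hands leave the cameras' view.
        if (hand == null || !hand.IsTracked)
            return;

        // Pinch strength (0..1) is a stand-in for "how much strength we are
        // using": 0 means fingers apart, 1 means a full pinch.
        float pinch = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
            Debug.Log($"Index pinch at strength {pinch:F2}");
    }
}
```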

The first video, the Unity one, mentioned three points: Unity XR integration, API convergence, and the Universal Render Pipeline. On Unity XR integration, many assets and scripts have been packed into packages so that developers can download exactly what they want by choosing the corresponding packages. What interests me most, though, is that developers can use VR itself to do 3D modeling and rendering: instead of dragging objects around with a mouse on a PC, it is like standing inside Unity and manipulating the objects directly. If this were combined with the hand tracking system, it would be amazing to shape objects with our hands into whatever form we want and drag them wherever we want. API convergence is about cross-platform APIs: code written against the common XR API keeps working across headsets even though they have different interfaces, which helps eliminate many problems when developers switch from platform to platform (a small sketch is below). As for the Universal Render Pipeline, I am still quite confused about how it works, but one thing I learned is that it targets tile-based renderers, where extra rendering work is quite expensive, so it is important to apply the expensive rendering effects only at the final stage to save unnecessary computing cost.
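To make the API convergence point concrete, here is a minimal sketch using Unity's cross-platform XR input API. InputDevices.GetDeviceAtXRNode, TryGetFeatureValue, and CommonUsages.trigger are standard UnityEngine.XR calls; the TriggerReader component itself is a hypothetical example, not something from the talk.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch: the same input code runs on any supported headset,
// because CommonUsages names input features in a device-independent way.
public class TriggerReader : MonoBehaviour
{
    void Update()
    {
        // Ask for whichever device is currently mapped to the right hand.
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!rightHand.isValid)
            return;

        // No Oculus-specific calls: the trigger is read through the
        // cross-platform feature usage.
        if (rightHand.TryGetFeatureValue(CommonUsages.trigger, out float trigger)
            && trigger > 0.5f)
        {
            Debug.Log($"Right trigger pressed: {trigger:F2}");
        }
    }
}
```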

The second video, on spatialized music, talks about the process of making music spatial in VR. Unlike a traditional head-locked stereo mix, which sounds very flat, a quad mix can be heard from every direction and is world-relative, and there is always a kind of compass determining where the sound comes from. An ambisonic mix surrounds the listener wherever they move and is also world-relative. Videos with spatial sound are created in a different sequence than usual: the music is created first, and then the visuals. Also, the overall experience was made in Unreal Engine, while Wwise, a game-oriented sound design tool, was used to realize the sound effects, and the music was placed inside the environment. It was totally new for me to learn how music and visuals are combined in VR, because it is totally different from traditional game audio on a PC. Take games, for example: even though players can hear where a sound comes from and even how far away it is, it is not an immersive environment, and no matter how real it feels, it cannot be compared with a VR one. But it is obvious that realizing spatial sound is much harder in VR (a small sketch of the basic idea follows below).
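The talk built everything in Unreal Engine with Wwise, which I have not used, so here is a minimal sketch of the same world-relative idea using Unity's built-in AudioSource instead. The SpatialSoundEmitter component and the clip are placeholders; spatialBlend, spatialize, and the rolloff settings are Unity's actual audio API.

```csharp
using UnityEngine;

// Minimal sketch: a sound anchored to a world position, so it stays put
// as the listener turns their head, unlike a head-locked stereo track.
public class SpatialSoundEmitter : MonoBehaviour
{
    public AudioClip clip; // assign any looping ambience in the Inspector

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.loop = true;

        // 1.0 = fully 3D: panning follows this object's world position.
        source.spatialBlend = 1.0f;

        // Route through the configured spatializer plugin (e.g. the
        // Oculus Spatializer) for proper HRTF processing, if available.
        source.spatialize = true;

        // Fade smoothly with distance from the listener.
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.minDistance = 1f;
        source.maxDistance = 20f;

        source.Play();
    }
}
```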

The following is a photo I took along the Huangpu River while riding along the riverside; the reeds artfully cover the buildings behind them in the beautiful sunset.
