Sunday, March 26th, 2017

ILMxLAB and VR


April 5, 2016, GPU Technology Conference, San Jose, CA—Lutz Latta of Lucasfilm talked about ILMxLAB, where work on VR continues to progress. The creators of Oscar-winning media and technologies are finding VR very challenging.

For most media, the key aspect is the story, which requires artists and technologists to work together to recognize and push the boundaries of both storytelling and technology. They started with the typical pipeline and render flows from their VFX work and are moving toward more virtual reality content. One challenge is that the traditional VFX pipeline has to be transformed for VR, with its 360-degree stereo views.
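The jump from a single camera frustum to a 360-degree stereo panorama is easiest to see per pixel: every output pixel gets its own ray, and the ray origin shifts slightly for each eye. The sketch below is a generic omni-directional-stereo mapping, not ILMxLAB's pipeline; the interpupillary distance and coordinate conventions are assumptions for illustration.

    import math

    def ods_ray(u, v, eye_sign, ipd=0.064):
        """Map an equirectangular pixel (u, v in [0, 1]) to a ray origin and
        direction for one eye of an omni-directional stereo panorama.
        eye_sign is -1 for the left eye, +1 for the right; ipd is an assumed
        interpupillary distance in metres."""
        theta = (u - 0.5) * 2.0 * math.pi        # longitude, -pi .. pi
        phi = (0.5 - v) * math.pi                # latitude, -pi/2 .. pi/2
        # Ray direction on the unit sphere for this pixel.
        direction = (math.cos(phi) * math.sin(theta),
                     math.sin(phi),
                     math.cos(phi) * math.cos(theta))
        # Offset the ray origin sideways along the viewing circle for this eye.
        origin = (eye_sign * 0.5 * ipd * math.cos(theta),
                  0.0,
                  -eye_sign * 0.5 * ipd * math.sin(theta))
        return origin, direction

Tracing or rasterizing along such rays for both eyes is what turns a conventional single-view render into the 360-degree stereo output a VR headset expects.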

For example, they created a VR short called Jakku Spy, featuring BB-8, as a vehicle to explore short-form media in VR. The project deepens the immersive experience by using real-time rendering similar to a game engine, so content can be generated on the fly rather than the entire universe having to be pre-built at once. In creating the short, they were able to use pre-visualization that closely matched the final images.

Some of the work was done over the Web on an iPad, to look at different views and different frame rates for some of the clips. They found that the storytelling stays the same, but the technology allows the viewer some control over motion and viewpoint. They showed some scenes from the latest Star Wars film and talked about using many of the existing assets from that movie as components in an interactive VR game.

The lab is also working on some augmented reality projects, but is waiting for a "real" headset before going too far. They are learning to combine motion capture with other footage to achieve a different kind of realistic view. Their technology allows them to build 3-D assets that incorporate over 4 million polygons and generate 400 GB of data.

Some of their work is to reduce asset sizes to more manageable volumes, since they want to convert movie assets into games and use standard asset decks across all tools. Their work keeps the shading models similar between VR and rendering for theatrical projection. The underlying methodologies use standard lighting in all tools for the various views. Individual functions like calibration, color grading, and texture optimization change, since detail in VR can be reduced when objects are at a distance.
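The distance-based reduction in texture detail is essentially standard level-of-detail selection: a distant texel covers less than a screen pixel, so lower-resolution mips are sufficient. The numbers below are made up for illustration; this is only a sketch of the idea, not the studio's tooling.

    import math

    def texture_lod(distance, base_resolution=4096, texel_size=0.001, fov_pixels=1600):
        """Pick a mip level so one texel maps to roughly one screen pixel.
        base_resolution, texel_size (metres per texel) and fov_pixels are
        illustrative assumptions."""
        # Approximate screen coverage of a single texel at this distance.
        pixels_per_texel = (texel_size / max(distance, 1e-6)) * fov_pixels
        # Each mip level halves resolution; drop levels until ~1 pixel per texel.
        lod = max(0.0, -math.log2(max(pixels_per_texel, 1e-6)))
        return min(lod, math.log2(base_resolution))

    print(texture_lod(1.0))    # close up: full-resolution texture (mip 0)
    print(texture_lod(50.0))   # far away: several mip levels can be dropped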

For example, a lightsaber needs to have full detail all the time, since it may be held close to the head at times. However, that fine geometric detail can cause aliasing at some working distances, so the assets need compromises to control the aliasing. Also, the blade's motion trails become too short when the standard physics runs at the VR frame rate of 90 FPS, so they have to add motion blur to extend the trails. There is a significant difference between the 24 FPS of cinema and the 90 FPS used in VR.
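The effect of frame rate on the trails comes down to simple arithmetic: a trail that only covers one frame of travel is almost four times shorter at 90 FPS than at 24 FPS, so synthetic blur has to stretch it back out. The swing speed and blur factor below are hypothetical, purely to show the scale of the difference.

    def trail_length(speed_m_per_s, fps, blur_frames=0.0):
        """Length of a motion trail spanning one frame of movement, plus an
        optional blur extension measured in extra frames of travel."""
        frame_time = 1.0 / fps
        return speed_m_per_s * frame_time * (1.0 + blur_frames)

    blade_speed = 8.0  # m/s, an assumed swing speed
    print(trail_length(blade_speed, 24))         # ~0.33 m at cinema frame rate
    print(trail_length(blade_speed, 90))         # ~0.09 m at VR frame rate
    print(trail_length(blade_speed, 90, 2.75))   # blur stretches it back to ~0.33 m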

The hardware for this work was a custom multi-GPU environment, with custom-designed hardware and software. The renderer takes advantage of the multiple GPUs by doing alternate-frame rendering, interleaving work across GPUs to improve latency. They tried split-frame rendering but found it too complicated for most systems; it is possible in their custom system through manual scheduling of the GPUs, and split-frame rendering has much lower latency than full-frame rendering. The VR content is output through SLI, using one GPU per eye. They encode the frames as 4:2:2, with about 0.6 ms of processing time and an additional 0.1 ms for compression. Future work will attempt to automate the job scheduling for the interleaved VR images.
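As a rough picture of the one-GPU-per-eye split (the actual system was custom hardware and software; the scheduler below is only an assumed sketch, and render_eye stands in for whatever device-affinity call the real engine makes):

    from concurrent.futures import ThreadPoolExecutor

    # One GPU per eye, as described in the talk; the mapping and scheduling
    # details here are assumptions for illustration.
    EYE_TO_GPU = {"left": 0, "right": 1}

    def render_eye(gpu_id, frame_index, eye):
        """Placeholder for submitting one eye's work to a specific GPU."""
        return f"frame {frame_index}: {eye} eye on GPU {gpu_id}"

    def render_frame(frame_index, pool):
        # Submit both eyes in parallel so the two GPUs work on the same frame.
        jobs = [pool.submit(render_eye, EYE_TO_GPU[eye], frame_index, eye)
                for eye in ("left", "right")]
        return [job.result() for job in jobs]

    with ThreadPoolExecutor(max_workers=2) as pool:
        for i in range(3):
            print(render_frame(i, pool))

Automating this kind of job scheduling across the interleaved frames, rather than assigning it by hand, is the future work the talk mentioned.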

They found that some of the visual effects have to change for VR. The high frame rate can make dynamic lighting and shadow effects look poor in VR, so fixed lighting with no dynamic shadows is preferred. The VR mode still allows many visual enhancements and better dynamic detail. The biggest difference is that it is better to pre-render static lighting on the detailed elements and to skip re-rendering objects that have not changed.
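A minimal way to picture "pre-render static lighting and skip unchanged objects" is a cache keyed by whether an object moved since the last frame; everything here, names included, is a hypothetical sketch rather than ILMxLAB's renderer.

    class SceneObject:
        def __init__(self, name, is_static):
            self.name = name
            self.is_static = is_static
            self.cached_lighting = None   # pre-rendered lighting result
            self.dirty = True             # needs shading this frame?

    def shade(obj):
        # Stand-in for the expensive lighting computation.
        return f"lighting for {obj.name}"

    def render_frame(objects):
        """Re-shade only objects that changed; static objects reuse their cache."""
        output = []
        for obj in objects:
            if obj.dirty or obj.cached_lighting is None:
                obj.cached_lighting = shade(obj)
                obj.dirty = not obj.is_static  # dynamic objects re-shade next frame
            output.append(obj.cached_lighting)
        return output

    scene = [SceneObject("lightsaber", is_static=False),
             SceneObject("hangar wall", is_static=True)]
    render_frame(scene)   # first frame: everything is shaded
    render_frame(scene)   # later frames: only the lightsaber is re-shaded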
 
