Game Engines Move to Real-time Video Tasks
April 5, 2016, GPU Technology Conference, San Jose, CA—Rora Van den Bulcke and Thomas Soetens from Immersive Design Studios talked about the use of GPUs and projection systems as media sources for very large area displays in public venues. The latest capabilities enable public VR.
VR is interesting for artists, but in public spaces it should not require a headset. There is a large gap between the capabilities of personal headsets and real-world public or group video experiences. Filling a full 360-degree space with audio and video is possible with multiple projectors and GPUs.
Now, game engines have progressed to the point where they can perform visualization and virtual staging, and a single high-powered PC can replace a room full of servers and specialized software. A custom media-server setup can cost over $3 million, consists of multiple boxes, and requires very complex software, yet delivers fairly low-end large-scale video.
Immersive's Canvas can bridge the AV requirements of public displays and the real-time, on-demand functions of a game engine. Canvas integrates game technology, capture capabilities, and other functions into a single package. The overall system combines game engine image generation and video playback in a full video pipeline that mixes live video and VR assets.
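Conceptually, mixing captured video with real-time rendered assets comes down to per-pixel compositing. A minimal alpha-over sketch of the idea follows; the pixel values and alpha are placeholders, not details of Canvas's actual pipeline:

```python
# Minimal "over" compositing sketch: a game-engine render layer with an
# alpha value is blended over a captured video frame. All values here
# are illustrative placeholders, not Canvas's real pipeline.
def over(fg, fg_alpha, bg):
    """Composite one foreground pixel over a background pixel (RGB, 0..1)."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha) for f, b in zip(fg, bg))

video_pixel = (0.2, 0.4, 0.6)    # captured video frame pixel
render_pixel = (1.0, 0.0, 0.0)   # game-engine rendered pixel
alpha = 0.25                     # render layer opacity

print(over(render_pixel, alpha, video_pixel))
```

In a real pipeline this blend runs per pixel on the GPU, with the alpha channel coming from the rendered VR assets.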
The systems are already in use for corporate and sporting events. In Mumbai, a virtual showroom on a 15 x 35-foot screen shows interior and exterior views of rooms for sale in the newest tower building. The high-resolution images scale 1:1 with the setting, so potential buyers can walk around their space and see themselves in the room. The display runs at three times HD resolution at 60 fps. The system lets the developers present retina-quality VR images and place the buyers in the scene and environment, and multi-channel audio enhances the sense of immersion in the mixed real and VR environment.
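For a rough sense of scale, "three times HD at 60 fps" implies a substantial pixel throughput. A back-of-the-envelope calculation, assuming three 1920x1080 channels and 8-bit RGBA (the talk did not specify the exact layout or pixel format):

```python
# Back-of-the-envelope throughput for a "3x HD at 60 fps" display.
# Assumes three 1920x1080 channels and 8-bit RGBA pixels; the talk
# did not specify the exact layout or format.
width, height = 1920, 1080   # one HD channel
channels = 3                 # "three times HD"
fps = 60
bytes_per_pixel = 4          # 8-bit RGBA

pixels_per_frame = width * height * channels
throughput_gb_s = pixels_per_frame * fps * bytes_per_pixel / 1e9

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{throughput_gb_s:.2f} GB/s of raw framebuffer traffic")
```

Roughly 6.2 million pixels per frame and about 1.5 GB/s of raw framebuffer traffic, which is comfortably within reach of a single modern GPU.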
In sports and entertainment, an enhanced system with 12 projectors produces crowd-level VR. The NHL has a system in the Bell Centre in Montreal that projects up to 240k lumens of high-resolution images on the ice and other surfaces. The system can be gen-locked with other video to allow mixed, real-time images on the ice. For example, the system tracks the Zamboni and uses it as a paintbrush to reveal portions of the virtual images (think of the circle-out transition in PowerPoint). The images can be further animated after the Zamboni leaves the ice.
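The Zamboni-as-paintbrush effect amounts to accumulating a reveal mask from the machine's tracked position. A minimal sketch of the idea follows; the grid size, brush radius, and sample path are illustrative assumptions, not details from the talk:

```python
# Minimal reveal-mask sketch: a tracked position "paints" pixels of a
# virtual image onto the ice, like a circle-out transition. Grid size,
# brush radius, and the sample path are illustrative only.
W, H = 64, 32            # coarse mask grid standing in for the ice surface
BRUSH = 6                # reveal radius around the tracked position
mask = [[0.0] * W for _ in range(H)]

def paint(x, y):
    """Mark every cell within BRUSH of the tracked position as revealed."""
    for j in range(max(0, y - BRUSH), min(H, y + BRUSH + 1)):
        for i in range(max(0, x - BRUSH), min(W, x + BRUSH + 1)):
            if (i - x) ** 2 + (j - y) ** 2 <= BRUSH ** 2:
                mask[j][i] = 1.0

# Feed a sample tracked path; in production the positions would come
# from the real-time tracking system, one update per video frame.
for step in range(0, W, 2):
    paint(step, H // 2)

revealed = sum(v for row in mask for v in row) / (W * H)
print(f"{revealed:.0%} of the surface revealed")
```

The compositor then multiplies the virtual image by this mask each frame, so content appears only where the Zamboni has already passed.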
Other NHL sites are using up to 20 projectors and the Unreal engine for both in-game and between-period displays. The system enables live rendering from the game engine to the projection system. Another benefit of the single-PC model is that the developer can pre-install the content and preview it on a standard VR headset.
The capabilities of the latest GPUs now make it much simpler to project live images on any surface. Expect to see many more corporate-level events that use this combination of live video and VFX from a game engine that renders the images in real time.