Democratization of virtual production is driving an evolution
in the technology. As VFX becomes a greater part of the industry, virtual production aims to bridge the growing divide between what filmmakers can see through the camera on set and what they must imagine will be added digitally many months later.
Definition magazine takes a look at how these new tools work and interviews Michael Geissler to find out more about the challenges the virtual production industry is facing and how new advances in technology from Mo-Sys are helping overcome them.
Mo-Sys’ Cinematic XR initiative is aimed at driving change in the image quality of LED panels. Purpose-built for cinematic and broadcast use, and designed by expert engineers, the solution improves final-pixel Unreal Engine image quality.
Geissler explains: “There are two extremes of Unreal graphics quality. In final-pixel LED volume shoots, you sacrifice Unreal image quality for immediacy. That is, you can’t turn the Unreal quality dials up without dropping below a real-time frame rate. At the other end of the scale, post-production compositing enables non-real-time rendering with all the Unreal quality dials at maximum, but at the expense of time and cost. Mo-Sys’ new NearTime rendering combines the immediacy of final pixels with graphics quality approaching offline compositing, stretching rendering time in a patented and automated workflow.”

Definition, September 2021
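The real-time constraint Geissler describes comes down to simple arithmetic: to hold a given frame rate, every frame must render within a fixed time budget. The sketch below (illustrative only, not Mo-Sys or Unreal Engine code) shows that budget at common production frame rates.

```python
# Illustrative arithmetic only: sustaining a frame rate means each frame
# must finish rendering within a fixed millisecond budget (1000 / fps).
# Turning quality "dials" up past this budget drops the output below real time.

def frame_budget_ms(fps: float) -> float:
    """Maximum per-frame render time (ms) to sustain the given frame rate."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 24 fps -> 41.7 ms per frame
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```

This is why a NearTime-style approach can raise quality: by deferring the final render instead of delivering it live, the per-frame budget is no longer capped at tens of milliseconds.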