Mo-Sys Engineering has solved a major creative limitation of using an LED volume for virtual production. Cinematographers who needed to pull focus between real foreground objects – such as actors – and virtual objects displayed on an LED wall – such as a deer far away in the virtual world, as in the example in the video below – have been unable to do so, because the lens focal plane stops at the LED wall, meaning the deer always remains out of focus.
For the first time, an LED wall can be used as more than just a backdrop and can instead integrate with the real stage. Cinematographers can now seamlessly rack focus deep into the virtual world and create layered images that enhance their visual storytelling, saving time and money by combining shots in a way they are already familiar with.
James Uren, Technical Director at Mo-Sys and inventor of Cinematic XR, explains: “Traditionally, the LED wall has been a passive part of the wider set, but we are now empowering cinematographers to turn this feature into an asset that enhances and strengthens the overall story.”
This is achieved intuitively using the same wireless lens control systems commonly used in film-making, and it is compatible with Preston wireless lens controllers (Hand Unit 3 and MDR-3). The lens controller is synchronized with the Unreal Engine graphics output and works with Mo-Sys’ StarTracker camera tracking technology, which continuously measures the distance between the camera and the LED wall.
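The behaviour described above can be sketched in a few lines: when the focus pull crosses the wall plane, the physical lens parks on the wall and the remaining focus distance is handed to the engine's virtual depth of field. This is a minimal illustrative sketch, not Mo-Sys' actual implementation; all names and the simple split logic are assumptions.

```python
def split_focus(requested_focus_m: float, wall_distance_m: float):
    """Hypothetical sketch of splitting a focus pull between the physical
    lens and the virtual (in-engine) camera.

    requested_focus_m -- focus distance dialled in on the hand unit
    wall_distance_m   -- camera-to-LED-wall distance (e.g. from tracking)
    Returns (physical_focus_m, virtual_focus_m).
    """
    if requested_focus_m <= wall_distance_m:
        # Target is in front of the wall: the real lens does all the work,
        # while the engine keeps the virtual scene focused at the wall plane.
        return requested_focus_m, wall_distance_m
    # Target lies beyond the wall: the real lens holds focus on the LED
    # wall itself, and the excess distance drives the engine's depth of
    # field so the virtual object (the deer) comes into focus.
    return wall_distance_m, requested_focus_m
```

Because the camera-to-wall distance is updated continuously by the tracking system, the hand-off happens transparently as the operator racks through the wall plane in either direction.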
Cinematic XR Focus is just one of many key capabilities that Mo-Sys is adding to VP Pro XR, its purpose-built, scalable XR server that enables the use of traditional shooting techniques within virtual productions.
See Cinematic XR Focus in action in the video below: