The late, great director Alexander MacKendrick once said that the director’s job is to direct attention. Indeed, the ability of the director and DP to lead the viewer’s eye to the important part of the frame is the very foundation of movie-making.
One of the major tools in the DP’s toolbox is the ability to shift that attention by pulling focus and moving the visual plane to a new part of the picture.
Today’s emerging virtual production technology has highlighted how important such techniques are. DPs moving from traditional film production to the new virtual frontier have discovered certain limitations, and innovators have been working to overcome them.
One such problem comes with the use of LED volumes. Photo-realistic graphics on multiple LED panels surrounding the real objects in the scene can look amazingly life-like, but until now they have been mere backdrops, unable to become fully enmeshed with the scene the way a real location can. Camera tracking means you can move around the foreground objects while maintaining perspective and parallax, as showcased recently on The Mandalorian.
But what could not be done, until now, was a focus-pull into the computer graphics. The LED volume was essentially a single fixed plane: once focus reaches the plane of the LEDs, there is nowhere left to go.
Mo-Sys Engineering, using some ingenious and complex technology, has solved the problem. Mo-Sys first rose to fame for creating the camera robotics used in ‘Gravity’. The company has garnered further industry acclaim for the Mo-Sys StarTracker camera tracking system, which tells the CG computer exactly where the camera is in three-dimensional space. It gained plaudits for combining lens data with real-time positioning, allowing perspective and parallax to be recreated in fine detail.
Mo-Sys has now added the ability to seamlessly pull focus between the real and virtual scene, whilst continually monitoring the camera’s position relative to the LED wall. The feature is called Cinematic XR Focus. Technically this is far from trivial: it requires considerable extra processing of the signal. The Mo-Sys VP Pro software, which is fully integrated with the Unreal graphics engine, has to simulate the change of focus in real time.
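Simulating a focus change means rendering virtual objects with the defocus blur a real lens would produce. The standard thin-lens circle-of-confusion formula gives the blur-disc size for an out-of-focus point; the sketch below illustrates the optics only and is not Mo-Sys's actual implementation (function and parameter names are illustrative):

```python
def circle_of_confusion(focus_dist_mm, subject_dist_mm, focal_len_mm, f_number):
    """Diameter (mm) of the blur disc for a point at subject_dist_mm
    when the lens is focused at focus_dist_mm (thin-lens model)."""
    aperture_mm = focal_len_mm / f_number  # entrance pupil diameter
    return (aperture_mm
            * abs(subject_dist_mm - focus_dist_mm) / subject_dist_mm
            * focal_len_mm / (focus_dist_mm - focal_len_mm))

# A 50mm f/2 lens focused at 2m: how blurred is a virtual object 5m away?
blur_mm = circle_of_confusion(2000, 5000, 50, 2.0)  # roughly 0.38mm on the sensor
```

A renderer can convert this diameter into a per-pixel blur radius, which is why pulling focus into the graphics scene demands real-time processing.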
Most importantly, this solution has been developed with a practical use in mind. Focus pullers instinctively use Preston wireless lens controllers, set up for the specific lens mounted on the camera. The Mo-Sys solution takes the control signals from the Preston and allows the focus puller to move both the real and virtual focus plane up to the LED wall, then seamlessly shifts to controlling just the virtual focus plane in the virtual graphics scene.
DP Brett Danton has tried the system on a test shoot in the UK. “This system opens up the XR image to work like a three-dimensional scene by adding the last element: depth,” he said. “Previously you had parallax, but now you can focus through the screen, giving far greater freedom in creativity, where the scene is reacting as if shooting on location.”
James Uren, CTO of Mo-Sys, said “There were a lot of problems we needed to solve, like moiré patterning between the grid of the camera sensor and the grid of the LED walls, as well as freeing the DP to move the camera and the focus puller to manipulate the lens, simultaneously, in real time for final-pixel shoots.
“But clever technology is no use if it gets in the way of the creativity,” Uren added. “So we made sure it works the way that people are used to, with the equipment they are comfortable with. The Preston controller reports focus as it happens, whether that is in the lens or in the computer. This software extension sits inside our graphics manipulation software, which in turn sits inside the Unreal graphics engine using nDisplay to drive the LED volume – the set-up used by most in this sector.
“We are giving an important part of the creativity, of the language of production, back to the people who are actively demanding it.”