The Future of Virtual Production … and Filmmaking with Holographic Display

Jan 28, 2021

Guest blog by Mo-Sys partner company VividQ

The gaming industry was among the earliest adopters of new immersive technologies, including virtual reality (VR) and augmented reality (AR). The shared use of CGI, VFX and game engines for content creation meant that game developers could easily pivot to creating groundbreaking new experiences for consumer audiences. With the increased use of CGI and VFX in movies, filmmaking is another creative industry beginning to realise the great promise of immersive technologies. Recent developments in AR display and tracking, in particular, are likely to pave the way for virtual production to transform filmmaking as we know it today.

“If everything moves along and there’s no major catastrophe, we’re heading towards holograms.”

Martin Scorsese on the future of filmmaking (2011)

Within the traditional filmmaking process, directors and film crews employ a series of processes to define the project’s scope at three points: before the film is produced, while the film is being shot, and during post-production. With the rise of new technological developments, however, directors are increasingly using ‘virtual production’ to combine live-action footage and computer graphics in real time. In an episode of the Unreal Engine ‘Visual Disruptors’ podcast, David Morin, Head of Epic Games LA Lab and chairman of the Virtual Reality & Virtual Production Committees, defined virtual production as “the process of creating a digital world which starts at the inception of a movie and ends with the final visual effects, centred on a real-time interaction on set.”

Using virtual production techniques, directors not only have the creative flexibility to make quick, on-set decisions during production, but also gain an entire process for creating a virtual world from scratch. In the pre-visualisation stage of a film’s production, a director or creative team can review visual renders of a specific scene. By visually mapping out scenes with virtual production techniques, directors and set designers have more control to make decisions that shape the overall style and mood of the finished content.

Introducing Computer-Generated Holography into the creative process

By engineering light, computer-generated holography (CGH) enables a user to experience digital content that appears at different distances. These holographic images are inherently three-dimensional and can be projected through a head-mounted device (similar to the Microsoft HoloLens) or through holographic AR smartglasses.

So how can this next-generation display technology be deployed in the movie industry? Imagine the following scenario. A film director and set designers work on a new film and are in their project’s early pre-visualisation stages. The crew are all wearing holographic AR smartglasses to visualise the virtual set to collectively make crucial decisions in advance of building the physical set. The physical space they are in serves as a creative blank canvas, and the set designers can easily scan the room and map out all the surfaces they can use. The director can place, move and customise highly realistic holographic objects around the space in real-time, taking important information such as lighting and the actors’ placement into consideration. Normally, the director would need to commission new props or set designs at this stage, costing valuable time and money. However, by using holographic AR smartglasses, they can make smart performance-based decisions to accurately tailor their shots on set, eliminating the need for lengthy post-processing or expensive production reshoots.

There are a few reasons why CGH meets the requirements for virtual production to be successful:

Accurate depth of field

Depth of field – the apparent defocus between objects at different distances – is hugely important when discussing any long-term consumer or enterprise AR use case. Without it, virtual images in AR do not integrate naturally with the real world and appear disconnected from their surroundings. The viewer needs to be able to focus and defocus their eyes depending on where the digital content is placed, for example between the foreground and the background. Today, most 3D and AR devices rely on stereoscopic displays, which do not provide viewers with natural focus cues. Stereoscopic virtual objects lie on a single focal plane, causing a feeling of disconnect and, often, nausea.
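To make this concrete, here is a minimal, illustrative Python sketch (with hypothetical distances, not tied to any particular device) of the mismatch a fixed-focus stereoscopic display creates: the eyes converge on the virtual object, but must focus on the screen’s single focal plane, and the conflict can be expressed in dioptres.

def focal_mismatch_dioptres(screen_distance_m: float, object_distance_m: float) -> float:
    """Difference between the accommodation demand of the screen
    (where the eyes must focus) and the vergence demand of the
    virtual object (where the eyes converge), in dioptres."""
    return abs(1.0 / screen_distance_m - 1.0 / object_distance_m)

# A headset with a single focal plane at 2 m showing content at 0.5 m:
print(focal_mismatch_dioptres(2.0, 0.5))   # 1.5 D of conflict: discomfort
# A holographic display can place the image at the true depth:
print(focal_mismatch_dioptres(0.5, 0.5))   # 0.0 D: natural focus cues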

In virtual production, depth is critical for placing virtual ‘props’ accurately around the scene. Paired with advanced tracking systems, AR devices using CGH can match the environment precisely, with digital content appearing at any depth plane. Integrating tracking systems into AR wearables gives users unlimited freedom of movement and flexibility around a scene, at full depth. Tracking technologies, such as the Mo-Sys StarTracker VR, allow limitless, multi-user free-roaming experiences within AR and VR environments. StarTracker uses an inside-out system, in which the device finds its position in space by tracking small reflective stickers in real time. In a virtual production scenario, this allows virtual objects to be tracked and placed at accurate depths over large physical distances in real time. Virtual objects can be ‘docked’ in physical and virtual space, meaning they remain frozen in place when the user looks or walks away. Accurate and reliable placement of virtual objects allows a director to see exactly where the eventual film viewer will be focusing their attention.
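As a rough illustration of the ‘docking’ idea (a hypothetical sketch, not the StarTracker API), the following Python code represents poses as 4x4 homogeneous transforms: a prop placed relative to the headset is re-expressed once in the fixed world frame, after which it stays put however the headset moves.

import numpy as np

def dock(world_from_head: np.ndarray, head_from_prop: np.ndarray) -> np.ndarray:
    """Freeze a prop in the world frame at the moment of docking."""
    return world_from_head @ head_from_prop

def prop_in_view(world_from_head_now: np.ndarray, world_from_prop: np.ndarray) -> np.ndarray:
    """Where the docked prop appears from the current headset pose."""
    return np.linalg.inv(world_from_head_now) @ world_from_prop

# Place a prop 2 m in front of the headset, dock it, then walk 1 m left:
head_from_prop = np.eye(4); head_from_prop[2, 3] = 2.0
world_from_head = np.eye(4)                       # headset at the origin
world_from_prop = dock(world_from_head, head_from_prop)

moved = np.eye(4); moved[0, 3] = -1.0             # headset translated 1 m left
print(prop_in_view(moved, world_from_prop)[:3, 3])  # prop unchanged in world space

The key point is that the prop’s world pose never changes after docking; only the headset’s view of it does, which is exactly what inside-out tracking makes possible.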

Realistic Image Quality from a variety of 3D Data Sources

When we view the world around us, we are actually observing complex patterns of light reflected from objects. Our visual systems interpret these complex light patterns as the objects we see. Using holography, we can engineer the exact complex light pattern that is reflected off a real object and display it to an observer. The engineered light pattern is visually indistinguishable from the real object. Computer-generated holography creates holograms digitally from various 3D data sources such as game engines, depth-sensing cameras and CAD software. Therefore, this method uses software to create a light pattern of an object without using a real object at all. This allows the creation of dynamic and interactive holographic virtual images.
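The following Python sketch illustrates the classic point-cloud approach to computer-generated holography, in which each 3D point contributes a spherical wavefront to the complex light field at the hologram plane. The parameters are illustrative only; production pipelines such as VividQ’s use far finer sampling and more sophisticated algorithms.

import numpy as np

wavelength = 532e-9                 # green laser, metres
k = 2 * np.pi / wavelength          # wavenumber

# Hologram plane: a small grid of sample positions (metres)
n, pitch = 256, 8e-6
coords = (np.arange(n) - n / 2) * pitch
X, Y = np.meshgrid(coords, coords)

# A toy 'scene': three points at different depths (x, y, z, amplitude)
points = [(0.0, 0.0, 0.10, 1.0),
          (2e-4, 1e-4, 0.12, 0.8),
          (-1e-4, -2e-4, 0.15, 0.6)]

# Sum the spherical wavefront emitted by each scene point
field = np.zeros((n, n), dtype=complex)
for px, py, pz, amp in points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    field += amp * np.exp(1j * k * r) / r

phase_hologram = np.angle(field)    # phase pattern for a phase-only modulator
print(phase_hologram.shape)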

Using software tools for 3D creation and design such as Unity and Unreal Engine, holography offers an easy way to create, use and customise high-quality virtual objects in real-time. This allows directors and their creative teams to have more freedom to make set design choices on an open canvas.

CGH also benefits from superior brightness, a wide colour gamut and high image contrast compared to the display technologies in other currently available AR headsets, producing high-quality holographic images that are easily and comfortably visible at all lighting levels.

The future of filmmaking

As holographic displays are integrated into the next generation of consumer devices, we will see the creation of complex virtual objects, scenes and characters that can be directed in the same way as real actors. In the future, CGH will be used not only for virtual production purposes but for storytelling itself. The dream of a holodeck will one day become a reality, inspiring the next generation of directors following Scorsese’s vision.

The future of virtual production and holographic filmmaking is nearer than we think. Thanks to the collaboration between VividQ and Mo-Sys, we are moving closer to bringing about a revolution in the film and entertainment industry.

Click here to learn more about our partnership with VividQ.

Click here to read the VividQ whitepaper ‘Holography: The Future of Augmented Reality Wearables’.

This article was originally posted on the VividQ website here.
