What is Virtual Production, and why would I use it?
Mo-Sys Engineering’s Philippe Vignal uncovers the exciting opportunities and what brands need to understand about virtual production.
For decades we’ve been shooting talent or products in green screen studios and adding (compositing) photo-realistic 3D graphics in post-production to finalise the shot. This is the foundation of a visual effects (VFX) shot. It was, and still is, very much a two-stage process. Virtual production in most cases makes this a one-stage process, by enabling the VFX shot to be captured on set, in camera, in real time. The process saves time and money and is far more efficient, but it requires more preparation.
The enablers for virtual production are real-time photo-realistic graphics engines (e.g. Epic Games’ Unreal Engine, or Unity’s 3D engine) and precision camera and lens tracking. Using these technologies means that we can put talent or products into any photo-realistic environment we like, whenever we like, even change the environment at a flick of a button.
Here’s the bottom line. Brands, both big and small, are looking for ways to reduce cost and increase the speed at which they produce content, because audiences have become insatiable and expect to be fed new material daily.
Flying all over the world to different sets across numerous locations and using a plethora of resources as well as a large roster of talent is no longer cutting it. The majority of brands want to move away from this model. And it’s virtual production tools that are paving the way.
Technology advancements mean we can now immerse talent in environments that were previously impossible, or prohibitively expensive to build as physical sets. And virtual production lets us achieve this seamlessly and cost-effectively.
No longer tied to the laws of physics, you don’t have to chase that golden hour any more for the perfect shot. You can have that golden hour last all day if you need it to. You can have true flexibility.
In traditional VFX production, you create your finished 3D scenes or assets ready for post-production. In virtual production, your finished 3D assets are created in pre-production, ready to be used at full quality during production, meaning that post-production compositing is either removed completely or significantly reduced and optimised. The net gain is that the overall time taken to create the finished commercial is reduced.
Above all, virtual production is a craft that allows brands to tell their stories in new and enticing ways. And as more and more creators adopt the technology, now is the time to get to grips with what it entails and understand exactly how you can harness this tool as a brand.
Bring the Set to Life with LED Volumes
Virtual production is all about real-time VFX. It can be done in a green or blue screen studio, with no screen at all (for augmented reality), or in an LED volume – the latest trend across the feature film and TV sectors.
Shooting in an LED volume involves capturing in-camera visual effects (ICVFX) shots of talent or products against photo-realistic 3D graphics displayed on the LED wall. The virtual graphics on the wall change in real time, providing the correct perspective view for the tracked camera and its lens settings.
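The perspective correction that keeps the wall convincing can be illustrated with a simplified sketch: given the tracked camera position, compute the asymmetric ("off-axis") view frustum through the wall. This is an illustrative toy, not Mo-Sys or Unreal code; it assumes a flat wall in the z = 0 plane, whereas real volumes are curved and use the engine's own projection system.

```python
# Simplified sketch (illustrative only): derive the off-axis frustum
# for a tracked camera looking at a flat LED wall in the z = 0 plane.
# Real LED volumes are curved and handled inside the graphics engine.

def off_axis_frustum(cam_pos, wall_l, wall_r, wall_b, wall_t, near):
    """Return (left, right, bottom, top) extents of the near-plane
    frustum for a camera at cam_pos viewing a wall that spans
    [wall_l, wall_r] x [wall_b, wall_t] in the z = 0 plane."""
    cx, cy, cz = cam_pos
    if cz <= 0:
        raise ValueError("camera must be in front of the wall")
    scale = near / cz  # similar triangles: near plane vs. wall plane
    left   = (wall_l - cx) * scale
    right  = (wall_r - cx) * scale
    bottom = (wall_b - cy) * scale
    top    = (wall_t - cy) * scale
    return left, right, bottom, top

# A centred camera yields a symmetric frustum...
print(off_axis_frustum((0.0, 0.0, 2.0), -1.0, 1.0, -1.0, 1.0, 0.1))
# ...while a tracked camera that moves sideways skews the frustum,
# which is what keeps the on-wall perspective correct every frame.
print(off_axis_frustum((0.5, 0.0, 2.0), -1.0, 1.0, -1.0, 1.0, 0.1))
```

Re-deriving this frustum every frame from the tracking data is what makes the background "move" correctly behind the talent as the real camera moves.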
Everyone is asking whether they should be shooting in an LED volume instead of on location. But we must be careful – it is not a cure-all.
Sometimes a location shoot is the right solution. Sometimes an LED volume is. Sometimes you’ll need to combine an LED volume shoot with a green screen shoot. So brand marketers need to work with content production facilities that can offer that full set of tools, along with the knowledge to use them properly for virtual storytelling.
When you’re on set shooting virtual content on an LED volume, there’s always a trade-off between the complexity of the virtual scene and the power of the workstations and graphics cards rendering it in real time so that the camera can capture the VFX shot in-camera.
Graphics cards can only calculate so much in real time, so to maintain real-time playback frame rates on set you sometimes have to compromise by reducing the quality of your virtual elements. The background elements you are immersing your talent into won’t be as high quality as you would want them to be. That is exactly the problem our patented NearTime rendering system is designed to solve.
During a real-time virtual production shoot with NearTime enabled (LED or green screen), as soon as a take starts, the camera tracking data and lens data are automatically sent to the cloud where the same virtual scene exists on multiple servers.
Using the tracking data, and with all the quality dials turned up, the virtual scene is re-rendered and returned to set, or to wherever shots are stored. There it is combined with the separated talent or product, and the composite shot is made ready for approval. The whole process is completely automated, and the re-rendering costs are minimal. Importantly, NearTime enables a real-time virtual production shoot to deliver higher quality images within the same time frame.
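The workflow described above can be sketched in outline: record tracking and lens data per frame during the take, then replay that camera path through the same scene at full quality. This is a hypothetical illustration only; every name in it is invented for the sketch, as Mo-Sys’ actual NearTime interfaces are not public.

```python
# Hypothetical sketch of a NearTime-style workflow. All function and
# class names here are illustrative stand-ins, not a real API.

from dataclasses import dataclass

@dataclass
class TrackingSample:
    frame: int
    position: tuple        # tracked camera position (x, y, z)
    focal_length_mm: float # tracked lens state

def record_take(num_frames):
    """Stand-in for on-set capture: one tracking sample per frame."""
    return [TrackingSample(f, (0.0, 1.5, 4.0 - 0.01 * f), 35.0)
            for f in range(num_frames)]

def rerender(samples, quality="final"):
    """Stand-in for the automated cloud re-render: replays the recorded
    camera path through the same virtual scene at full quality."""
    return [f"frame {s.frame}: {quality} render @ {s.focal_length_mm}mm"
            for s in samples]

def neartime_pipeline(num_frames):
    samples = record_take(num_frames)  # 1. capture tracking + lens data
    frames = rerender(samples)         # 2. automated full-quality re-render
    return frames                      # 3. ready for compositing/approval

print(len(neartime_pipeline(48)), "frames re-rendered")
```

The key idea the sketch captures is that the expensive render is driven entirely by data already recorded on set, which is why the process can be fully automated.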
Shoot for Global Markets from Just One Location
One of the biggest benefits of virtual production is that you don’t have to be on location to shoot content. Not only that, but you can very easily change your virtual background and use 3D Unreal-based elements to help with versioning work.
Take brands such as P&G or Unilever, who often need to create many versions of content for one product. With virtual production, they can shoot a piece of content for their European markets against one background, then virtually change the location and product label to shoot for their Asian and American markets – all from one place during the same shoot.
You can also change the actual product – the packshot – and map on different local labelling, perhaps using the same actor to do multiple language versions through artificial intelligence.
Suit All Budgets
With developments in technology, virtual production is now more accessible than ever. Certain phones can now LiDAR-scan real objects for import into a 3D environment. Only a few years ago this required very specialised tools; now it fits in a handheld device. This accessibility has brought costs down noticeably.
This is also true of software. We’re now well versed in tools that used to be reserved for the gaming industry: game engines like Unreal and Unity have become ubiquitous in advertising, feature film, and broadcast content production.
You can now also generate your own avatar to be immersed into virtual environments and interact with real elements. This would have taken a lot of production work a few years ago, and now it is much more readily accessible to a wider audience.
With this advanced technology packed into handheld devices, creators such as bloggers have even started to adopt virtual production combined with real-time motion capture. This is made possible through products like Mo-Sys’ StarTracker Studio, the world’s first all-in-one virtual production system for green screen production. It comes with all the necessary cameras, trackers, software, and hardware to do virtual production, even where studio space is limited.
So, How Should A Brand Prepare for A Virtual Production Shoot?
At the base of all brand communication is creativity. And at the base of all virtual production is creativity. That will never change. Virtual production is not a cure-all or a way to cut corners. If your content is devoid of creativity, it will be as barren in a virtual production environment as in a live production environment. So you need creative people on your team who are versed in Unreal and who understand production.
We are now seeing an increasing presence of hybrid profiles – creative people who also have technical knowledge. They understand Unreal and how it can be used in a production environment. They understand film or broadcast. And they are curious enough to have trained in film and storytelling alongside these technologies.
It is also important to remember that virtual production is a team sport. You cannot achieve it on your own, just as you cannot achieve live production on your own. It is a craft. Brands need people who understand virtual production, in the same way they need people who understand digital marketing. Both address the faster pace of content production now required.
Work with highly trained, highly skilled, curious individuals that can help you understand what’s entailed and who will help you liaise with either your own production departments in-house, or with your production and creative agencies.
Whatever you do, be well informed. Do not rely blindly on external knowledge; everyone involved needs to understand the technical requirements of virtual production. Work with consultants, such as the ones we have at Mo-Sys, who can help you navigate this new world of storytelling and technologies.
We are seeing rapid adoption of virtual production tools. In London alone, over a dozen large LED volumes are currently being built. We’re going to see increased use of virtual characters, avatars, and AR content in entertainment, along with hybrid work in which real-life humans interact with virtual humans.
People are curious by nature, and we will always explore new things. Right now we’re fascinated with reproducing ourselves as metahumans in the virtual world. Where that will take us, I don’t know. But I do hope we keep our common sense along the way and remember our humanity: the real purpose of all this is to connect with other humans, not to isolate ourselves from the rest of humanity.
Curiosity combined with storytelling will be the focus over the next thousand years or so.
Read the full article published by Little Black Book March 31, 2022