Producers: So You’ve Booked A Virtual Production? Here’s What’s Next


Mo-Sys Engineering’s product marketing manager Stephen Gallagher talks to LBB about the key differences from traditional production and how to train for a future of virtual production.

Virtual production offers tangible benefits to producers. From real-time creative freedom to accelerated production schedules, cost savings and logistical efficiencies, there are plenty of reasons to opt for a virtual production.

Take the all-important ‘golden hour’ at sunset or sunrise. In the real world that time is naturally limited, so producers have a short window of opportunity to get their shot, and with that comes immense pressure on the entire crew. For the producer in particular, delays translate directly into costs when things go wrong. In the virtual world you have complete control, so the magic of golden hour can last all day long should you need it. Rain, wind and clouds are no longer the unpredictable enemy but creative choices. You might even opt to use an unreal scene from out of this world. Similarly, if you have scenes set in the middle of the night, you no longer need the crew on set at 2am. You can recreate that look during normal working hours, meaning virtual production gives you far more control of the environment.

Additionally, there’s a massive health and safety and risk assessment benefit. If you’re shooting a night-time scene with cast moving through woods, at height, underground or near water, you can now do that in a comfortable studio environment that is safe for the crew and the talent. That is appealing and can help productions secure first-choice talent.

And what about when things go wrong and you need to reshoot a scene? If you’re filming on a boat travelling along a river, moving the boat back to the start to film again is a huge drain on time and money. On a virtual set, you simply reset to the previous scene state with the click of a button.

All of these real-world complications can be solved with virtual production, making it a more efficient and affordable shoot.

So, what are the first steps that need to be taken to prepare for an upcoming virtual production?

One of the main differences between virtual production and traditional production is that you shift a lot of the load from post-production compositing into pre-production. By doing this prep work, you reduce the cost burden of post-production.

So, the first step is to ensure that all the key people involved, including the producer and director, get together much earlier in the process to make essential decisions. Pre-viz and tech-viz work will increase so that VFX shots can be captured in real time, an approach commonly called in-camera visual effects (ICVFX). But the savings in reduced post-production compositing time, reduced VFX errors, fewer takes, and minimised travel for cast and crew more than compensate.

Right now, there is huge demand for LED virtual production. But of course, we still have green and blue screen VP, and that isn’t going to disappear! Productions are drawn to LED because talent can interact in real time with the virtual scene, which may improve their performance and reduce takes. LED VP also adds beautiful, life-like ‘scene spill’ around the outline of talent, and scene reflections on any shiny surfaces, so it lends itself to shooting cars and Mandalorian-style characters.

Lighting is an important consideration with an LED VP shoot for two reasons. The first is that an LED volume requires external lighting to supplement the light from the LED volume itself in order to render correct skin tones. The second is that the virtual lights illuminating the virtual scene displayed in the LED volume need to match the real lights on stage, so that the two environments blend together into one seamless VFX shot.
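To make that second point concrete, here is a minimal Python sketch of one small piece of the matching task: converting a real fixture’s colour temperature into an RGB tint for the corresponding virtual light. It uses the widely circulated Tanner Helland black-body approximation; how the tint is then applied to a virtual light is engine-specific and not shown here.

```python
import math

def kelvin_to_rgb(kelvin: float) -> tuple[int, int, int]:
    """Approximate RGB for a black-body colour temperature using the
    widely circulated Tanner Helland fit (useful roughly 1000K-40000K)."""
    t = kelvin / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda v: int(max(0.0, min(255.0, v)))
    return clamp(r), clamp(g), clamp(b)

# Tint a virtual light to match the real fixtures on stage:
print(kelvin_to_rgb(5600))  # daylight fixture -> (255, 239, 225)
print(kelvin_to_rgb(3200))  # tungsten fixture -> (255, 183, 123)
```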

You’re also going to have to think about the other skill sets you need around you to make all of this come together. It’s essential to have a VP Technician who understands camera tracking and the LED environment, along with a specialist in Unreal Engine to control your virtual scene, and to make scene element changes as requested by the Director.

What skills does a producer need for virtual production?

A producer’s skills are entirely transferable. It’s just about adapting your experience and skills to suit the virtual production workflow. And the greater understanding you have of the workflow, the better you’ll be able to utilise the technology.

Experience and training in virtual production is highly beneficial. At Mo-Sys Academy, for example, we run industry-recognised virtual production training courses to help people understand how virtual productions come together and how to make the most of them.

In what cases would a virtual production work particularly well?

Virtual production is a really powerful additional tool that producers can utilise. It’s not a case of it taking over completely from traditional production; rather, a combination of real-world traditional production and virtual production can be a really smart way to maximise production value, drive efficiencies and reduce costs.

For example, you might not use virtual production for establishing shots (typically much wider shots that set the scene), because you’d need such a huge studio. Instead, you might opt to shoot those in the real world, and then shoot the rest of your production, the mid and close shots, with the talent on a virtual production set.

Virtual production works really well for VFX shots that would traditionally require the crew and cast to travel to exotic locations, or where multiple scenes are required, which on a virtual production stage can be swapped very quickly.

Virtual production is a term that describes multiple shooting techniques that combine either tracked virtual graphics or plate video playback with a real studio or location, and ultimately, it’s about selecting the right technique for the right scene. Once understood, and with experienced team members to assist, virtual production offers producers a whole new arsenal for delivering highly efficient VFX shoots.

Read the full article, published by Little Black Book on August 2, 2022 >

‘I love the industry’s appetite for change’

Paul Clennell, Chief Technology Officer at dock 10, discusses recent changes in the media tech industry with TVBEurope for their July/August issue. He also shares his view on the biggest topic of discussion in his area of the industry, which has mainly centred on virtual production.

We’re very focused on the next generation of virtual production, using the latest real-time games-engine technology from Epic and Unreal Engine with Zero Density, together with the Mo-Sys StarTracker system, and encouraging entertainment formats to adopt these exciting technologies more widely over the next few years. There is a continued pressure to reduce budgets, but channels still demand new and exciting content, and this is where virtual studio elements can make a creative difference.

Paul Clennell, Chief Technology Officer, dock 10.

Read the full article here (Pg 34) >

Virtual Production and the importance of having the right tools for the job

“For decades we have been shooting talent or products in green screen studios and adding (compositing) photo-realistic 3D graphics in post-production to finalize the shot. This is the foundation of a visual effects shot (VFX). It was and still is very much a two-stage process.” Michael Geissler, CEO of Mo-Sys, talks to InBroadcast about the evolution of Virtual Production.

Click here to read the InBroadcast article (Pg 34) >

Virtual production in most cases makes this a one-stage process, by enabling the VFX shot to be captured on set, in camera, in real time. The virtual production process saves time and money and is much more efficient, but it requires a greater level of pre-production preparation, virtual production knowledge and experience, together with the right tools for the job.

The concept of virtual production is not new. Tracked virtual graphics arrived in the mid-1990s, when they were used predominantly for broadcast, albeit with limited tracking, no lens calibration and typically poor virtual image quality.

Thankfully, the story doesn’t end there. In what Mo-Sys have named Virtual Production 1.0 (VP 1.0), circa 2016, we saw the introduction of games engines capable of producing photo-realistic on-set pre-viz. The next logical step would have been real-time VFX in green/blue screen studios, but we jumped straight to VP 2.0, where production of The Mandalorian illustrated the exciting new capabilities of real-time VFX using LED volumes.

Hype, coupled with a surge in demand from the locked-down live events industry, which cleverly pivoted into virtual production for survival, meant that many early LED VP stages were using lower quality live event LED tiles, live event-oriented media servers and mo-cap tracking systems, rather than cinematic LED tiles, dedicated cinematic content servers, and precision camera tracking systems such as Mo-Sys StarTracker.

Right now, virtual production is in its third stage of evolution – VP 3.0. It has matured over the last five years: in 2020, the global market was reported to be worth in excess of £2 billion, and it is expected to grow beyond £5 billion by 2028. Mo-Sys’ experience gained from 25 years within the broadcast and film industry, together with bleeding-edge innovation, is driving VP from what it was in early 2020 to what it needs to be in order to deliver on many of the benefits originally promised.

Within broadcast and film, there is now a greater understanding of the limitations of live event LED tiles and event media servers, which were developed for entertainment applications such as projection mapping and timeline lighting shows.

Cinematographers and VFX Supervisors are demanding a dedicated, focused toolset that meets their needs and enables a closer content quality match to traditional post-production compositing quality.

Mo-Sys VP Pro XR

Mo-Sys is delivering exactly that with its award-winning VP Pro XR. Designed by in-house cinematic innovators for XR creatives, VP Pro XR is a dedicated XR server solution that takes a radical new approach to delivering cinematic standards to on-set, real-time virtual production using LED volumes. Designed for LED stages with or without set extensions, it can also be used with blue/green screens and enables traditional shooting techniques within an LED volume, with a focus on composite image quality.

Typically, directors and cinematographers must make continuous calculations to make the real and virtual elements match in post-production. This is often a costly and repetitive process. A pioneer in providing a platform for graphics rendering as well as camera and lens tracking to create higher quality virtual productions, Mo-Sys has made it even easier to produce seamless, high-end productions with VP Pro XR. Designed to enable the use of traditional shooting techniques within virtual productions, it removes the limitations on the ability to tell stories that are imposed by current XR stage designs.

Mo-Sys Multi-Cam Switching

Another major hurdle in VP and the use of LED volumes comes when switching between multiple cameras pointing at an LED volume. Whilst the camera outputs can be switched in one frame, the virtual scene displayed on the LED volume typically takes 5-6 frames to update. This means that on every camera switch there will be 5-6 frames of the previous camera’s virtual scene displayed on the LED volume before it updates with the correct perspective view for the new camera. As a result, you get a background flash on every camera switch, which is unusable in production.

The latest iteration of VP Pro XR addresses this by orchestrating the delay between the operator switching cameras and the LED volume updating with the correct perspective view of the live camera. More importantly, it does this at up to the full UHD 4K resolution of the LED processor input, whereas previous workarounds would reduce the resolution of the content to HD to achieve the same outcome.
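The orchestration logic itself is simple to picture. The Python sketch below is a hypothetical illustration, not the VP Pro XR API (which is not public): the wall is told to re-render for the incoming camera first, and the program cut is held back until the wall’s known pipeline latency has elapsed.

```python
import time

FPS = 25
LED_PIPELINE_DELAY_FRAMES = 6   # measured render + LED processor latency

class LedRenderer:                       # hypothetical stand-in
    def set_active_camera(self, cam: int) -> None:
        print(f"LED wall now rendering frustum for camera {cam}")

class VisionMixer:                       # hypothetical stand-in
    def cut_to(self, cam: int) -> None:
        print(f"Program output cut to camera {cam}")

def clean_switch(mixer: VisionMixer, wall: LedRenderer, new_cam: int) -> None:
    """Cut without flashing the old camera's background on screen:
    update the wall first, then delay the video cut by the wall's latency.
    (A real system would count genlocked frames, not wall-clock time.)"""
    wall.set_active_camera(new_cam)              # start the wall update
    time.sleep(LED_PIPELINE_DELAY_FRAMES / FPS)  # wait out the pipeline
    mixer.cut_to(new_cam)                        # now the cut is clean

clean_switch(VisionMixer(), LedRenderer(), new_cam=2)
```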

Mo-Sys Cinematic XR Focus

Along with full-resolution multi-cam switching, Mo-Sys Cinematic XR Focus enables seamless interaction between virtual and real worlds. This feature ensures that an LED wall can be used as more than just a backdrop, allowing it to integrate with the real stage. This gives cinematographers the means to seamlessly rack focus deep into the virtual world and create immersive shots which enhance their visual storytelling.

NearTime® is Mo-Sys’ HPA Engineering Excellence award-winning solution for real-time VFX virtual production image quality. It comprises a cloud-based auto-re-rendering system and Smart Green, a method of separating talent from the LED background without introducing copious green spill; combined, they deliver higher-quality real-time VFX content.

Read the full article here (Pg 34) >

One Giant Leap

In the latest issue of the DEFINITION magazine, Chelsea Fearnley discusses how SMPTE is driving the future of virtual production by bringing together the best minds in the industry to explore the most exciting advancements in media and entertainment tech.

Click here to read the full DEFINITION article (Pg 25) >

The hybrid three-day event showcasing in-camera VFX and virtual production, hosted by SMPTE and sponsored by Mo-Sys, gave viewers an interactive experience of the world of the virtual set.

A pop-up LED volume was constructed by the team behind Virtual Production Studios at 80six, and comprised cutting-edge products from 80six’s inventory, including two Roe Visual Diamond LED screens with processing by Brompton Technology, Unreal Engine, the Mo-Sys StarTracker system, Disguise’s VX content servers and an ETC lighting control console. Together, they enabled a smooth virtual production workflow.

Chelsea Fearnley, DEFINITION MAGAZINE.

Read the full article here (Pgs 25-27) >

Rob Fowler, director of business development at Brompton Technology, spoke to DEFINITION magazine about the success of the SMPTE On-set Virtual Production event in a follow-up article on pgs 28-29 of the same edition.

We already support Mo-Sys, which provided the tracking system at the SMPTE event. This means that instead of using physical markers (usually dots on floors or ceilings) to locate the tracking camera, we can embed those markers into content on the walls to be visible to the tracking camera, but invisible to the cinema camera.

Rob Fowler, Director of business development at Brompton Technology

Find out more about Brompton’s new digital tracking markers for LED volumes here >

VP Pro XR making waves in VP world

Mo-Sys’ VP Pro XR system has been turning heads since it was launched, snapping up awards and earning endorsements from some well-known names in the industry.

Pocket Films met up with Mike Grieve, Mo-Sys Commercial Director, at the ‘SMPTE UK visits… an On-set Virtual Production’ event to hear more about the technology’s success to date.

Mike Grieve being filmed at Garden Studios

Tell us a little bit about VP Pro XR and the challenges that this technology is designed to meet.

VP Pro XR, which won a Production Hub Award this year and picked up the NAB Product of the Year 2021 accolade, supports XR production in LED volumes for real-time VFX film and TV production. We designed the technology with the aim of addressing the restrictions and compromises that producers currently encounter with XR stages.

Although XR volumes deliver cost savings by minimizing post-production compositing and lowering location costs, they have also introduced shooting limitations, and an output image that some find is not yet comparable to non-real-time compositing. This is the challenge that motivated the development of VP Pro XR.

VP Pro XR is an LED content server solution, purpose-designed to offer Cinematic XR capability, with a focus on composite image quality whilst offering unique cinematic features.

VP Pro XR is the first product release under Mo-Sys’ Cinematic XR initiative. Can you tell us more about this?

The aim of our Cinematic XR initiative is to move final pixel XR production forward, in terms of image quality and shooting creativity, from its initial roots in live event LED equipment to purpose-built Cinematic XR.

Cinematic XR is focused on improving image fidelity, and introducing established cinematic shooting techniques to XR. It promotes the interaction between virtual and real set elements, and the development of new hybrid workflows combining virtual and non-real-time compositing. These capabilities are being designed into the VP Pro XR platform.

First among these capabilities is Cinematic XR Focus, which has been recognised in the Cine Gear 2021 Technical Awards for its innovative technology.

Traditionally, the LED wall has been a passive part of the wider set, but now for the first time, we are enabling an LED wall to be used as more than just a backdrop. Instead, cinematographers can integrate LED wall visuals with the real stage and seamlessly rack focus deep into the virtual world to create layered images that enhance their visual storytelling.

Cinematic XR Focus synchronises a physical focus pull with a virtual focus pull, managing the transition at the LED wall so that the complete focus-plane movement is seamless. It requires no additional action on the part of the Focus Puller, and enables them to use a standard wireless lens controller (currently Preston, with other options coming soon).
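As a rough illustration of the idea, here is a deliberately simplified Python sketch, under our own assumption (not a description of Mo-Sys’ implementation) that the focus travel is split at the LED wall: in front of the wall the physical lens focuses normally, and beyond it the lens holds the wall plane while the remaining travel is applied as virtual focus depth inside the scene.

```python
def split_focus(lens_focus_m: float, wall_distance_m: float) -> dict:
    """Split one focus-pull value between the real lens and the virtual
    scene. In front of the wall the physical lens does the work; beyond
    it, the lens holds the wall plane and the remaining travel becomes
    virtual focus depth inside the rendered scene."""
    if lens_focus_m <= wall_distance_m:
        return {"physical_focus_m": lens_focus_m, "virtual_focus_m": None}
    return {
        "physical_focus_m": wall_distance_m,                # keep LED wall sharp
        "virtual_focus_m": lens_focus_m - wall_distance_m,  # depth past the wall
    }

# A focus puller racks from 2m to 10m with the wall 4m from the camera:
for f in (2.0, 4.0, 6.0, 10.0):
    print(f, split_focus(f, wall_distance_m=4.0))
```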

NearTime is another pillar of Cinematic XR and won the 2021 HPA Award for Engineering Excellence from the Hollywood Professional Association (HPA). This feature solves one of the key challenges of LED ICVFX shoots: balancing Unreal image quality against real-time frame rates. Currently, every Unreal scene created for ICVFX has to be reduced in quality in order to guarantee real-time playback. By combining NearTime with an LED ‘smart green’ frustum, the same Unreal scene can be automatically re-rendered at higher quality or resolution, and this version can replace the original background Unreal scene.

NearTime operates in parallel with an ICVFX shoot and starts at the same time. It finishes after the real-time shoot but still within the real-time delivery window, meaning that a higher-quality result can be delivered in the same time, with minimal increase in cost. No traditional post-production costs or time are required, and moiré issues are completely avoided.
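The parallel timing is the crux. Below is an illustrative Python sketch of that workflow shape only; the real NearTime pipeline is proprietary, so submit_cloud_render, key_talent and composite are hypothetical stubs standing in for cloud re-rendering, smart-green keying and compositing.

```python
from concurrent.futures import ThreadPoolExecutor

def submit_cloud_render(scene: str, camera_track: list, quality: str) -> str:
    """Hypothetical stub for the cloud re-render of the recorded take."""
    return f"{quality} re-render of {scene}, {len(camera_track)} tracked frames"

def key_talent(live_plate: str) -> str:
    """Hypothetical stub for keying talent off the smart-green frustum."""
    return f"talent keyed from {live_plate}"

def composite(foreground: str, background: str) -> str:
    """Hypothetical stub for the final comp."""
    return f"{foreground} over {background}"

def neartime_take(scene, camera_track, live_plate, pool):
    """Start the high-quality re-render the moment the take ends, so it
    runs in parallel with the rest of the shoot instead of in post."""
    bg_job = pool.submit(submit_cloud_render, scene, camera_track, "final")
    fg = key_talent(live_plate)            # proceeds while the render runs
    return composite(fg, bg_job.result())  # swap HQ background into the comp

with ThreadPoolExecutor() as pool:
    print(neartime_take("unreal_scene_01", list(range(500)), "take_07.mov", pool))
```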

How would you describe the industry’s reaction to VP Pro XR?

We have had a phenomenal response from customers and partners to VP Pro XR. ARRI Rental UK was the first to install and use this innovative technology, along with our StarTracker system, at its new mixed reality studio in Uxbridge, west of London. This is now one of the largest permanent LED volume studios in Europe and we are so proud that Mo-Sys technology lies at the heart of it. Our technology is designed specifically for real-time final pixel shooting, to deliver cinematic quality imagery, with clever unique features and technology geared to Cinematographers.

VP Pro XR has also been deployed by London’s Garden Studios for its state-of-the-art virtual production (VP) stage installed in March 2021. Garden Studios, which already has a Mo-Sys StarTracker system and uses the VP Pro software for its main LED volume, now benefits from set extension (XR) capability, offering clients additional creative capability. The facility recently added a second StarTracker camera and lens tracking system to support its ongoing R&D and expansion.

We have also picked up a number of prestigious industry awards for VP Pro XR as well as for new features like Cinematic XR Focus and we could not be prouder. The VP sector is in an extremely exciting stage in its evolution, and we are thrilled to be at the heart of this new direction.

This article was first published on The Studio Map website.

Cinematic broadcasting

In the latest DEFINITION issue, Phil Rhodes explores the crossover between TV and film caused by the current push for more capability from broadcast cameras. The combination of the Mo-Sys StarTracker and the Unreal Engine plug-in was used on Strictly Come Dancing to create AR elements for a studio emptied of its audience due to the pandemic.

“Covid-19 accelerated implementation of new broadcast technology, with Strictly Come Dancing feeling the Mo-Sys magic”.

Phil Rhodes, DEFINITION magazine.
Click here to read the full DEFINITION article (Pg 49) >

Read more about Mo-Sys and Strictly Come Dancing >

AV and Broadcast – is there any difference?

AV Magazine explores the rapid cross-pollination of technology and workflows between the AV and broadcast industries, driven largely by the global pandemic. Traditionally, the workflows and technologies used for broadcast and AV were viewed as distinct markets. Recently, they have become part of the same environment.

“Over time, the gear has become a lot less expensive and it means a company like M&S can afford to buy the same kit as a local ITV channel”.

Mike Grieve, Mo-Sys Commercial Director
Read the full AV article here

Mo-Sys G30 gyro-stabilized head

The latest InBroadcast January issue featured an article on the launch of the ground-breaking Mo-Sys G30 gyro-stabilized head, which was announced earlier this month.

The unique design has been refined using extensive real-world experience, resulting in a stabilized head that has the performance of much more expensive systems, but with the usability and ease of setup of much simpler gimbal devices.

“In conversation with early adopters, the feedback was for simpler operation and faster setup. In response, we modified elements of the design, such as the frame size, but we also took the opportunity to suggest some smart technologies we were developing. These were well received and so were also implemented into the G30 design”.

Michael Geissler, CEO of Mo-Sys.
Click here to read the full InBroadcast article (Pg 42) >

Robotic Camera Systems

The Mo-Sys StarTracker Studio is an award-winning state-of-the-art virtual studio system which provides unlimited creative freedom to generate high-impact digital studios for every video type.

For this InBroadcast January edition, contributing editor David Kirk explores the features of our StarTracker Studio, which is a pre-configured and pre-calibrated, all-in-one, 4K studio that operates straight out of the box, enabling users to focus on the creative and storytelling aspects.

Using StarTracker Studio, virtual sets can be switched almost instantly and augmented to include remote guests or superimposed objects. Photo-realistic capabilities include occlusion handling which allows a presenter to walk in front of or behind a virtual object, and reflection mapping which intelligently adds reflections to adjacent objects.

David Kirk, InBroadcast
Read the full InBroadcast article here (Pg 30) >
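To illustrate the occlusion handling David Kirk describes, here is a deliberately simplified Python sketch. It is our toy model, not the StarTracker Studio implementation (a real compositor would typically work per pixel with depth information rather than per object): the tracked depth of the presenter is compared against each virtual object’s depth to decide the compositing order for that frame.

```python
def layer_order(presenter_depth_m: float, object_depth_m: float) -> list[str]:
    """Decide per-frame draw order for one virtual object, given the
    tracked distances of presenter and object from the camera."""
    if presenter_depth_m < object_depth_m:
        # Presenter is nearer the camera, so they cover the object.
        return ["virtual_background", "virtual_object", "presenter"]
    # Object is nearer the camera, so it covers the presenter.
    return ["virtual_background", "presenter", "virtual_object"]

# A presenter at 3m walks past a virtual desk at 4m and a pillar at 2.5m:
print(layer_order(3.0, 4.0))  # presenter in front of the desk
print(layer_order(3.0, 2.5))  # pillar occludes the presenter
```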

Virtual Production Takes a Leap Forward

In the latest issue of TVBEurope, Kevin Hilton talks to leading technology providers about how programme makers are using the new techniques available to them. Mo-Sys’ involvement in the BBC Sports Olympics studio was featured as an example of the use of robotic cameras, another component in virtual production.

Michael Geissler, founder and owner of Mo-Sys, also discusses how today, with 18 months of on-set, real-time LED/XR production experience under our collective belts, the broadcast and film industries are quickly realising the need for ‘fit for purpose’ LED/XR technology.

In designing VP Pro XR and launching its Cinematic XR initiative, Mo-Sys set out to address the cinematic quality and functionality requirements of LED/XR for on-set, real-time production. The result is a product that is specifically designed to address the current on-set real-time quality and functionality challenges, to provide a platform for unique features (e.g. Cinematic XR Focus), and to address known XR challenges, such as investment costs, expansion costs, and XR graphics delay.

– Michael Geissler, Mo-Sys Engineering
TVBEurope November/December 2021 issue

Read the full article here.