Virtual Production and the importance of having the right tools for the job

“For decades we have been shooting talent or products in green screen studios and adding (compositing) photo-realistic 3D graphics in post-production to finalize the shot. This is the foundation of a visual effects shot (VFX). It was and still is very much a two-stage process.” Michael Geissler, CEO of Mo-Sys, talks to InBroadcast about the evolution of Virtual Production.

CLICK HERE TO READ THE INBROADCAST ARTICLE PG 34.

Virtual production in most cases makes this a one-stage process by enabling the VFX shot to be captured on set, in camera, in real time. The virtual production process saves time and money and is far more efficient, but it requires a greater level of pre-production preparation, virtual production knowledge and experience, together with the right tools for the job.

The concept of Virtual Production is not new. Tracked virtual graphics arrived in the mid-1990s, when they were used predominantly for broadcast, albeit with limited tracking, no lens calibration, and typically poor virtual image quality.

Thankfully, the story doesn’t end there. In what Mo-Sys has named Virtual Production 1.0 (VP 1.0), around 2016 we saw the introduction of game engines capable of producing photo-realistic on-set pre-viz. The next logical step would have been real-time VFX in green/blue screen studios, but we jumped straight to VP 2.0, where the production of The Mandalorian illustrated the exciting new capabilities of real-time VFX using LED volumes.

Hype, coupled with a surge in demand from the locked-down live events industry, which cleverly pivoted into virtual production for survival, meant that many early LED VP stages used lower-quality live event LED tiles, live event-oriented media servers and mo-cap tracking systems, rather than cinematic LED tiles, dedicated cinematic content servers, and precision camera tracking systems such as Mo-Sys StarTracker.

Right now, virtual production is in its third stage of evolution: VP 3.0. It has matured over the last five years; in 2020 the global market was reported to be worth in excess of £2 billion, and it is expected to grow beyond £5 billion by 2028. Mo-Sys’ 25 years of experience in the broadcast and film industry, together with bleeding-edge innovation, is driving VP from what it was in early 2020 to what it needs to be in order to deliver on many of the benefits originally promised.

Within broadcast and film, there is now a greater understanding of the limitations of live event LED tiles and event media servers, which were developed for entertainment applications such as projection mapping and timeline lighting shows.

Cinematographers and VFX Supervisors are demanding a dedicated, focused toolset that meets their needs and enables a closer content quality match to traditional post-production compositing quality.

Mo-Sys VP Pro XR

Mo-Sys is delivering exactly that with its award-winning VP Pro XR. Designed by in-house cinematic innovators for XR creatives, Mo-Sys VP Pro XR is a dedicated XR server solution that takes a radical new approach to delivering cinematic standards to on-set, real-time virtual production using LED volumes. Designed for LED stages with or without set extensions, it can also be used with blue/green screens and enables traditional shooting techniques within an LED volume, with a focus on composite image quality.

Typically, directors and cinematographers must make continuous calculations to make the real and virtual elements match in post-production, which is often a costly and repetitive process. A pioneer in providing a platform for graphics rendering as well as camera and lens tracking to create higher-quality virtual productions, Mo-Sys has made it even easier to produce seamless, high-end productions with VP Pro XR. Designed to enable the use of traditional shooting techniques within virtual productions, it removes the storytelling limitations imposed by current XR stage designs.

Mo-Sys Multi-Cam Switching

Another major hurdle in VP and the use of LED volumes comes when switching between multiple cameras pointing at the same LED volume. Whilst the camera outputs can be switched in one frame, the virtual scene displayed on the LED volume typically takes 5-6 frames to update. This means that on every camera switch there will be 5-6 frames of the previous camera’s virtual scene displayed on the LED volume before it updates with the correct perspective view for the new camera. As a result, you get a background flash on every camera switch, which is unusable in production.

The latest iteration of VP Pro XR addresses this by orchestrating the delay between the operator switching cameras and the LED volume updating with the correct perspective view of the live camera. More importantly, it does this at up to the full UHD 4K resolution of the LED processor input, whereas previous workarounds would reduce the resolution of the content to HD to achieve the same outcome.
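As a rough illustration of the principle, here is a minimal sketch in which the content server re-points the LED volume at the incoming camera first and then holds the programme cut until the known LED pipeline latency has elapsed. The class names, latency value and frame rate are assumptions for the example, not the VP Pro XR API.

```python
# Hypothetical sketch of delay-orchestrated multi-cam switching.
# Names, latency and frame rate are illustrative assumptions, not Mo-Sys code.

import time

LATENCY_FRAMES = 6          # assumed LED pipeline latency (render + processor)
FRAME_INTERVAL = 1 / 25.0   # assumed 25 fps production frame rate


class LedWall:
    """Stand-in for the LED volume renderer fed by the content server."""
    def set_active_camera(self, camera_id: str) -> None:
        print(f"LED wall now rendering the perspective of {camera_id}")


class VideoSwitcher:
    """Stand-in for the vision mixer that cuts the camera outputs."""
    def cut_to(self, camera_id: str) -> None:
        print(f"Programme output cut to {camera_id}")


def orchestrated_switch(wall: LedWall, switcher: VideoSwitcher, camera_id: str) -> None:
    # 1. Re-point the virtual scene at the new camera first ...
    wall.set_active_camera(camera_id)
    # 2. ... then hold the cut until the wall has actually updated,
    #    so the new camera never sees the old camera's perspective.
    time.sleep(LATENCY_FRAMES * FRAME_INTERVAL)
    switcher.cut_to(camera_id)


if __name__ == "__main__":
    orchestrated_switch(LedWall(), VideoSwitcher(), "Camera B")
```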

Mo-Sys Cinematic XR Focus

Along with full resolution multi-cam switching, Mo-Sys Cinematic XR Focus enables seamless interaction between virtual and real worlds. This feature ensures that an LED wall can be used as more than just a backdrop, allowing it to integrate with the real stage. This gives cinematographers the means to seamlessly rack focus deep into the virtual world and create immersive shots which enhance their visual storytelling.

NearTime® is Mo-Sys’ HPA Engineering Excellence Award-winning solution to the image-quality challenge of real-time VFX virtual production. It comprises a cloud-based auto-re-rendering system combined with Smart Green, a method of separating talent from the LED background without introducing copious green spill; together they deliver higher-quality real-time VFX content.

Read the full article here (Pg 34) >

One Giant Leap

In the latest issue of the DEFINITION magazine, Chelsea Fearnley discusses how SMPTE is driving the future of virtual production by bringing together the best minds in the industry to explore the most exciting advancements in media and entertainment tech.

CLICK HERE TO READ THE FULL DEFINITION ARTICLE PG 25.

The hybrid three-day event showcasing in-camera VFX and virtual production, hosted by SMPTE and sponsored by Mo-Sys, gave viewers an interactive experience of the world of the virtual set.

A pop-up LED volume was constructed by the team behind Virtual Production Studios at 80six, and comprised cutting-edge products from 80six’s inventory, including two Roe Visual Diamond LED screens with processing by Brompton Technology, Unreal Engine, the Mo-Sys StarTracker system, Disguise’s VX content servers and an ETC lighting control console. Together, they enabled a smooth virtual production workflow.

Chelsea Fearnley, DEFINITION MAGAZINE.

Read the full article here (Pgs 25-27) >

Rob Fowler, director of business development at Brompton Technology, spoke to DEFINITION magazine about the success of the SMPTE On-set Virtual Production event in a follow-up article on pgs 28-29 of the same edition.

We already support Mo-Sys, which provided the tracking system at the SMPTE event. This means that instead of using physical markers (usually dots on floors or ceilings) to locate the tracking camera, we can embed those markers into content on the walls to be visible to the tracking camera, but invisible to the cinema camera.

Rob Fowler, Director of Business Development at Brompton Technology

Find out more about Brompton’s new digital tracking markers for LED volumes here >
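One way such hidden markers can work in principle is temporal multiplexing: the LED wall refreshes several times per cinema-camera frame, and the marker patterns are shown only in refresh slots that fall outside the cinema camera’s shutter window, where a separately synchronised tracking camera can still see them. The sketch below is purely conceptual and does not describe Brompton’s actual implementation; the slot count and shutter model are assumptions for illustration.

```python
# Conceptual-only sketch of temporal multiplexing for hidden tracking markers.
# Slot counts and the shutter model are assumptions, not Brompton's design.

REFRESHES_PER_CAMERA_FRAME = 4   # assumed LED refresh slots per cinema-camera frame
SHUTTER_OPEN_SLOTS = {0, 1}      # assumed slots exposed by a 180-degree shutter


def frame_for_slot(slot: int, scene_frame: str, marker_frame: str) -> str:
    """Return what the LED wall shows in a given refresh slot."""
    if slot in SHUTTER_OPEN_SLOTS:
        # Cinema camera is exposing: show only the clean scene.
        return scene_frame
    # Cinema camera shutter is closed: the tracking camera can read markers here.
    return marker_frame


if __name__ == "__main__":
    for slot in range(REFRESHES_PER_CAMERA_FRAME):
        print(slot, frame_for_slot(slot, "clean scene", "scene + tracking markers"))
```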

VP Pro XR making waves in VP world

Mo-Sys’ VP Pro XR system has been turning heads since it was launched, snapping up awards and earning endorsements from some well-known names in the industry.

Pocket Films met up with Mike Grieve, Mo-Sys Commercial Director, at the SMPTE UK visits… an On-set Virtual Production event to hear more about the technology’s success to date.

Mike Grieve being filmed at Garden Studios

Tell us a little bit about VP Pro XR and the challenges that this technology is designed to meet.

VP Pro XR, which won a Production Hub Award this year and picked up the NAB Product of the Year 2021 accolade, supports XR production in LED volumes for real-time VFX film and TV production. We designed the technology with the aim of addressing the restrictions and compromises that producers currently encounter with XR stages.

Although XR volumes deliver cost savings by minimizing post-production compositing and lowering location costs, they have also introduced shooting limitations, and an output image that some find is not yet comparable to non-real-time compositing. This is the challenge that motivated the development of VP Pro XR.

VP Pro XR is an LED content server solution, purpose-designed to offer Cinematic XR capability, with a focus on composite image quality whilst offering unique cinematic features.

VP Pro XR is the first product release under Mo-Sys’ Cinematic XR initiative. Can you tell us more about this?

The aim of our Cinematic XR initiative is to move final pixel XR production forwards, in terms of image quality and shooting creativity, from its initial roots using live event LED equipment to purpose-built Cinematic XR.

Cinematic XR is focused on improving image fidelity, and introducing established cinematic shooting techniques to XR. It promotes the interaction between virtual and real set elements, and the development of new hybrid workflows combining virtual and non-real-time compositing. These capabilities are being designed into the VP Pro XR platform.

First among these capabilities is Cinematic XR Focus, which has been recognised in the Cine Gear 2021 Technical Awards for its innovative technology.

Traditionally, the LED wall has been a passive part of the wider set, but now for the first time, we are enabling an LED wall to be used as more than just a backdrop. Instead, cinematographers can integrate LED wall visuals with the real stage and seamlessly rack focus deep into the virtual world to create layered images that enhance their visual storytelling.

Cinematic XR Focus synchronises a physical focus pull with a virtual focus pull, managing the transition at the LED wall so that the complete focus plane movement is seamless. It requires no additional action on the part of the Focus Puller, and enables them to use a standard wireless lens controller – currently Preston, but with other options coming soon.
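The underlying idea can be pictured as a split of the focus distance at the wall plane: while the focus target sits in the real set, the physical lens does all the work; once it passes the camera-to-wall distance, the lens can hold at the wall while the remaining defocus is applied virtually in the rendered scene. The sketch below is a simplified illustration under those assumptions, not Mo-Sys’ implementation.

```python
# Minimal sketch of splitting a focus pull between the physical lens and the
# virtual scene at the LED wall plane. Purely illustrative; not Mo-Sys' code.

def split_focus(target_distance_m: float, wall_distance_m: float):
    """Return (physical_lens_focus_m, virtual_focus_beyond_wall_m)."""
    if target_distance_m <= wall_distance_m:
        # Focus target is in the real set: the lens does all the work.
        return target_distance_m, 0.0
    # Focus target is "inside" the virtual world: park the lens at the wall
    # and ask the render engine to defocus the remaining distance.
    return wall_distance_m, target_distance_m - wall_distance_m


if __name__ == "__main__":
    # Rack focus from 2 m (real actor) to 30 m (virtual landmark), wall at 6 m.
    for target in (2.0, 6.0, 30.0):
        print(target, split_focus(target, wall_distance_m=6.0))
```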

NearTime is another pillar of Cinematic XR and was awarded a 2021 HPA Award for Engineering Excellence by the Hollywood Professional Association (HPA). This feature solves one of the key challenges of LED ICVFX shoots: balancing Unreal image quality against the need to maintain real-time frame rates. Currently, every Unreal scene created for ICVFX has to be reduced in quality in order to guarantee real-time playback. By combining NearTime with an LED ‘smart green’ frustum, the same Unreal scene can be automatically re-rendered at higher quality or resolution, and this version can replace the original background Unreal scene.

NearTime operates in parallel to an ICVFX shoot and starts at the same time. It finishes after the real-time shoot but still within the real-time delivery window, meaning that a higher-quality result can be delivered in the same time, with minimal increase in cost. No traditional post-production costs or time are required, and moiré issues are completely avoided.
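A highly simplified outline of this kind of hybrid workflow is sketched below; the data structure, function names and steps are assumptions for illustration rather than the NearTime pipeline itself.

```python
# Illustrative outline of a NearTime-style hybrid workflow; the steps and
# names are assumptions, not Mo-Sys' actual pipeline.

from dataclasses import dataclass


@dataclass
class Take:
    camera_plate: str    # recorded camera output (talent against the smart-green frustum)
    tracking_data: str   # recorded camera and lens tracking for the take
    unreal_scene: str    # the Unreal scene used live on the LED wall


def neartime_rerender(take: Take) -> str:
    """Re-render the background in the cloud at higher quality, driven by the
    same tracking data that drove the real-time render."""
    return f"high-quality render of {take.unreal_scene} driven by {take.tracking_data}"


def composite(take: Take, background: str) -> str:
    """Key the talent off the smart-green frustum and composite them over the
    re-rendered background, replacing the real-time LED imagery."""
    return f"{take.camera_plate} keyed over {background}"


if __name__ == "__main__":
    take = Take("plate_001.mov", "tracking_001.fbx", "desert_scene")
    print(composite(take, neartime_rerender(take)))
```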

How would you describe the industry’s reaction to VP Pro XR?

We have had a phenomenal response from customers and partners to VP Pro XR. ARRI Rental UK was the first to install and use this innovative technology, along with our StarTracker system, at its new mixed reality studio in Uxbridge, west of London. This is now one of the largest permanent LED volume studios in Europe, and we are so proud that Mo-Sys technology lies at the heart of it. Our technology is designed specifically for real-time final pixel shooting, to deliver cinematic-quality imagery with clever, unique features and technology geared to cinematographers.

VP Pro XR has also been deployed by London’s Garden Studios for its state-of-the-art virtual production (VP) stage installed in March 2021. Garden Studios, which already has a Mo-Sys StarTracker system and uses the VP Pro software for its main LED volume, now benefits from set extension (XR) capability, offering clients additional creative capability. The facility recently added a second StarTracker camera and lens tracking system to support its ongoing R&D and expansion.

We have also picked up a number of prestigious industry awards for VP Pro XR as well as for new features like Cinematic XR Focus and we could not be prouder. The VP sector is in an extremely exciting stage in its evolution, and we are thrilled to be at the heart of this new direction.

This article was first published on The Studio Map website.

Cinematic broadcasting

In the latest DEFINITION issue, Phil Rhodes explores the crossover between TV and film caused by the current push for more capability from broadcast cameras. The combination of the Mo-Sys StarTracker and Unreal Engine plug-in was used on Strictly Come Dancing to create AR elements for a studio emptied of its audience due to the pandemic.

“Covid-19 accelerated implementation of new broadcast technology, with Strictly Come Dancing feeling the Mo-Sys magic”.

Phil Rhodes, DEFINITION magazine.
Click here to read the full DEFINITION article pg 49.

Read more about Mo-Sys and Strictly Come Dancing >

AV and Broadcast – is there any difference?

AV Magazine explores the rapid cross-pollination of tech and workflows between the AV and broadcast industries, driven mainly by the global pandemic. Traditionally, broadcast and AV were viewed as distinct markets, each with its own workflows and technologies. Recently, these elements have become part of the same environment.

“Over time, the gear has become a lot less expensive and it means a company like M&S can afford to buy the same kit as a local ITV channel”.

Mike Grieve, Mo-Sys Commercial Director
Read the full AV article here


Mo-Sys G30 gyro-stabilized head

The latest InBroadcast January issue featured an article on the launch of the ground-breaking Mo-Sys G30 gyro-stabilized head, which was announced earlier this month.

The unique design has been refined using extensive real-world experience, resulting in a stabilized head that has the performance of much more expensive systems, but with the usability and ease of setup of much simpler gimbal devices.

“In conversation with early adopters, the feedback was for simpler operation and faster setup. In response, we modified elements of the design, such as the frame size, but we also took the opportunity to suggest some smart technologies we were developing. These were well received and so were also implemented into the G30 design”.

Michael Geissler, CEO of Mo-Sys.
Click here to read the full InBroadcast article, pg 42.

Robotic Camera Systems

The Mo-Sys StarTracker Studio is an award-winning state-of-the-art virtual studio system which provides unlimited creative freedom to generate high-impact digital studios for every video type.

For this InBroadcast January edition, contributing editor David Kirk explores the features of our StarTracker Studio, a pre-configured and pre-calibrated, all-in-one 4K studio that operates straight out of the box, enabling users to focus on the creative and storytelling aspects.

Using StarTracker Studio, virtual sets can be switched almost instantly and augmented to include remote guests or superimposed objects. Photo-realistic capabilities include occlusion handling, which allows a presenter to walk in front of or behind a virtual object, and reflection mapping, which intelligently adds reflections to adjacent objects.

David Kirk, InBroadcast
Read the full InBroadcast article here, pg 30.
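The occlusion handling described above can be pictured as a depth-ordered composite: the tracked depth of the presenter is compared with the depth of each virtual object to decide the layer order. The toy sketch below illustrates that idea only; the layer model is an assumption, not StarTracker Studio’s implementation.

```python
# Toy illustration of depth-based occlusion in a virtual studio composite.
# The layer model is an assumption for the example, not StarTracker Studio code.

def composite_order(presenter_depth_m: float, object_depth_m: float) -> list:
    """Return the back-to-front layer order for the final composite."""
    layers = ["virtual background"]
    if presenter_depth_m > object_depth_m:
        # Presenter is further from camera than the virtual object,
        # so draw the presenter first and let the object occlude them.
        layers += ["presenter (keyed)", "virtual object"]
    else:
        layers += ["virtual object", "presenter (keyed)"]
    return layers


if __name__ == "__main__":
    print(composite_order(presenter_depth_m=3.0, object_depth_m=5.0))  # presenter in front
    print(composite_order(presenter_depth_m=6.0, object_depth_m=5.0))  # presenter behind
```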

Virtual Production Takes a Leap Forward

In the latest issue of TVBEurope, Kevin Hilton talks to leading technology providers about how programme makers are using the new techniques available to them. Mo-Sys’ involvement in the BBC Sports Olympics studio was featured as an example of the use of robotic cameras, another component in virtual production.

Michael Geissler, founder and owner of Mo-Sys, also discusses how today, with 18 months of on-set, real-time LED/XR production experience under our collective belts, the broadcast and film industries are quickly realising the need for ‘fit for purpose’ LED/XR technology.

In designing VP Pro XR and launching its Cinematic XR initiative, Mo-Sys set out to address the cinematic quality and functionality requirements of LED/XR for on-set, real-time production. The result is a product that is specifically designed to address the current on-set real-time quality and functionality challenges, provide a platform for unique features (e.g. Cinematic XR Focus), and address known XR challenges, such as investment costs, expansion costs, and XR graphics delay.

– Michael Geissler, Mo-Sys Engineering
TVBEurope November/December 2021 issue

Read the full article here.

Robotic Camera Systems

For the InBroadcast November edition, Contributing Editor David Kirk gives us an update on some of the developments over the last year within the world of robotic camera systems.

The article features Mo-Sys StarTracker Studio, the world’s first pre-configured complete virtual studio system.  

Mo-Sys StarTracker Studio is specifically designed to provide a complete switched-camera system for users who are new to virtual production or who lack the skills to create a VP studio solution from scratch.

The virtual production software used in StarTracker Studio is Mo-Sys’ VP Pro, which offers the same capabilities as traditional virtual studio software but, rather than being a layer on top of Unreal Engine, is embedded directly into the Unreal Engine editor interface.

Click here to read the full InBroadcast article, pg 16-18.

Virtual Sets Come of Age

For the InBroadcast November edition, Adrian Pennington explores how the virtual production phenomenon goes from strength to strength with new tools, greater photorealism, and lower prices driving demand.

Currently the industry is using live events LED technology and needs to transition to fit-for-purpose cinematic LED technology. Mo-Sys’ Cinematic XR initiative is aimed at driving change in LED wall technology, output image quality, the re-introduction of established shooting techniques, and smart workflows.

This is the first product on the market that changes an LED volume from being just an expensive backdrop into something that can be integrated into the storytelling. For the first time, it enables real interaction between the talent and objects in the virtual scene positioned behind the LED wall.

Read the full InBroadcast article here.