PLAZAMEDIA selects Mo-Sys as its primary XR solutions provider

Mo-Sys Engineering today announces that it has extended its long-standing relationship with PLAZAMEDIA GmbH, a subsidiary of Sport1 Medien AG, an established content solutions provider for the entire spectrum of media platforms, and one of the leading producers of sports TV for German-speaking audiences. PLAZAMEDIA, which aims to raise the bar for Virtual Production with its new LED studio, has chosen Mo-Sys as its primary XR solutions provider for the implementation of its LED initiatives.

At the heart of PLAZAMEDIA’s decision is the unmatched capability of Mo-Sys’ technology, with features such as multi-camera switching, Cinematic XR Focus and its latest innovation, the NearTime® on-set re-rendering workflow for ICVFX (in-camera visual effects).

Jens Friedrichs, Chairman of the Management Board of PLAZAMEDIA GmbH, commented: “Mo-Sys delivers all the performance we need to create a leading-edge LED volume studio. They understand the importance of delivering cinematic quality from end to end, even for broadcast applications and productions for corporate clients – especially with regard to our clear focus on sustainable green production. This, combined with their innovative toolset and the collaborative approach of the Mo-Sys team, made them the unbeatable choice of partner for us.”

The flagship Mo-Sys StarTracker precision camera/lens tracking system is now the technology of choice for leading-edge Virtual Productions. The advanced tools in Mo-Sys’ VP Pro XR content server include set extensions, color grading, and the ability to pull focus seamlessly between real and virtual objects, made possible by the unique Cinematic XR Focus feature.

NearTime® is Mo-Sys’ patent-pending, award-winning solution to the image-quality problem of real-time VFX Virtual Production. It comprises a cloud-based auto-re-rendering system that uses Smart Green, a method of separating talent from the LED background without introducing copious green spill, to deliver higher quality real-time VFX content. NearTime® also removes Moiré patterning completely and enables the use of lower-cost LED panels to deliver an image quality far closer to post-production compositing.

“As pioneers in Virtual Production technology, we are driven by our passion to create tools that help our customers create immersive and engaging content without limiting their creativity,” said Philippe Vignal, Mo-Sys Director of Sales and Business Development, EMEA & APAC. “We are extremely proud that an innovator like PLAZAMEDIA has chosen to place Mo-Sys technology at the heart of its new Virtual Production capabilities.”

Mo-Sys and Ideal Systems Group partner for pan-Asian excellence

Mo-Sys Engineering is working with Ideal Systems to drive forward the adoption of virtual and augmented reality production in broadcast across Asia and the Middle East.

Ideal Systems has been a leader in media solutions for more than 30 years. From its head office in Hong Kong, it brings a wealth of experience together with a huge presence in 10 countries and a proven reach across the whole region. It provides consultancy, design, integration, installation and continuing support for many major names in broadcasting, media production and technology.

Ideal Systems

Mo-Sys has driven forward the emerging technologies of virtual and augmented production for 25 years. A pioneer in camera tracking with its unrivalled StarTracker system, and in camera robotics, the company can now deliver turnkey systems through its VP Pro graphics systems. Mo-Sys has been instrumental in the extensive use of real-time augmented reality and virtual studios in broadcast, and in the growing use of LED walls and LED volumes for movie production and for live events.

“The production of outstanding content is a global business,” said Philippe Vignal, director of sales and business development for Mo-Sys. “It is vital for us that we work with the best possible sales partners, to ensure our ground-breaking innovations are seen by all the key players. Ideal Systems is the perfect partner for us, as they have broad reach across broadcast throughout EMEA and Asia.”

Jim Butler, CEO of Ideal Systems, added: “Mo-Sys is a very impressive company. Its camera tracking and virtual production technologies are recognised by the biggest names out there, and they have very interesting cutting-edge developments in image robotics and remote production. The whole team is excited to be working with Mo-Sys and to be able to offer our customers revolutionary new solutions.”

Broadcasters and content producers can visit Mo-Sys at CABSAT (17 – 19 May, Dubai World Trade Centre, stand E6-12B). Mo-Sys will have a rich demonstration of its latest VP technologies, featuring StarTracker, its high precision optical camera tracking system, and VP Pro XR, its cinematic content server solution for LED production, along with the Mo-Sys VP Pro integrated augmented reality production system on the Ideal Systems stand A5-1.

Virtual Production and the importance of having the right tools for the job

“For decades we have been shooting talent or products in green screen studios and adding (compositing) photo-realistic 3D graphics in post-production to finalize the shot. This is the foundation of a visual effects shot (VFX). It was and still is very much a two-stage process.” Michael Geissler, CEO of Mo-Sys, talks to InBroadcast about the evolution of Virtual Production.

Mo-Sys InBroadcast

Virtual production in most cases makes this a one-stage process, by enabling the VFX shot to be captured on set, in camera, in real time. The virtual production process saves time and money and is much more efficient, but it requires a greater level of pre-production preparation, virtual production knowledge and experience, together with the right tools for the job.

The concept of Virtual Production is not new. Tracked virtual graphics arrived in the mid-1990s, when they were used predominantly for broadcast, albeit with limited tracking, no lens calibration and typically poor virtual image quality.

Thankfully, the story doesn’t end there. In what Mo-Sys has named Virtual Production 1.0 (VP 1.0), circa 2016, we saw the introduction of game engines capable of producing photo-realistic on-set pre-viz. The next logical step would have been real-time VFX in green/blue screen studios, but the industry jumped straight to VP 2.0, where the production of The Mandalorian illustrated the exciting new capabilities of real-time VFX using LED volumes.

Hype, coupled with a surge in demand from the locked-down live events industry, which cleverly pivoted into virtual production for survival, meant that many early LED VP stages used lower quality live event LED tiles, live event-oriented media servers and mo-cap tracking systems, rather than cinematic LED tiles, dedicated cinematic content servers, and precision camera tracking systems such as Mo-Sys StarTracker.

Right now, virtual production is in its third stage of evolution – VP 3.0. It has matured over the last five years: in 2020 the global market was reported to be worth in excess of £2 billion, and it is expected to grow beyond £5 billion by 2028. Mo-Sys’ experience gained from 25 years within the broadcast and film industry, together with bleeding-edge innovation, is driving VP from what it was in early 2020 to what it needs to be in order to deliver on many of the benefits originally promised.

Within broadcast and film, there is now a greater understanding of the limitations of live event LED tiles and events media servers, which were developed for entertainment applications such as projection mapping and timeline lighting shows.

Cinematographers and VFX Supervisors are demanding a dedicated, focused toolset that meets their needs and enables a closer content quality match to traditional post-production compositing quality.

Mo-Sys VP Pro XR

Mo-Sys is delivering exactly that with its award-winning VP Pro XR. Designed by in-house cinematic innovators for XR creatives, VP Pro XR is a dedicated XR server solution that takes a radical new approach to delivering cinematic standards to on-set, real-time virtual production using LED volumes. Designed for LED stages with or without set extensions, it can also be used with blue/green screens and enables traditional shooting techniques within an LED volume, with a focus on composite image quality.

Typically, directors and cinematographers must make continuous calculations to make the real and virtual elements match in post-production. This is often a costly and repetitive process. A pioneer in providing a platform for graphics rendering as well as camera and lens tracking to create higher quality virtual productions, Mo-Sys has made it even easier to produce seamless, high-end productions with VP Pro XR. Designed to enable the use of traditional shooting techniques within virtual productions, it removes the limitations on the ability to tell stories that are imposed by current XR stage designs.

Mo-Sys Multi-Cam Switching

Another major hurdle in VP and the use of LED volumes comes when switching between multiple cameras pointing at the same volume. Whilst the camera outputs can be switched in one frame, the virtual scene displayed on the LED volume typically takes 5-6 frames to update. This means that on every camera switch there will be 5-6 frames of the previous camera’s virtual scene displayed on the LED volume before it updates with the correct perspective view for the new camera. As a result, you get a background flash on every camera switch, which is unusable in production.

The latest iteration of VP Pro XR addresses this and orchestrates the delay between the operator switching cameras, and the LED volume updating with the correct perspective view of the live camera. More importantly, it will do this at up to the full UHD4K resolution of the LED processor input, whereas previous workarounds would reduce the resolution of the content to HD to achieve the same outcome.
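The orchestration described above can be sketched as a small simulation. This is an illustrative sketch only, not Mo-Sys code: the class and method names are hypothetical, and the 6-frame delay is an assumed value based on the typical 5-6 frame refresh mentioned above.

```python
LED_DELAY_FRAMES = 6  # assumed: frames the LED volume needs to show a new perspective


class SwitchOrchestrator:
    """Hypothetical sketch of orchestrated multi-camera switching."""

    def __init__(self, led_delay=LED_DELAY_FRAMES):
        self.led_delay = led_delay
        self.live_camera = None      # camera currently cut to air
        self.led_perspective = None  # camera whose perspective the wall renders
        self.pending = None          # [camera, frames_until_cut]

    def request_cut(self, camera):
        """Operator requests a cut: re-render the LED wall for the new
        camera first, then hold the video switch until it has updated."""
        self.led_perspective = camera          # wall starts updating immediately
        self.pending = [camera, self.led_delay]

    def tick(self):
        """Advance one video frame; perform the cut only once the wall
        is guaranteed to display the new camera's perspective."""
        if self.pending:
            self.pending[1] -= 1
            if self.pending[1] <= 0:
                self.live_camera = self.pending[0]  # clean cut, no flash
                self.pending = None
```

The design choice is simply to reorder the two events: the wall begins re-rendering for the new camera immediately, and the video cut is delayed until the wall has caught up, so the stale frames never appear in shot.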

Mo-Sys Cinematic XR Focus

Along with full resolution multi-cam switching, Mo-Sys Cinematic XR Focus enables seamless interaction between virtual and real worlds. This feature ensures that an LED wall can be used as more than just a backdrop, allowing it to integrate with the real stage. This gives cinematographers the means to seamlessly rack focus deep into the virtual world and create immersive shots which enhance their visual storytelling.

NearTime® is Mo-Sys’ HPA Engineering Award-winning solution to the image-quality problem of real-time VFX virtual production. It comprises a cloud-based auto-re-rendering system that uses Smart Green, a method of separating talent from the LED background without introducing copious green spill; combined, they deliver higher quality real-time VFX content.

Read the full article here (Pg 34) >

NAB Show Daily – Advances in Production

Over the last 18 months, the boom in virtual production has shaken up the way filmmakers, broadcasters, mobile video designers and more conduct their creative business. Those technologies enabling virtual production include augmented reality (AR), virtual reality (VR), extended reality (XR) and mixed reality (MR). Susan Ashworth of TV Tech reports for the NAB Show Daily Day1 edition.

Green Screen Comes to the Lecture Hall

University of the Netherlands

Virtual reality is enlivening the lecture hall. The University of the Netherlands worked with Zero Density to create a massive virtual lecture hall using the company’s Reality Engine real-time broadcast compositing system.

Using the virtual space, scientists and professors have access to interactive storytelling techniques and advanced visualization methods — complete with real-time realistic reflections and refractions of physical objects and people inside the green screen. Professors are able to explain complex matters on a large virtual screen or run physical experiments with immersive real-time graphics.

The space employs Grass Valley LDX 86N 4K cameras alongside Mo-Sys camera tracking technology while the green screen lecture space itself runs on two Reality Engines.

London Welcomes Virtual Production

Garden Studios, London

When Garden Studios in London finalized its virtual production stage in 2021, the result was a 4,800-square-foot space that could serve as a cost-effective filming option with unique creative opportunities. The new virtual production studio at Garden Studios allows filmmakers to shoot real-time virtual effects on set by using virtual graphics displayed on an LED volume (an enclosed space where motion capture and compositing can take place) to create photo-realistic backdrops.

Among the technologies in place are the VP Pro XR server and StarTracker camera and lens tracking systems from Mo-Sys Engineering. Features within VP Pro XR include an Unreal Engine editor interface and a feature known as Cinematic XR Focus, which allows a filmmaker to pull focus between talent or objects in the physical studio and virtual objects positioned in the LED volume.

According to the company, this means that an LED volume can be used as more than just a backdrop, delivering better interactivity between real and virtual elements. Jillian Sanders, virtual production coordinator for Garden Studios, said the Mo-Sys team came to the studio for a multitude of tests, including displaying digital tracking markers on the studio’s LED ceiling.

“We’ve also been able to assist with testing and development of their new VP Pro XR,” she said. “This exciting new tool allows for features such as digital set extensions, the ability to focus past the LED wall into the digital world, and near time rendering.”

Read the full article here >

Mo-Sys joins AOTO at NAB to showcase Broadcast LED VP Innovation

Mo-Sys Engineering will join forces with AOTO Electronics Co. Ltd., a specialist in LED application products, to showcase LED Virtual Production solutions aimed at broadcast news and factual programming at the NAB Show 2022 on the AOTO booth C3331.  

AOTO LED volume in action

Mo-Sys and partners are demonstrating:

  • XR set extensions and AR (Augmented Reality) for a virtual world beyond the boundary of the LED wall
  • Multi-Cam XR for live broadcast with clean, full-resolution camera/frustum switching
  • Data-driven graphics and MOS integration in native Unreal Engine

In a live LED Virtual Production demonstration, Mo-Sys will showcase its LED content server VP Pro XR and its precision camera tracking system StarTracker, working with AOTO’s 2.3 pitch LED tiles. The demonstration will show how set extensions can be used to expand the studio space, and how the newly released Multi-Cam Switching feature can be used to seamlessly switch between live cameras at resolutions up to UHD4K, and without the LED wall delay appearing in shot. 


“Creating outstanding content is how broadcasters can differentiate themselves in today’s fiercely competitive marketplace,” said Mo-Sys CEO Michael Geissler. “Bringing cinematic quality Virtual Production techniques into the broadcast environment gives customers the ability to innovate in a highly cost-efficient way and our collaboration with AOTO provides a clear demonstration of how our pioneering technology innovation and their unmatched cinematic quality LED tiles can deliver incredible results for broadcasters.”   

As part of the demonstration, and in a ground-breaking move, Mo-Sys will also show its integration with Erizos Studio, enabling Unreal Engine graphics to be used not just for virtual studios but also for traditional on-screen graphics, such as lower thirds and data-driven graphics, either embedded in the Unreal scene or as AR objects. Erizos Studio provides the complete broadcast workflow, including industry-standard newsroom computer system (NRCS) MOS integration. This development means that broadcasters can now use a single graphics platform to deliver all their graphics, and can use a high-quality LED wall background instead of a green screen to eliminate green spill issues completely.

Michael Huang, Senior Account Manager, AOTO Electronics Co. Ltd., commented: “Mo-Sys’ Virtual Production innovation is well known, but this technology is ground-breaking, enabling LED Virtual Production to be used across broadcast whilst still complying with tried and trusted workflows.”

Mo-Sys and VFX World partner to offer virtual production solutions

Mo-Sys Engineering has formed a powerful new partnership with VFX World, a leader in on-set VFX solutions for cinematic productions.

LED Volume from VFX World

Mo-Sys brings to the partnership a team of pioneering innovators with over 25 years’ experience of developing virtual production technology, who today are at the forefront of LED volume innovation. VFX World brings extensive on-set VFX experience from major cinematic productions, blending traditional green/blue screen know-how with the latest in LED volume innovation. Together the two companies present a powerful combination of expertise and knowledge.

At the BSC Expo (7 – 9 April, Battersea Evolution London, stand 158) on the VFX World stand, both companies will show the core components of LED virtual production, and will detail new developments specifically aimed at cinematic LED virtual production.

The stand will feature a ROE Diamond 2.6 LED wall, Brompton SX40 processor, with Mo-Sys providing camera tracking using its widely recognised StarTracker system, and real-time virtual graphics using the Mo-Sys VP Pro XR LED content server.

“Combining experienced VFX on-set crews with products from the leading innovator in virtual production technology represents a powerful resource for productions shooting VFX-heavy content,” said Jem Morton, Director of VFX World.

“We are very excited to be partnering with movie specialist VFX World,” said Michael Geissler, CEO of Mo-Sys. “Together we are far ahead of other offerings in this field, with unique features offering Cinematographers greater creative freedom and improved virtual production imagery.”

The first generation of LED virtual production solutions utilized primarily live events technology, because that was what was available at the time. Mo-Sys’ VP Pro XR LED content server and StarTracker camera tracking system, along with higher quality LED tiles and LED processors, represent a technically superior approach to cinematic LED virtual production, providing increased fidelity and smarter workflows.

At BSC Expo, VFX World will be detailing two Mo-Sys patented technologies. The first is Cinematic XR Focus, a method of pulling focus from talent to virtual objects positioned behind the plane of the LED volume. The second is a unique solution for increasing the composite image quality from a real-time VFX production.

As well as the joint presentation on stand 158, Mo-Sys will also have its own stand, E15, where it will focus on its new G30 heavy-duty gyro-stabilized remote head. Engineered for high quality broadcast and movie work, it features an ultra-stiff frame and oversized high torque motors for precision movement and superior image stabilization.

For virtual production applications the G30 also includes built-in positional encoders, making it fast to set up, intuitive to operate and precise in tracking and stabilization.

Register for BSC Expo 2022 (7-9 April 2022) here >

Mo-Sys Innovation Solves Multi-Camera Switching For LED Volumes at NAB Show 2022

Mo-Sys Engineering today announced that its award-winning cinematic content server for LED volumes, VP Pro XR, has been further improved to incorporate seamless multi-camera switching. The upgrade fully orchestrates multi-camera switching to overcome the typical 5-6 frame refresh delay experienced by all LED volumes when switching between cameras.

Mo-Sys VP Pro XR Multi-Cam
New VP Pro XR multi-camera switching feature to overcome the typical 5-6 frame refresh delay in LED volumes.

The company introduced VP Pro XR, the industry’s first cinematic LED content server solution, ahead of NAB 2021, where it picked up the prestigious Product of the Year award. Designed specifically for use with LED volumes, with or without set extensions, VP Pro XR delivers cinematic standards for LED virtual production.

“Switching between multiple cameras pointing at an LED volume presents a challenge,” explained Michael Geissler, CEO of Mo-Sys. “While the camera outputs can be switched in one frame, the virtual scene displayed on the LED volume typically takes 5-6 frames to update. This means on every camera switch there will be 5-6 frames of the previous camera’s virtual scene displayed on the LED volume before it updates with the correct perspective view for the new camera. Effectively you get a background flash on every camera switch, which is unusable in production.”

With a new version of the VP Pro XR software, and the addition of a simple Blackmagic Design video switcher, VP Pro XR now orchestrates the delay between the operator switching cameras and the LED volume updating with the correct perspective view of the live camera. Importantly, it does this at up to the full UHD4K resolution of the LED processor input, whereas previous workarounds reduced the content to HD resolution to achieve the same outcome.

“Full resolution multi-cam switching is yet another unique feature of VP Pro XR joining Cinematic XR Focus for pulling focus between real and virtual elements, and NearTime for solving real-time VFX graphics quality,” concluded Geissler. “This follows on from the Cinematic XR objectives originally outlined when we launched our LED content server system last year.”

Mo-Sys Multi-Cam switching for VP Pro XR is available now.

For more information contact the Mo-Sys sales team on Tel: +44 208 858 3205 (EMEA & APAC) or +1 424 374 4011 (The Americas) or email

See the Mo-Sys Multi-Cam in action in this video: