Mo-Sys Develops Cinematic XR Production for LED Stages

Mo-Sys Engineering has developed a new multi-node media server system, VP Pro XR, supporting XR production in LED volumes for final pixel film and TV footage. Aiming to overcome restrictions and compromises that producers encounter on XR stages, the system supports the use of traditional shooting techniques for virtual productions.

Mo-Sys VP Pro XR is a hardware and software system combining multi-node nDisplay architecture, updated Mo-Sys VP Pro real-time compositor/synchronizer software, and a new set of XR tools. nDisplay architecture helps users manage very large-scale immersive displays such as projection domes or arrays of LED screens, where real-time scenes are needed at high resolutions, requiring live, synchronous content to refresh at high frame rates. With a low system delay of about six to seven frames, it can capture XR sequences in which live-action talent interacts with motion-capture AR avatars.
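As a rough illustration of the synchronisation requirement described above – every render node driving part of an LED volume has to present the same frame at the same instant – the sketch below uses a simple swap barrier. It is a generic, illustrative example in Python, not the nDisplay API or Mo-Sys code.

```python
import threading

NUM_NODES = 4                      # e.g. four render PCs, each driving part of the wall
swap_barrier = threading.Barrier(NUM_NODES)

def render_node(node_id: int, frames: int = 3):
    for frame in range(frames):
        # ... render this node's slice of the LED volume for `frame` ...
        swap_barrier.wait()        # block until every node has finished this frame
        print(f"node {node_id} presents frame {frame}")  # all nodes swap together

threads = [threading.Thread(target=render_node, args=(n,)) for n in range(NUM_NODES)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

In a real cluster this lock-step behaviour is enforced by genlocked graphics hardware and the render-cluster software rather than by application threads; the sketch only shows why the content must refresh synchronously.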

Purpose-Built Cinematic XR

The XR space until now has been largely driven by live event equipment companies. Although XR volumes help save money by minimizing post-production compositing and lowering location costs, they have also introduced shooting limitations and an output image that some find isn’t yet comparable to non-real-time compositing. This challenge motivated the development of VP Pro XR.

VP Pro XR is the first product release under Mo-Sys’ Cinematic XR initiative. The real aim of Cinematic XR is to move final pixel XR production forwards, in terms of image quality and shooting creativity, from its roots in live event LED set-ups to purpose-built Cinematic XR. Mo-Sys says final pixel recording – creating finished quality imagery in-camera through live, LED wall virtual production – is achievable but needs further development.

VP Pro XR extends the original Mo-Sys VP Pro software, a real-time compositor, synchronizer and keyer that integrates directly into the Unreal Engine Editor interface, with tools made to simplify virtual production workflows. It works by recording real-time camera tracking data from Mo-Sys hardware such as StarTracker, together with lens focus and zoom and camera setting data for ARRI Alexa and Sony Venice cameras, to create virtual production content.

Mo-Sys has identified four key components of Cinematic XR: improving image fidelity, introducing established cinematic shooting techniques to XR, supporting interaction between virtual and real set elements, and developing new hybrid workflows that combine virtual production with non-real-time compositing.

Michael Geissler, Mo-Sys CEO, said, “Producers love the final pixel XR concept, but cinematographers are concerned about image quality, colour pipeline, mixed lighting and shooting freedom. We started Cinematic XR in response to this challenge, and VP Pro XR is specifically designed to solve problems that our cinematographer and focus puller colleagues have brought to our attention.”

Cinematic XR Focus

The XR media server system, which delivers cinematic capabilities and standards for cinematographers and focus pullers, launches just after Mo-Sys introduced its Cinematic XR Focus capability in April. Cinematic XR Focus allows focus to be pulled between real and virtual elements in an LED volume and is now available on VP Pro XR.

The inability to pull focus freely has been one of the creative limitations of using an LED volume for virtual production. Cinematographers have been prevented from pulling focus between real foreground objects and virtual objects displayed on an LED wall, such as cars and buildings, because the lens focal plane stops at the LED wall, keeping the background out of focus.

Now, using Cinematic XR Focus with the same wireless lens control system they are accustomed to, focus pullers can pull focus from real objects through the LED wall to virtual objects that appear to be positioned behind it. The reverse focus pull is also possible.

Cinematic XR Focus is an option for Mo-Sys’ VP Pro software working with the Mo-Sys StarTracker camera tracking system. Cinematic XR Focus synchronizes the lens controller with the output of the Unreal Engine graphics, and thus depends on StarTracker to constantly track the distance between the camera and the LED wall.
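As a sketch of the hand-off idea this describes – assuming the behaviour is to drive the physical lens up to the wall plane and treat any further distance as a virtual focus target for the rendered scene – the logic might look like the following. This is an illustrative outline only, not Mo-Sys’ implementation, and the function name is made up.

```python
def split_focus(requested_distance_m: float, camera_to_wall_m: float):
    """Return (physical_lens_focus_m, virtual_focus_m or None)."""
    if requested_distance_m <= camera_to_wall_m:
        # Target is a real object in front of the wall: normal lens focus.
        return requested_distance_m, None
    # Target sits "behind" the wall: park the lens on the wall plane and
    # hand the remaining distance to the graphics engine as a virtual focus.
    return camera_to_wall_m, requested_distance_m

# Example: the wall is 4 m away and the focus puller dials 10 m on the hand unit.
lens_m, virtual_m = split_focus(10.0, 4.0)
print(lens_m, virtual_m)  # 4.0 10.0
```

Because the camera moves during a take, the camera-to-wall distance has to be re-evaluated continuously, which is why the feature relies on live camera tracking.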

Read the full article on the Digital Media World website here >

Mo-Sys and Virtual Focus

The late, great director Alexander Mackendrick once said that the director’s job is to direct attention. Indeed, the ability of the director and DP to lead the viewer’s eye to the important part of the frame is the very foundation of movie-making.

One of the major tools in the DP’s toolbox is the ability to shift that attention by pulling focus and moving the visual plane to a new part of the picture.

Today’s emerging virtual production technology has highlighted how important such techniques are. DPs moving from traditional film production to the new virtual frontier have discovered certain limitations, and innovators have been working to overcome those limitations.

One such problem comes with the use of LED volumes. Photo-realistic graphics on multiple LED panels surrounding the real objects in the scene can look amazingly life-like, but until now they have been mere backdrops, not able to become fully enmeshed with the scene in the way a real location can. Camera tracking means you can move around the foreground objects while maintaining perspective and parallax, as showcased recently on The Mandalorian.

But what could not be done was a focus pull into the computer graphics. The LED volume was essentially a single fixed plane, so once the focus reached the plane of the LEDs, there was nowhere to go. Until now.

Mo-Sys Engineering, using some ingenious and complex technology, has solved the problem. Mo-Sys first rose to fame for creating the camera robotics used in ‘Gravity’. They have garnered further industry acclaim for the Mo-Sys StarTracker camera tracking system, which tells the CG computer exactly where the camera is in three-dimensional space. It gained plaudits for combining lens data with real-time positioning, meaning perspective and parallax can be recreated in fine detail.

Mo-Sys has now added the ability to seamlessly pull focus between the real and virtual scene, whilst continually monitoring the camera’s position relative to the LED wall. The feature is called Cinematic XR Focus. Technically, it is not trivial: it requires considerable extra processing of the signal. The Mo-Sys VP Pro software, which is fully integrated with the Unreal graphics engine, has to simulate the change of focus in real time.

Most importantly, this solution has been developed with practical use in mind. Focus pullers instinctively use Preston wireless lens controllers, set up for the specific lens mounted on the camera. The Mo-Sys solution takes the control signals from the Preston and uses them to move both the real and virtual focus planes up to the LED wall, then seamlessly shifts to controlling just the virtual focus plane in the virtual graphics scene.

DP Brett Danton has tried the system on a test shoot in the UK. “This system opens up the XR image to work like a three-dimensional scene by adding the last element: depth,” he said. “Previously you had parallax, but now you can focus through the screen, giving far greater freedom in creativity, where the scene is reacting as if shooting on location.”

James Uren, CTO of Mo-Sys, said “There were a lot of problems we needed to solve, like moiré patterning between the grid of the camera sensor and the grid of the LED walls, as well as freeing the DP to move the camera and the focus puller to manipulate the lens, simultaneously, in real time for final-pixel shoots.

“But clever technology is no use if it gets in the way of the creativity,” Uren added. “So we made sure it works the way that people are used to, with the equipment they are comfortable with. The Preston controller reports focus as it happens, whether that is in the lens or in the computer. This software extension sits inside our graphics manipulation software, which in turn sits inside the Unreal graphics engine using nDisplay to drive the LED volume – the set-up used by most in this sector.

“We are giving an important part of the creativity, of the language of production, back to the people who are actively demanding it.”

Read the article on the British Cinematographer website here >

Virtual production opens the door to a world of creativity

From high-end drama to sport and entertainment, virtual production is booming as travel restrictions bite and producers seek a safe environment in which to shoot.

In this article for Broadcast Tech in association with The Media Production & Technology Show, Tim Dams explores the recent surge in interest in Virtual Production techniques.

Both Garden Studios and Dimension Studio, which are featured in this article, mention Mo-Sys camera tracking as an integral part of their virtual production studio facilities.

Read the full article here.

The Dual Illusion of Virtual Production

Technology has reached the sweet spot where it is practical and affordable to use virtual studios and augmented reality live, in real time, for the AV market.

Much of what you see in films, on television and in commercials isn’t real, but it looks so convincing you don’t question its authenticity. In the past, visual effects involved specialist teams, months of painstaking processing and masses of computing power.

Today, photorealistic 3D visual effects can be generated in real time, on a standard PC, powered by the same graphics engine behind computer games like Fortnite. This is transformative for the enterprise AV market.

Just as film production companies use real-time photorealistic graphics with an LED wall, enterprises can now use the same technology to create a branded conference, a dealer training session, a product launch video or an AGM report, seamlessly switching between scenes in moments.

Covid-19 has accelerated the pace of change, bringing virtual environments to remote production, boosting creative and technical values even when participants can’t gather in the same location. We now have the need, and the technology, to create a dual illusion:

First, with today’s photorealistic 3D graphics, you can build any virtual location or scene, or even use a digital scan of a real location to create the look you need.

Second, rather than have people travel to a studio or location, they can be digitally inserted in real time into the virtual environment, so that they appear to be in the studio or at an event location, interacting with a presenter who is in the studio, while actually talking from home with just a simple kit to ‘beam’ them in.

The traditional view has been that virtual studios are expensive, require experts to design them, an integrator to engineer and deploy them, and specialist people to run them. That is where Mo-Sys comes in, with the unique StarTracker Studio, a complete virtual production system in a mobile rack.

Mo-Sys has long been a leader in virtual production technology for big budget movies, broadcast television, commercials and music videos. We recognised that there would be a moment when technology, creativity and price would all hit the sweet spot to bring it to the enterprise A/V market. Our vision was that this would be 2020, and we were prepared. What we had not predicted was that Covid-19 would focus everyone’s attention.

Early in 2020 we took our expertise and experience from film and television and created the first complete virtual production studio system designed specifically for the corporate and enterprise market. We’ve taken care of everything: we’ve created a system that can do every kind of virtual production, using all screen technologies, simplified it, and reduced the equipment needed.

Now you have the power to create any photorealistic location or event, using remote designers and minimal studio staff. They have the same tools and technologies the film and broadcast companies have and can create the same illusions – and bring remote guests to the studio virtually. You can create your corporate look, as traditional or as creative as you want, while still navigating safely around Covid restrictions.

The dual illusion of virtual production

Read the full article here >

TVBEurope’s 2021 Watch List

Great to see Mo-Sys CEO, Michael Geissler, and our StarTracker camera tracking system featuring on the impressive TVBEurope 2021 watch list!

Mo-Sys is a vastly impressive company, whose StarTracker in-studio optical camera tracking systems for AR and VR we’ve been covering for some time. As studio technology continues to evolve, it’s hard to see how Geissler’s technical nous won’t sit front and centre of that drive forward.

TVBEurope Watch List 2021

Read the full list in their February 2021 magazine issue here >

The (augmented reality) elephant in the ballroom

TVBEurope gets an exclusive insight into how the Strictly Come Dancing graphics, lighting and audio teams have introduced augmented reality in this year’s series.

BY JENNY PRIESTLEY, TVBEUROPE
PUBLISHED: DECEMBER 10, 2020

It may be 16 years old, but this year’s Strictly Come Dancing has been as popular as ever. Viewers have taken to social media to express their joy at the show that’s brought sparkle back to our screens, and helped lift everyone’s spirits.

It’s not been easy to bring the ballroom back in the year of the pandemic, and the production team deserve all the plaudits they’ve received for their hard work to get the show on air. But rather than rest on their laurels, Strictly’s production team have been particularly innovative by bringing augmented reality into the mix. From the racing cars in week one, to the elephant that appeared during Bill Bailey’s Quickstep, augmented reality has featured every week during the live shows.

The use of AR has been a real team effort, involving both the lighting and audio teams, as well as the companies Mo-Sys and Potion Pictures. While this year is the first time AR has been employed in the show, it’s something the team has been considering for a while…

Read the full article on TVBEurope >

Mo-Sys Enables Spectacle for Live Music Extravaganza

Little Mix put into virtual stadium setting thanks to precision camera tracking

Mo-Sys took a central role in the Covid-safe production of a live online entertainment awards show. The event is normally held in a stadium venue; this year, production company and media technologists Bild Studios used virtual reality with the intent of keeping the same scope and excitement as any physical show in the past.

Under the lead of Creative Director Paul Caslin and Production Designer Julio Himede from Yellow Studio, Bild created an enormous 360-degree virtual stadium clad with LED screens all around – a design that wouldn’t have been achievable in the real world but was more than possible in a virtual environment. Bild’s unique content workflows allowed the show content applied to the virtual LED screens to be operated live, on a cue-by-cue basis – just like in a physical environment. Hosts Little Mix were shot in a large green screen studio, which, thanks to Mo-Sys VP Pro virtual studio software combined with Unreal Engine graphics, became filled with cheering music fans. The links were shot with three Arri cameras: one on a crane, one on a rail and one on a tripod.

Each camera was fitted with a Mo-Sys StarTracker camera tracking system. These track and capture a camera’s precise position and orientation in six axes thanks to infrared cameras tracking reflective dots – stars – on the ceiling. Because lens focus, zoom and distortion are also tracked, the live talent can be perfectly and seamlessly combined with the virtual world, irrespective of camera movement or lens adjustment. All composited camera outputs were created in real time, ready for the director to live-cut to, making the camera work for the virtual production environment not too dissimilar to that of any other live broadcast show.
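To make the description above concrete, the sketch below shows the kind of per-frame record a tracking system of this sort streams to the renderer: a six-axis pose plus lens state, tagged with timecode. The field names and values are illustrative only, not the actual Mo-Sys data protocol.

```python
from dataclasses import dataclass

@dataclass
class TrackedCameraFrame:
    timecode: str    # e.g. "10:24:03:12", used to align with the video frame
    x: float         # position in metres
    y: float
    z: float
    pan: float       # orientation in degrees
    tilt: float
    roll: float
    focus_m: float   # lens focus distance
    zoom_mm: float   # focal length
    k1: float        # radial distortion coefficients from the lens profile
    k2: float

frame = TrackedCameraFrame("10:24:03:12", 1.2, 0.4, 1.8,
                           35.0, -2.5, 0.1, 6.5, 32.0, -0.08, 0.01)
```

One such record per camera per frame is enough for the graphics engine to reproduce the real camera’s view, which is what keeps the talent and the virtual world locked together.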

“Producing the real-time VFX content and supplying the technical engineering for this event was a very great pleasure,” said David Bajt, Co-Founder & Director at Bild Studios. “Special thanks go to Mo-Sys Engineering for the amazing camera tracking and virtual production technology.”

Mo-Sys CTO James Uren, who acted as technical consultant on the production, explained, “This was a very fast turnaround production – just five days for any post-production work required – so we needed to get all the virtual studio materials in camera. Because we capture all the component video elements (green screen, key, and graphics for each camera) plus all camera movement and lens data for each camera, for maximum flexibility we could offer Bild the option of three workflows.

“First, we had the combined real and virtual elements from the three cameras, with the option of cleaning up the greenscreen in post,” he explained. “Second, if there had been a problem with the graphics, we could keep the live shots and replace the virtual background. And third, the post team could just go crazy and change everything, whilst still keeping the same camera/lens movements around Little Mix.”

Mo-Sys VP Pro is powerful software that plugs directly into the Unreal Engine Editor. It receives the camera and lens tracking data from StarTracker, synchronises this with the camera video and the Unreal Engine virtual graphics, and frame-accurately merges the two image streams together to provide a real-time, high-quality composite output. VP Pro was designed with the specific needs of live virtual production in mind, without compromising graphics power. For the event, each Arri camera had its own VP Pro and Unreal Engine instance. This gave the producers the output quality they demanded, with very low latency. “This is not pre-visualisation,” Uren explained. “This is premium production quality.”
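The frame-accurate merge described above can be pictured as pairing video frames and rendered frames by timecode before keying. The following is a simplified, illustrative sketch of that synchronisation idea in Python; it is not the internals of VP Pro.

```python
from collections import deque

video_buffer: deque = deque()     # camera frames, e.g. {"tc": "10:24:03:12", ...}
graphics_buffer: deque = deque()  # rendered frames tagged with the same timecode

def composite(video, graphics):
    # Placeholder for the keyer: fill the green-screen areas of the video
    # with the matching rendered background.
    return {"tc": video["tc"], "merged": (video, graphics)}

def merge_ready_frames():
    # Frames arrive in timecode order on both buffers; only matching pairs are merged.
    while video_buffer and graphics_buffer:
        v, g = video_buffer[0], graphics_buffer[0]
        if v["tc"] == g["tc"]:
            yield composite(video_buffer.popleft(), graphics_buffer.popleft())
        elif v["tc"] < g["tc"]:
            video_buffer.popleft()     # no graphics arrived for this frame; drop it
        else:
            graphics_buffer.popleft()  # no video arrived for this frame; drop it
```

Buffering both streams and matching on timecode is what keeps latency low while guaranteeing that tracking, video and graphics all describe the same instant.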

The Reality of the Situation

“It means that a studio can be readily converted from a product launch to a dealer webinar to a news broadcast with just a press of a button.” CEO Michael Geissler talks to TVB Europe about our innovative StarTracker camera tracking technology.

Even before the changes in production that were needed to cope with the effects of Covid-19, the family of realities – augmented reality, virtual reality and mixed reality – was becoming more and more common in the broadcast landscape. Recent events have seen new opportunities for the technology. Philip Stevens takes a look at a few of the recent developments in the ‘realities’ field and gets immersed in a virtual environment in this article in the latest edition of TVB Europe.

The article takes an in-depth look at the type of equipment required by a virtual production studio. It describes how Mo-Sys StarTracker revolutionised VR and AR production and details the components of the new StarTracker Studio, launched in July this year – the first preassembled production package that brings virtual 4K Ultra HD production within the reach of just about any studio.

“It means that a studio can be readily converted from a product launch to a dealer webinar to a news broadcast with just a press of a button,” explains Michael Geissler, CEO of Mo-Sys.

“The system is for film as well as TV workflows, allowing the filming of commercials and instant social media responses. Like the original system, it tracks the position of each camera in three-dimensional space across all six axes of movement. But beyond that, StarTracker Studio bundles the tracking with cameras and mounts and a high performance virtual graphics system based on the latest version of the Unreal Engine and the Ultimatte keyer. The unique plug-in interfaces directly between the camera tracking and the Unreal Engine, for extreme precision with virtually no latency.”

Read the article in full in the TVB Europe November/December issue below:

TVB Europe Mo-Sys StarTracker Studio
TVB EUROPE 78 NOVEMBER / DECEMBER 2020

Top Banner Image: The On-Set Facilities virtual production crew on a film set in Cardiff, real-time compositing with Unreal Engine.

Extended Reality in the Virtual Environment

EVOKE Studios talks to TVB Europe about how they used our StarTracker “set and forget” solution at the AIM Awards 2020 to provide precise camera tracking and build complex extended reality environments.

EVOKE Studios is a new venture which brings together established designers and engineers to extend what is possible on stage. Their founder Vincent Steenhoek talks to TVB Europe about how the EVOKE team used the Mo-Sys StarTracker ‘set and forget’ camera tracking solution at this year’s AIM Awards, from the Association for Independent Music, on the 12th August 2020.

“Planning a project like the AIM Awards means we need to do considerable testing and plan with enough contingency. It also means that we need the supporting technology – like camera tracking – to just work. It has to be completely stable, no questions asked.

“That is why we used StarTracker from Mo-Sys. It is a ‘set and forget’ solution which uses dots on the ceiling that light up under ultraviolet light; that’s why they call it StarTracker; once the star pattern is captured, it is there forever.

“We made the AIM videos at the new Deep Space studios at Creative Technology in Crawley. They have StarTracker set up on a jib, with all the connections to the lens and grip. What StarTracker gives us is a constant stream of extremely precise camera locations in three-dimensional space, along with camera rotation and – from lens sensors – zoom and aperture (for depth of field). We used its translation, rotation and lens data inside the graphics servers (from disguise) to drive virtual cameras in Notch.

“Our experience with StarTracker is that it gives us ultra-reliable, highly accurate positional data with no encoder drift to speak of, and ultra-low latency, so we can create virtual elements and match them to the scene captured live by the camera. As XR consists of a lot of moving parts and separate calibrations, it helps a lot if there are parts of the process that are a constant.

“For EVOKE as a creative studio and systems integrator, we are enabled to do what we do by companies like Mo-Sys. In turn, building on technologies like StarTracker enables awards shows like the AIMs to be presented in broadcast quality virtual environments.”

Read the article in full in the TVB Europe November/December issue below:

TVB Europe 78 November / December 2020  

Top Banner Image: Opening beach environment for Little Simz performing Pressure

Visualising the AR Roadmap

Commercial Director Mike Grieve talks to BroadcastPro Middle East about our StarTracker technology and explores the impact augmented reality is having on the broadcast industry.

Augmented Reality (AR) has brought new life to traditional screens by enabling broadcasters to extend their storytelling, thereby changing how consumers interact with content. BroadcastProME asks industry professionals to share their vision of how they foresee AR changing the face of broadcasting in various scenarios and how their own respective solutions can enhance the viewing experience.

Mike Grieve
Commercial Director, Mo-Sys

AR graphics are already used extensively in news, sports and weather storytelling. Children’s TV and mainstream drama – content that can be sold multiple times over to other broadcasters – are probably where AR graphics provide the greatest return, both in producing content that otherwise wouldn’t be possible and in producing content that, while possible, would be cost-prohibitive to make.

Mo-Sys manufactures StarTracker, a precision camera and lens tracking system, which is used to blend AR graphics with the real world. The system is also used for virtual studios, mixed reality, and extended reality. Mo-Sys also manufactures an Unreal Engine plugin called VP Pro, which enables all types of virtual production, and a range of encoded remote camera heads for wider, more varied AR shots.

AR graphics primarily require all the cameras used to be tracked, and all lenses on the cameras to have been profiled. Once this is done, one can choose which virtual studio software to use to synchronise and composite the AR graphics with the real world: either a traditional virtual studio software package with automation and playout triggering or, where this isn’t required, an Unreal Engine plugin.

The biggest decision to make is whether the graphics operations team should be experts in Unreal, or experts in a traditional broadcast virtual studio software. This will determine the type of software that can be used to deliver the AR graphics. Choosing the camera, lens tracking system, and the camera grip comes after.
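As a small illustration of what the lens-profiling step mentioned above provides, a profile typically includes distortion coefficients that let the renderer warp its graphics to match the real lens, so AR elements stay locked to the image. The sketch below applies a simple two-term radial distortion model; the coefficients are made up for the example and real profiles are richer (per-zoom and per-focus tables, centre shift and so on).

```python
def apply_radial_distortion(xn: float, yn: float, k1: float, k2: float):
    """Distort normalised image coordinates (xn, yn) as a profiled lens would."""
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return xn * scale, yn * scale

# Illustrative coefficients only; a real profile is measured per lens.
print(apply_radial_distortion(0.5, 0.3, -0.08, 0.01))
```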

In terms of where AR is headed, greater photo-realism using technologies such as ray tracing is the obvious one. We will also begin to see more photo-realistic avatars, human or otherwise, driven by actors in motion capture suits with facial expression headsets, interacting with real actors.

The aim of broadcasters deploying AR is to create highly immersive content that’s visually appealing, which is ‘sticky’ in terms of viewer numbers and viewer duration, whilst also providing differentiation from the competition. The longer-term goal is for broadcasters to use AR to create increasingly sophisticated photo-realistic content that wouldn’t otherwise be possible.

BroadcastPro Middle East 124 November 2020