Mo-Sys Partners With Assimilate To Integrate StarTracker With Live FX

Mo-Sys Engineering today announces that its precision camera tracking solution, StarTracker, is now fully supported by Assimilate’s Live FX software. The integration between StarTracker and Live FX gives filmmakers a faster way to create high-end content in green-screen and LED wall-based virtual productions.

Both companies take a similar approach, focusing on allowing filmmakers and artists to work in a way that is familiar to them rather than having to adapt to the programming environment of Unreal Engine (or other game engines). Live FX enables real-time, live compositing for green-screen and LED wall-based virtual productions on set; combined with Mo-Sys StarTracker, it delivers a one-stop-shop software solution for all kinds of virtual production workflows. Creatives gain the added benefit of a workflow that speaks their language and is much easier to learn.

“We are delighted to give customers access to a complete solution to accurately deliver high quality tracking data and compositing assets directly to VFX/post workflows,” said Michael Geissler, CEO of Mo-Sys. “With demand for virtual production rising rapidly, filmmakers and cinematographers can deliver the best results without the need to learn a totally new way of doing things.”

The integrated solution delivered by Mo-Sys and Assimilate allows users to efficiently and accurately composite 2D and 3D elements into the live camera signal, control (and color grade) LED walls, control stage lighting based on image content, use Live FX as a simple keyer in green-screen scenarios, and link it up with Unreal Engine if needed.

Jeff Edson, CEO at Assimilate commented, “For Assimilate, it has always been important to create streamlined workflows at the very high end that our users can rely on. We are proud to offer the best possible camera tracking in Live FX with the Mo-Sys StarTracker; together they create an unbeatable, highly accurate live-compositing system with a streamlined connection to post. The support provided by Mo-Sys to help with the implementation on our side was especially valuable to us.”

VP Pro XR making waves in VP world

Mo-Sys’ VP Pro XR system has been turning heads since it was launched, snapping up awards and earning endorsements from some well-known names in the industry.

Pocket Films met up with Mike Grieve, Mo-Sys Commercial Director, at the “SMPTE UK Visits… an On-Set Virtual Production” event to hear more about the technology’s success to date.

Mike Grieve being filmed at Garden Studios

Tell us a little bit about VP Pro XR and the challenges that this technology is designed to meet.

VP Pro XR, which won a Production Hub Award this year and picked up the NAB Product of the Year 2021 accolade, supports XR production in LED volumes for real-time VFX film and TV production. We designed the technology with the aim of addressing the restrictions and compromises that producers currently encounter with XR stages.

Although XR volumes deliver cost savings by minimizing post-production compositing and lowering location costs, they have also introduced shooting limitations, along with an output image quality that some find not yet comparable to non-real-time compositing. This is the challenge that motivated the development of VP Pro XR.

VP Pro XR is an LED content server solution, purpose-designed to offer Cinematic XR capability, with a focus on composite image quality whilst offering unique cinematic features.

VP Pro XR is the first product release under Mo-Sys’ Cinematic XR initiative – can you tell us more about this?

The aim of our Cinematic XR initiative is to move final pixel XR production forwards, in terms of image quality and shooting creativity, from its initial roots using live event LED equipment to purpose-built Cinematic XR.

Cinematic XR is focused on improving image fidelity, and introducing established cinematic shooting techniques to XR. It promotes the interaction between virtual and real set elements, and the development of new hybrid workflows combining virtual and non-real-time compositing. These capabilities are being designed into the VP Pro XR platform.

First among these capabilities is Cinematic XR Focus, which has been recognised in the Cine Gear 2021 Technical Awards for its innovative technology.

Traditionally, the LED wall has been a passive part of the wider set, but now for the first time, we are enabling an LED wall to be used as more than just a backdrop. Instead, cinematographers can integrate LED wall visuals with the real stage and seamlessly rack focus deep into the virtual world to create layered images that enhance their visual storytelling.

Cinematic XR Focus synchronises a physical focus pull with a virtual focus pull, managing the transition at the LED wall so that the complete focus plane movement is seamless. It requires no additional action on the part of the Focus Puller, and enables them to use a standard wireless lens controller – currently Preston, with other options coming soon.

NearTime is another pillar of Cinematic XR and received the 2021 HPA Award for Engineering Excellence from the Hollywood Professional Association (HPA). This feature solves one of the key challenges of LED ICVFX shoots: balancing Unreal image quality against the need to maintain real-time frame rates. Currently, every Unreal scene created for ICVFX has to be reduced in quality in order to guarantee real-time playback. By combining NearTime with an LED ‘smart green’ frustum, the same Unreal scene can be automatically re-rendered at higher quality or resolution, and this version can replace the original background Unreal scene.

NearTime operates in parallel to an ICVFX shoot and starts at the same time. It finishes after the real-time shoot but still within the real-time delivery window, meaning that a higher quality result can be delivered in the same time, with minimal increase in cost. No traditional post-production costs or time are required, and moiré issues are completely avoided.

How would you describe the industry’s reaction to VP Pro XR?

We have had a phenomenal response from customers and partners to VP Pro XR. ARRI Rental UK was the first to install and use this innovative technology, along with our StarTracker system, at its new mixed reality studio in Uxbridge, west of London. This is now one of the largest permanent LED volume studios in Europe and we are so proud that Mo-Sys technology lies at the heart of it. Our technology is designed specifically for real-time final pixel shooting, to deliver cinematic quality imagery, with clever unique features and technology geared to cinematographers.

VP Pro XR has also been deployed by London’s Garden Studios for its state-of-the-art virtual production (VP) stage installed in March 2021. Garden Studios, which already has a Mo-Sys StarTracker system and uses the VP Pro software for its main LED volume, now benefits from set extension (XR) capability, offering clients additional creative capability. The facility recently added a second StarTracker camera and lens tracking system to support its ongoing R&D and expansion.

We have also picked up a number of prestigious industry awards for VP Pro XR as well as for new features like Cinematic XR Focus and we could not be prouder. The VP sector is in an extremely exciting stage in its evolution, and we are thrilled to be at the heart of this new direction.

This article was first published on The Studio Map website.

Mo-Sys sponsors deep dive into on-set Virtual Production by SMPTE

Mo-Sys was the lead sponsor of a hybrid three-day event showcasing in-camera VFX and Virtual Production hosted by SMPTE UK. The event featured demos of the real-time technologies and was the latest in the SMPTE UK Section’s monthly events. 

Bringing together 80six, ETC, Solent University, Brompton Technology, Mo-Sys and other collaborators, the event gave attendees the opportunity to get a close-up look at on-set Virtual Production. As part of this showcase, leading-edge technologies from 80six’s inventory, including the Mo-Sys StarTracker, were highlighted in a production-ready setting.

The whole event was live streamed to a global audience. Commercial Director Mike Grieve gave a presentation explaining ways in which Mo-Sys technology enables precise camera tracking and introduced the new Brompton digital tracking markers for use in LED volumes.

The full event is available to watch here

VP is the winning technology for Winter Games

BBC Sport’s virtual studio set for the Summer Games in Tokyo last year remains a talking point, showing just what can be achieved with virtual production (VP).  

For the Winter Games event in Beijing, VP is once again very much front and centre for both Discovery and the BBC allowing the broadcasters to work around restrictions on roaming film crews.

Discovery put Mo-Sys technology at the heart of its ‘Cube’ presentation virtual studio, which has undergone an evolution since it was first used during the Tokyo Games. All the greenscreen studios for the Cube make use of Mo-Sys’ tracking software, which allows multiple studios to sync precisely. Mo-Sys camera tracking systems were also installed at the broadcaster’s greenscreens around Europe to ensure all the locations have identical timing and perfectly synchronised, precise tracking data, with accuracy to one millionth of a degree. These multiple greenscreens also allow athletes to be ‘beamed’ in live from Beijing for three-dimensional interviews.

BBC Sport has once again created a new VP set for the Winter Games, one that will take viewers from a mountainside log cabin to a virtual ski resort. To match the smaller scale of a Winter Games compared to a Summer Games, the broadcaster has scaled its VR set back to a more compact studio. Although only 84 square metres, the studio will still bring to life a virtual ski resort and more than five different presenting positions – both inside and outside – as well as reflecting various times of the day. While smaller than the Dock 10 studio used for Tokyo, this is still the largest virtual footprint the broadcaster has used to date.

Brompton’s digital stars maximize LED performance

LED video processing products manufacturer Brompton Technology has outlined features coming with its Tessera v3.3 software, designed to optimize the performance of LED panels, including markers for use with the Mo-Sys StarTracker.

Brompton has added support for Mo-Sys’ StarTracker when users are employing Tessera R2-based panels together with Tessera SX40 or S8 LED processors and Frame Remapping.

“Although there are many different systems for tracking camera position, a common approach has been using visible markers, which can be challenging within an LED volume, especially one with an LED ceiling. This can now be resolved by displaying suitable markers on the LED itself.”

Chris Deighton, Brompton Chief Technology Officer

The markers are overlaid on the video content being displayed, with Tessera’s Frame Remapping feature used to display the markers only on output frames that are not visible to the main camera. The markers are automatically generated, and their color and size can be configured from within the processor UI.

A beta version of the v3.3 software is now available to download from the Brompton Technology website.

Read the full article here >

Mo-Sys to demo ground-breaking VP workflow at HPA Tech Retreat

Mo-Sys Engineering will join forces with Moxion, QTAKE and OVIDE to show a ground-breaking collaborative virtual production workflow at the HPA Tech Retreat 2022. As a Gold Sponsor of the in-person event, Mo-Sys will take part in the Supersession entitled “Immerse Yourself in Virtual Production”.

Located in the Innovation Zone, Mo-Sys, Moxion and QTAKE will have a combined stand featuring a Sony Crystal LED wall along with a Sony Venice camera attached to a curved rail. The camera will be equipped with a Mo-Sys StarTracker camera tracking system, and Mo-Sys’ VP Pro XR LED content server will drive the LED wall.

Takes will be captured by QTAKE, the industry’s preferred on-set capture tool, whilst VP Pro XR drives the LED wall and captures the camera and lens tracking data. The new Mo-Sys NearTime® service, a 2021 HPA Engineering Excellence award winner, will be used to solve one of the key challenges of LED in-camera VFX (ICVFX) production – balancing real-time Unreal image quality against maintaining real-time frame rates. Higher quality re-renders from NearTime, which utilizes cloud processing from AWS, will then be automatically delivered back to Moxion’s Immediates solution, winner of the 2020 HPA Engineering Excellence Award, for review and signoff.

Using NearTime in an LED volume, with a Mo-Sys enabled ‘halo green’ frustum for separating talent from the LED content, the background Unreal scenes can be automatically re-rendered with higher quality or resolution using captured camera and lens tracking data, and then used to replace the original lower quality background Unreal scenes. This process avoids the need for large quantities of on-set rendering nodes, minimizes post-production costs, and, significantly, eliminates moiré effects completely. This unique approach enables far more efficient workflows than those that exist today.

“We believe there is a better way to bring virtual productions to life cost-effectively and without compromising on image quality,” commented Michael Geissler, CEO of Mo-Sys. “We are excited to meet with customers face-to-face once more at the HPA Tech Retreat and to show them workflows that can take their virtual productions to the next level while allowing them to work with other tools that they are already familiar with. We are proud to partner with some of the leading names in the industry to show what is possible with virtual production today.”

The VP Supersession, taking place on Tuesday, February 22, will walk attendees through virtual production from conception to delivery, with a special emphasis on the planning and preparation that are critical to a successful outcome. Three LED walls will be showcased from AOTO, Planar and Sony, two of which will feature the Mo-Sys StarTracker camera and lens tracking system. One of the LED walls will be driven by Mo-Sys’ VP Pro XR LED content server, which will also be showing its Cinematic XR Focus feature for pulling focus between real objects and virtual objects positioned virtually ‘behind’ the wall.

At the HPA Tech Retreat 2022, Mo-Sys will also host a series of breakfast roundtables around the theme “Post-production workflows for LED volumes”, aimed at helping cinematographers and creative professionals elevate virtual production.

Cinematic broadcasting

In the latest DEFINITION issue, Phil Rhodes explores the crossover between TV and film caused by the current push for more capability from broadcast cameras. The combination of the Mo-Sys StarTracker and its Unreal Engine plug-in was used on Strictly Come Dancing to create AR elements for a studio emptied of its audience due to the pandemic.

“Covid-19 accelerated implementation of new broadcast technology, with Strictly Come Dancing feeling the Mo-Sys magic”.

Phil Rhodes, DEFINITION magazine.
Click here to read the full DEFINITION article pg 49.

Read more about Mo-Sys and Strictly Come Dancing >

AV and Broadcast – is there any difference?

AV Magazine explores the rapid cross-pollination of technology and workflows between the AV and broadcast industries, driven largely by the global pandemic. Traditionally, broadcast and AV were viewed as distinct markets with their own workflows and technologies. Recently, the two have increasingly converged into the same environment.

“Over time, the gear has become a lot less expensive and it means a company like M&S can afford to buy the same kit as a local ITV channel”.

Mike Grieve, Mo-Sys Commercial Director
Read the full AV article here