One Giant Leap

In the latest issue of DEFINITION magazine, Chelsea Fearnley discusses how SMPTE is driving the future of virtual production by bringing together the best minds in the industry to explore the most exciting advancements in media and entertainment tech.


The hybrid three-day event showcasing in-camera VFX and virtual production, hosted by SMPTE and sponsored by Mo-Sys, gave viewers an interactive experience of the world of the virtual set.

A pop-up LED volume was constructed by the team behind Virtual Production Studios at 80six, and comprised cutting-edge products from 80six’s inventory, including two Roe Visual Diamond LED screens with processing by Brompton Technology, Unreal Engine, the Mo-Sys StarTracker system, Disguise’s VX content servers and ETC lighting control console. Together, they enabled a smooth virtual production workflow.

Chelsea Fearnley, DEFINITION MAGAZINE.

Read the full article here (Pgs 25-27) >

Rob Fowler, director of business development at Brompton Technology, spoke to DEFINITION magazine about the success of the SMPTE On-set Virtual Production event in a follow-up article on pgs 28-29 of the same edition.

We already support Mo-Sys, which provided the tracking system at the SMPTE event. This means that instead of using physical markers (usually dots on floors or ceilings) to locate the tracking camera, we can embed those markers into content on the walls to be visible to the tracking camera, but invisible to the cinema camera.

Rob Fowler, Director of Business Development at Brompton Technology

Find out more about Brompton’s new digital tracking markers for LED volumes here >

Mo-Sys to set up virtual production hub in London

Mo-Sys recently spoke with The Hollywood Reporter about its plans to renovate and build virtual production stages and a virtual production research center. The Plumstead Power Station in Greenwich, which has sat unused for 50 years, will house eight stages, serve as the home to a new virtual production festival and become Mo-Sys’ headquarters in the UK.

The Plumstead Power Station in Greenwich, which Mo-Sys will convert into a virtual production hub.

According to Mo-Sys CEO Michael Geissler, the company — which last year won a Hollywood Professional Association Award for Engineering Excellence for its NearTime rendering workflow technology — plans to invest in the region of $7.2 million in the Plumstead Power Station in Greenwich and has secured a grant of around $5.5 million from local authorities as part of efforts to regenerate the area.

CAROLYN GIARDINA & ALEX RITMAN, The Hollywood Reporter

Read the full article published by The Hollywood Reporter on March 24, 2022 >

On-set auto-re-rendering for ICVFX workflows with NearTime™️

Mo-Sys Engineering recently joined forces with Moxion, QTAKE and OVIDE to show a ground-breaking collaborative virtual production workflow at the HPA Tech Retreat 2022.

The new Mo-Sys NearTime® service solves one of the key challenges of LED in-camera VFX (ICVFX) production: the trade-off between increasing real-time Unreal image quality and maintaining real-time frame rates.

Watch Mo-Sys Technical Director James Uren demonstrate the workflow in the video above.

Advantages?

  • Higher quality real-time VFX shots – increased real-time VFX quality and/or resolution, reviewable on set, for minimal additional cost and time
  • Ready within ICVFX delivery window – deliver higher quality real-time VFX shots in the same real-time VFX delivery window
  • Fully automated workflow – no operator intervention, no post-production required
  • Uses lower-cost LED volumes – no need for fine pixel pitch LED tiles – the LED volume is now just for scene lighting on the talent
  • Max scene lighting/Minimal green spill – ‘Smart green’ gives just enough green behind talent to separate them, enabling the maximum amount of the LED volume to be used for creating scene lighting on the talent (see the simplified keying sketch after this list)
  • No LED colour gamut issues – re-rendered Unreal scenes are not displayed in the LED volume
  • No Moiré patterning – re-rendered Unreal scenes are not displayed in the LED volume, and are not captured by the camera
  • Any VP screen technology – can be used on VP productions shooting in green, blue, or LED volumes
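As a rough illustration of the ‘Smart green’ idea mentioned in the list above – pulling an alpha matte for the talent from a small green surround rather than a full green screen – here is a deliberately simplified, hypothetical keyer in Python. The NumPy approach, the function name and the threshold value are assumptions for illustration only; the keying used in the actual QTAKE/OVIDE workflow is far more sophisticated.

```python
# Illustrative only: a toy keyer, not the keying used in QTAKE/OVIDE.
import numpy as np

def simple_green_key(rgb: np.ndarray, threshold: float = 0.15) -> np.ndarray:
    """Return an alpha matte of shape (H, W, 1): ~0 where a pixel is
    green-dominant (the surround), ~1 where it looks like talent.

    rgb: float array of shape (H, W, 3) with values in [0, 1]
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    greenness = g - np.maximum(r, b)    # how strongly green dominates the pixel
    alpha = np.clip(1.0 - greenness / threshold, 0.0, 1.0)
    return alpha[..., np.newaxis]
```

Because only a narrow band behind the talent needs to key cleanly, the rest of the LED volume is free to carry scene lighting, which is the point of the advantage listed above.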
ICVFX workflows with NearTime™️

How does it work?

  • Mo-Sys – VP Pro XR/StarTracker provides an on-set real-time VFX composite preview
  • QTAKE/OVIDE – captures each camera take of the talent shot against the LED volume, with a minimal green surround, and creates an alpha of the talent
  • Mo-Sys – NearTime sends camera tracking and lens data to the cloud, where the Unreal background plates are re-rendered with increased quality and/or resolution
  • Moxion – the NearTime Unreal re-renders land back in Moxion and are composited together using the alpha, with the resultant composite made available for review (a simplified sketch of this composite step is shown below)
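To make the hand-off at the end of that chain more concrete, below is a minimal, hypothetical sketch of the final ‘over’ composite in Python: the captured talent plate is laid over the higher-quality NearTime background re-render using the alpha created earlier. The function name, toy data and use of NumPy are assumptions for illustration only, not the actual Moxion or Mo-Sys interfaces.

```python
# Illustrative only: not the actual Moxion or Mo-Sys interfaces.
import numpy as np

def composite_over(foreground: np.ndarray,
                   alpha: np.ndarray,
                   background: np.ndarray) -> np.ndarray:
    """Standard 'over' composite: out = fg * a + bg * (1 - a).

    foreground, background: float arrays of shape (H, W, 3), values in [0, 1]
    alpha: float array of shape (H, W, 1), 1.0 where the talent is
    """
    return foreground * alpha + background * (1.0 - alpha)

# Toy example: a grey talent plate over a re-rendered blue background.
h, w = 4, 4
fg = np.full((h, w, 3), 0.5)      # captured talent plate
alpha = np.zeros((h, w, 1))
alpha[1:3, 1:3] = 1.0             # alpha matte pulled from the green surround
bg = np.zeros((h, w, 3))
bg[..., 2] = 1.0                  # higher-quality NearTime background re-render

frame = composite_over(fg, alpha, bg)   # shape (4, 4, 3), ready for review
```

Because only camera tracking and lens data travel to the cloud, the background can be re-rendered at higher quality without touching the captured plates, which is what keeps the workflow within the real-time VFX delivery window.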
ICVFX shoot and NearTime™️ diagrams

Requirements?

Want to know more? Get in touch with us >

Mo-Sys to show latest Virtual Production Innovation at NAB

Mo-Sys Engineering will showcase more of its virtual production technology stack at NAB Show 2022 (Las Vegas Convention Center, 24 – 27 April), where this year it is co-exhibiting with APG and Fujifilm on stand C6127.


Mo-Sys, using Fujifilm lenses and a state-of-the-art 1.5mm pixel pitch LED wall from APG, will show the following technology:

LED virtual production

Using its VP Pro XR LED content server and StarTracker camera tracking technology, Mo-Sys will show its end-to-end LED production workflow, highlighting the benefits of designing a solution specifically for cinematic and broadcast virtual production. In addition, the team will show the latest multi-camera switching feature for VP Pro XR, along with Cinematic XR Focus for pulling focus between real and virtual elements, managed by a Teradek RT wireless lens controller, as part of a new collaboration between Vitec and Mo-Sys.

Solving real-time VFX graphics quality

NearTime® is Mo-Sys’ patent-pending, HPA Engineering Excellence Award-winning solution for improving real-time VFX image quality in virtual production. The solution comprises a cloud-based auto-re-rendering system and Smart Green, a method of separating talent from the LED background without introducing copious green spill; combined, they deliver higher quality real-time VFX content. NearTime also removes Moiré patterning completely and enables the use of lower-cost LED panels, delivering an image quality that’s far closer to post-production compositing.

AR for sports

The new heavy-duty Mo-Sys U50 remote head will be shown with Fujifilm’s latest box lens, using a Vinten 750i head with pan bars. Visitors to the stand will be able to experience the smooth, precise motion of the U50 combined with the immediate response of the 750i, providing operators with the best possible experience.

In addition, a new Mo-Sys camera plate for the Vinten 750 head, containing Mo-Sys’ precision encoders for camera and lens tracking, will be shown for the first time. This new camera plate simplifies adding camera tracking capability to a static sports/event camera position for delivering precision blended AR graphics.

Robotics for virtual production

Mo-Sys will also show its new G30 gyro-stabilized head, offering a unique combination of tech-less setup, rapid accelerated movement of the heaviest camera payloads, and a smooth stabilized image. Additionally, Mo-Sys will demonstrate its industry-standard L40 cinematic remote head. Both remote heads can be equipped with encoded outputs on all three axes.