CES 2020: Sony exhibit next-generation in-camera VFX workflow with Mo-Sys

Jan 13, 2020

Using Mo-Sys StarTracker and the power of Unreal Engine, Sony Pictures introduces a new era of film and TV production with their in-camera VFX workflow

At CES 2020, Sony Pictures demonstrated an exclusive virtual production showcase that is transforming how motion pictures and TV shows are made. Bringing together Sony’s Atom View and volumetric capture technology, Unreal Engine and Mo-Sys StarTracker, their virtual production workflow establishes a new method for filming photo-realistic CG content.

https://www.youtube.com/watch?v=7e-e1wa5qck

Combining the latest mixed-reality tools from Mo-Sys with Unreal Engine, Sony showed how both the filmmaker and the actors can interact directly with the content in real time. Using a large LED wall, Sony extended the physical set with a virtual environment that produced realistic set lighting and natural reflections, all in real time. Driven by the camera tracking data from StarTracker, the 3D background on the LED wall shifted with the movement of the camera, ensuring the final in-camera image has the all-important realistic perspective and depth you would expect when shooting for real.

StarTracker uses a direct tracking plugin for the Unreal Engine which streams live motion and lens data to the engine via Live Link. The data can then be used as real-time camera tracking for VFX, virtual studios, game cinematics, architectural visualization or even as tracking data for VR headsets.
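To illustrate what consuming that tracking data can look like on the Unreal side, the minimal sketch below polls a Live Link camera subject from an actor each frame using Unreal Engine's standard Live Link API. It is an assumption-laden illustration, not Mo-Sys' plugin code: the subject name "StarTracker" and the actor class ATrackedCineCamera are hypothetical placeholders, and the actual plugin handles this wiring (including lens data) automatically.

// Minimal sketch: reading a Live Link camera subject from an actor's Tick.
// Assumes the Live Link plugin is enabled; the subject name "StarTracker"
// and this class are hypothetical placeholders for illustration only.

#include "GameFramework/Actor.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkCameraRole.h"
#include "Roles/LiveLinkCameraTypes.h"
#include "TrackedCineCamera.generated.h"

UCLASS()
class ATrackedCineCamera : public AActor
{
    GENERATED_BODY()

public:
    ATrackedCineCamera() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        IModularFeatures& Features = IModularFeatures::Get();
        if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
        {
            return; // Live Link not loaded
        }

        ILiveLinkClient& Client =
            Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

        // Evaluate the latest frame published for the tracked camera subject.
        FLiveLinkSubjectFrameData FrameData;
        const FLiveLinkSubjectName Subject(TEXT("StarTracker"));
        if (Client.EvaluateFrame_AnyThread(Subject, ULiveLinkCameraRole::StaticClass(), FrameData))
        {
            if (const FLiveLinkCameraFrameData* Camera =
                    FrameData.FrameData.Cast<FLiveLinkCameraFrameData>())
            {
                // Apply the tracked pose; lens values (FocalLength, FocusDistance,
                // Aperture) would normally drive a CineCamera component as well.
                SetActorTransform(Camera->Transform);
            }
        }
    }
};

In a production setup the same evaluated pose and lens data would feed the virtual camera rendering the LED-wall background, which is how the displayed perspective stays locked to the physical camera's movement.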

Sony’s demo at CES 2020 was based on a film set from the upcoming Ghostbusters reboot, featuring the iconic Ecto-1 vehicle in a street scene with a virtual environment displayed on a large LED wall behind. The virtual background scene was created by scanning a real, physical set at Sony Pictures’ studio using their volumetric capture technology. It was then virtually reconstructed in 3D, rendered in real time at photorealistic quality through Sony’s Atom View plugin for Unreal Engine, and displayed on a Crystal LED display.

Kenichiro Yoshida, President & CEO of Sony Corporation, said:

“With volumetric capturing technology, filmmakers now have creative flexibility to capture locations and sets in 3D and then shoot them later in real-time with photo-realistic results.”

Michael Geissler, CEO of Mo-Sys added:

“Sony Pictures are really pushing boundaries in the virtual production field. Their volumetric capture technology opens new possibilities for filmmakers, especially for those looking to use real-world locations for their productions. With the addition of StarTracker, Sony’s workflow brings the creative choices back to the director and cinematographer on-set.”

At Mo-Sys we manufacture advanced camera robotics, film remote systems and high-precision camera tracking solutions for the film and broadcast industries. Our comprehensive range of products has been adopted by the biggest broadcasters and respected filmmakers, with tools built on patented, industry-proven StarTracker technology, which has now been installed in more than 100 studios worldwide.
