Virtual Production Takes a Leap Forward

In the latest issue of TVBEurope, Kevin Hilton talks to leading technology providers about how programme makers are using the new techniques available to them. Mo-Sys’ involvement in the BBC Sports Olympics studio was featured as an example of the use of robotic cameras, another component in virtual production.

Michael Geissler, founder and owner of Mo-Sys, also discusses how today, with 18 months of on-set, real-time LED/XR production experience under our collective belts, the broadcast and film industries are quickly realising the need for ‘fit for purpose’ LED/XR technology.

In designing VP Pro XR and launching its Cinematic XR initiative, Mo-Sys set out to address the cinematic quality and functionality requirements of LED/XR for on-set, real-time production. The result is a product specifically designed to address the current on-set, real-time quality and functionality challenges, provide a platform for unique features (e.g. Cinematic XR Focus), and tackle known XR challenges such as investment costs, expansion costs, and XR graphics delay.

– Michael Geissler, Mo-Sys Engineering
TVBEurope November/December 2021 issue

Read the full article here.

Robotic Camera Systems

In the November edition of InBroadcast, Contributing Editor David Kirk gives us an update on some of the developments over the last year in the world of robotic camera systems.

The article features Mo-Sys StarTracker Studio, the world’s first pre-configured complete virtual studio system.  

Mo-Sys StarTracker Studio is specifically designed to provide a complete switched-camera system for users who are new to virtual production or who lack the skills to create a VP studio solution from scratch.

The virtual production software used in StarTracker Studio is Mo-Sys’ VP Pro, which offers the same capabilities as traditional virtual studio software, but rather than being a layer on top of Unreal Engine, it’s embedded directly into the Unreal Engine’s editor interface.

Click here to read the full InBroadcast article, pg 16-18.

Virtual Sets Come of Age

For the InBroadcast November edition, Adrian Pennington explores how the virtual production phenomenon goes from strength to strength with new tools, greater photorealism, and lower prices driving demand.

Currently, the industry is using live-events LED technology and needs to transition to fit-for-purpose cinematic LED technology. Mo-Sys’ Cinematic XR initiative is aimed at driving change in LED wall technology, output image quality, the re-introduction of established shooting techniques, and smart workflows.

This is the first product on the market that changes an LED volume from being just an expensive backdrop into something that can be integrated into the storytelling. It enables, for the first time, real interaction between the talent and objects in the virtual scene positioned behind the LED wall.

Read the full InBroadcast article here.

Mo-Sys and APG Media Join Forces to Provide End-to-End Virtual Production Solutions 

Mo-Sys™ Engineering and APG Media, a leading LED volume rental provider and distributor, have partnered to provide customers with access to complete end-to-end LED virtual production set-ups, combining Mo-Sys’ StarTracker and VP Pro XR with APG Media’s HyperPixel customised LED wall solutions.

Mo-Sys LA Refinery

The Mo-Sys StarTracker camera/lens tracking system has become the technology of choice for leading-edge virtual productions. The advanced tools in VP Pro XR content server include the unique ability to pull focus seamlessly between real and virtual objects. By partnering with APG Media, Mo-Sys can now offer custom engineered LED tiles and a comprehensive package for tailored LED volume, multi-camera production systems. 
 
“The appetite for high production values means virtual production is no longer exclusive to big budget movies,” said Michael Geissler, CEO of Mo-Sys. “In cinematography, the quality of LED walls has to be as high as possible to deliver the best results and HyperPixel meets this requirement. Through this collaboration with APG Media we are removing the technical complexity from the equation and freeing up production teams to express their full creativity with an immaculate end result.”

APG Media

David Weatherhead, CEO at APG Media added “Virtual production is taking off as cinematographers and content producers recognise the impact this model can have on their final output – if they use the right technology. In a segment that is already at the cutting edge, Mo-Sys is a pioneer and a market leader, and this partnership brings huge value to both our customers. By combining our offerings, we can now open the doors to the best technology and help cinematographers to create the stunning content that audiences demand.” 
 
The new Mo-Sys Refinery in Los Angeles will feature a HyperPixel high-resolution LED wall with ultra-tight seams for the ultimate in immersive visual experiences. APG Media will add Mo-Sys VP Pro XR and StarTracker to the offering within its rapidly growing specialist virtual production division. The two companies will cooperate on marketing and exhibition, as well as providing support for sales channels.


HFR StarTracker and NearTime for LED boost ICVFX workflows

Mo-Sys Engineering reveals new solutions for LED virtual production with a high frame rate (HFR) StarTracker camera tracking system and the award-winning NearTime® workflow extended to LED volumes.

NearTime for LED is a smart solution to multiple challenges when shooting in-camera visual effects (ICVFX) in an LED volume. At its most basic level it is an automated background re-rendering service for improving the quality and/or resolution of Unreal-based virtual production scenes, and it runs alongside a real-time ICVFX shoot.

NearTime solves one of the key challenges of LED ICVFX shoots: balancing Unreal image quality against the need to maintain real-time frame rates. Currently, every Unreal scene created for ICVFX has to be reduced in quality in order to guarantee real-time playback.

Using NearTime with an LED ‘green frustum’, the same Unreal scene can be automatically re-rendered at higher quality or resolution, and this version can replace the original background. Whilst the re-render takes longer, it incurs no traditional post-production cost or time, and moiré issues can be avoided completely!
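To make the dual-render idea concrete, here is a minimal sketch of what an automated background re-rendering queue could look like. It is illustrative only: the manifest format, the render_background command, and the quality setting are hypothetical placeholders standing in for whatever renderer and project tooling a facility actually uses, not Mo-Sys’ implementation.

```python
"""Illustrative sketch of a NearTime-style batch re-render queue.
All paths, commands, and settings are hypothetical placeholders."""

import json
import subprocess
from dataclasses import dataclass
from pathlib import Path


@dataclass
class Take:
    name: str
    tracking_file: Path   # recorded camera/lens tracking for the take
    scene: str            # Unreal level/sequence shown on the wall in real time


def rerender_take(take: Take, out_dir: Path, quality: str = "cinematic") -> Path:
    """Re-render the background of one take at higher quality/resolution.

    The real-time shoot has already captured the talent against a 'green
    frustum'; here the same scene is rendered again, offline, using the
    recorded tracking data, so playback frame rate no longer limits quality.
    """
    out_dir.mkdir(parents=True, exist_ok=True)
    output = out_dir / f"{take.name}_{quality}.mov"
    cmd = [
        "render_background",            # placeholder executable, not a real CLI
        "--scene", take.scene,
        "--tracking", str(take.tracking_file),
        "--quality", quality,
        "--out", str(output),
    ]
    subprocess.run(cmd, check=True)
    return output


def process_queue(manifest: Path, out_dir: Path) -> None:
    """Walk a manifest of recorded takes and re-render each background."""
    takes = [Take(t["name"], Path(t["tracking"]), t["scene"])
             for t in json.loads(manifest.read_text())]
    for take in takes:
        rendered = rerender_take(take, out_dir)
        print(f"{take.name}: high-quality background written to {rendered}")


if __name__ == "__main__":
    process_queue(Path("takes_manifest.json"), Path("neartime_renders"))
```

The point of the sketch is simply that the re-render is driven by data already captured on the day (the scene and the tracking), so it can run unattended alongside or after the shoot.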

Mo-Sys CEO Michael Geissler said, “We patented NearTime almost 8 years ago knowing that real-time graphics quality versus playback frame rate was always going to be an issue with ICVFX shoots. At that time we were focussing on green/blue screen use, but today LED volumes have exactly the same challenge.”

Mo-Sys is also announcing a special StarTracker for LED that provides camera and lens tracking at up to 240fps for slo-mo shots, as would typically be used for fight scenes in high-impact action films. In this scenario, rather than driving the LED wall at high frame rates to match the camera (which requires significant hardware processing and impacts Unreal image quality), NearTime for LED is used to post-render the Unreal scene at HFR following the HFR ‘green frustum’ shoot of the talent performing. During the slo-mo shot review, the re-rendered Unreal background and the talent foreground are re-combined, delivering a significantly higher quality composited shot.
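As a rough illustration of that recombination step, the sketch below composites keyed HFR foreground frames over the NearTime re-rendered background frames. The directory layout, frame naming, and the assumption of an alpha-keyed foreground are ours for illustration only; a real pipeline would use its own plate formats and compositing tools.

```python
"""Minimal sketch: composite keyed HFR foreground frames over
re-rendered background frames, frame by frame."""

from pathlib import Path

import imageio.v3 as iio
import numpy as np


def composite_frame(fg_rgba: np.ndarray, bg_rgb: np.ndarray) -> np.ndarray:
    """Standard 'over' operation: foreground with alpha over an opaque background."""
    alpha = fg_rgba[..., 3:4].astype(np.float32) / 255.0
    fg = fg_rgba[..., :3].astype(np.float32)
    bg = bg_rgb[..., :3].astype(np.float32)
    return (fg * alpha + bg * (1.0 - alpha)).astype(np.uint8)


def recombine(fg_dir: Path, bg_dir: Path, out_dir: Path) -> None:
    """Pair each foreground frame with the matching background frame and composite."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for fg_path in sorted(fg_dir.glob("*.png")):   # e.g. frame_000001.png ...
        bg_path = bg_dir / fg_path.name            # same frame number in the background
        frame = composite_frame(iio.imread(fg_path), iio.imread(bg_path))
        iio.imwrite(out_dir / fg_path.name, frame)


if __name__ == "__main__":
    recombine(Path("foreground_hfr"), Path("background_neartime"), Path("review"))
```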

“Proxies were used as a smart solution to solving computer processing limits in the early days of digital compositing and colour grading,” Geissler commented. “Similarly, HFR StarTracker and NearTime for LED is a smart solution to shooting slo-mo shots in an LED volume.”

The HFR StarTracker for LED has been designed in anticipation of future GPU processing improvements enabling higher frame rate Unreal scene playback. In conjunction with Mo-Sys’ VP Pro XR content server, the HFR StarTracker’s programmable phase shift feature can utilise Brompton Technology’s Frame Remapping or Megapixel VR’s GhostFrame capability for simultaneous multi-camera shoots in an LED volume.
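For readers wondering how the time-multiplexed multi-camera idea works in principle, the sketch below computes per-camera shutter phase offsets so that each camera exposes a different sub-frame of the wall’s refresh. The equal-slot arithmetic is a simplifying assumption for illustration; Brompton’s Frame Remapping and Megapixel VR’s GhostFrame have their own configuration tools and constraints.

```python
"""Illustrative only: equal-slot phase offsets for a time-multiplexed
multi-camera LED shoot."""


def camera_phase_offsets(wall_refresh_hz: float, num_cameras: int) -> list[float]:
    """Return a shutter phase offset (in milliseconds) per camera, so each
    camera exposes a different sub-frame of the wall's refresh period."""
    period_ms = 1000.0 / wall_refresh_hz
    slot_ms = period_ms / num_cameras
    return [round(i * slot_ms, 3) for i in range(num_cameras)]


if __name__ == "__main__":
    # Example: three cameras sharing a wall refreshing at 144 Hz.
    print(camera_phase_offsets(144.0, 3))   # [0.0, 2.315, 4.63]
```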

NearTime for LED will be available in Q4 to Mo-Sys customers with VP Pro and VP Pro XR. The HFR StarTracker for LED will be available early Q1 2022.

Mo-Sys Refinery Offers a Unique VP Testing Facility for Cinematographers

We are delighted with the hugely successful launch of the Mo-Sys Refinery, our cutting-edge virtual production (VP) testing space, specifically designed to help experienced professionals test and evaluate virtual production.

Mo-Sys Refinery is a first-of-its-kind offering. The concept is to provide a testing space for cinematographers and content producers to de-risk and refine their virtual production concepts, allowing them to experiment with virtual production techniques and workflows over a number of days.

With both a green screen and an LED volume in one studio, we are inviting creatives to consider a hybrid virtual production methodology, where shots benefit from the strengths of each screen technology. Following on from the recent Grand Opening, these new ideas have been met with keen interest and excitement by LA-based camera people, VFX/VP supervisors, and the post-production community.

This new mix “is a suggestion we’re waiting for the industry to pick up,” Michael Geissler explained to Mark London Williams, who featured the combination of LED walls with traditional green screen technology on display at the event in an article discussing cinematography technology. The article went on to discuss what could be done to ensure the future safety of on-set crews following the tragic death of Halyna Hutchins (read the full article here).

We’re encouraging clients to test out this hybrid VP approach, incorporating both wide shots and close-up, detailed shots – using DMX controlled lighting – in a single environment.

The Refinery’s virtual production stage can be adapted to meet clients’ specific requirements, with full technical support provided on site. Mo-Sys Refinery also allows clients to collaborate with the post houses of their choice to bring in the creative services they need.

Mo-Sys Refinery, LA

This innovative VP space also offers new virtual production technology for clients to test and evaluate, including:

Real-time MoCap for virtual character programming – drive background virtual character movement using multiple-pass real-time MoCap.

Cinematic XR Focus – rack focus from the real world to the virtual world (a world first!) without changing the Focus Puller’s tools.

Scene-linked lighting – match real and virtual lighting using UE4 DMX controlled lighting.

Dual-screen VP shooting – shoot using both green screen and LED walls, using each screen technology to its strengths.

NearTime® – a brand new dual-render workflow enabling automated high-quality re-renders without the cost of post-production (Winner of the HPA Engineering Excellence Award 2021).

Colour matching – match the colours between the real set and the virtual world.

VP Pro XR – purpose-designed content server for XR volumes (LED set extensions), available as software-only or as a configured system.

StarTracker – the most accurate and resilient camera/lens tracking system.

Cine Gear 2021 Technical Award for Cinematic XR Focus

Mo-Sys Engineering has solved a major creative limitation when using an LED volume for virtual production. Cinematographers who need to pull focus between real foreground objects, such as actors, and virtual objects displayed on an LED wall, such as a deer far away in the virtual world (as in the example in the video below), have been unable to do so because the lens focal plane stops at the LED wall, meaning the deer always remains out of focus.

The new Cinematic XR Focus has been recognised in the Cine Gear 2021 Technical Awards for its innovative technology.

For the first time, an LED wall can be used as more than just a backdrop and can instead integrate with the real stage. Cinematographers can now seamlessly rack focus deep into the virtual world and create layered images to enhance their visual storytelling, saving time and money by combining shots in a way they are already familiar with.

James Uren, Technical Director at Mo-Sys and inventor of Cinematic XR, explains, “Traditionally, the LED wall has been a passive part of the wider set, but we are now empowering cinematographers to turn this feature into an asset that enhances and strengthens the overall story.”

This is achieved intuitively using the same wireless lens control system commonly used in film-making and is compatible with Preston wireless lens controllers (Hand Unit 3 and MDR-3). The lens controller is synchronized with the output of the Unreal Engine graphics, working with Mo-Sys’ StarTracker camera tracking technology to constantly track the distance between the camera and the LED wall.
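Conceptually, the hand-off can be thought of as splitting a single focus-pull value between the real lens and the virtual camera once the requested focus distance passes the wall plane. The sketch below illustrates that idea only; the names and the split logic are illustrative assumptions, not Mo-Sys’ actual algorithm.

```python
"""Conceptual sketch of a real-to-virtual focus hand-off at an LED wall.
Names and logic are illustrative assumptions, not Mo-Sys' algorithm."""

from dataclasses import dataclass


@dataclass
class FocusCommand:
    physical_focus_m: float   # distance sent to the real lens (via the hand unit)
    virtual_focus_m: float    # focus distance applied to the virtual camera, or 0.0


def split_focus(requested_focus_m: float, camera_to_wall_m: float) -> FocusCommand:
    """Decide how one focus-pull value is shared between the real lens and the
    virtual scene rendered behind the LED wall."""
    if requested_focus_m <= camera_to_wall_m:
        # Focus target is in front of the wall: the real lens does all the work.
        return FocusCommand(physical_focus_m=requested_focus_m, virtual_focus_m=0.0)
    # Focus target is 'behind' the wall: park the real lens on the wall plane and
    # let the engine defocus/refocus the rendered scene for the remaining distance.
    return FocusCommand(
        physical_focus_m=camera_to_wall_m,
        virtual_focus_m=requested_focus_m,
    )


if __name__ == "__main__":
    # Example: wall is 4 m away, focus puller racks from 2 m (actor) to 30 m (deer).
    for target in (2.0, 4.0, 30.0):
        print(target, split_focus(target, camera_to_wall_m=4.0))
```

The camera-to-wall distance in this sketch is exactly the quantity the article says StarTracker tracks continuously, which is why the hand-off can follow the camera as it moves.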

Cinematic XR Focus is just one of many key capabilities that Mo-Sys is adding to its VP Pro XR, a purpose-built scalable XR server, that enables the use of traditional shooting techniques within virtual productions.

See Cinematic XR Focus in action in the video below:

Highly Commended for Strictly Augmented Reality at AV Awards

On Friday 5 November, the AV Awards announced the 2021 winners. Mo-Sys is delighted to receive recognition in the Broadcast and Media Project of the Year category: Mo-Sys, Potion Pictures, BBC Studioworks and Lighting Designer David Bishop were Highly Commended for the augmented reality in the Strictly ballroom.

Click here to read more from the AV Awards announcement.

Strictly Augmented Reality

In the UK, Strictly is the most popular entertainment programme on television. For the 2020 season, the producers wanted to add even more glitz and glamour, and were considering upping the graphics game even before the full extent of the Covid-19 pandemic meant the production plan had to be completely rethought.

From steam trains to elephants, the 2020 series of Strictly Come Dancing went all out to provide the visual wow factor and make up for the lack of a live audience. Potion Pictures delivered a series of exciting and extravagant graphics using Mo-Sys StarTracker and VP Pro.

We found that Mo-Sys’s StarTracker technology and VP-Pro Plug-In was the perfect combination of hardware and software to bring AR to life on Strictly Come Dancing.  Mo-Sys also provided onsite support for every live show, and helped push the boundaries of what was achievable.  The results visibly improved on a weekly basis as the BBC production team grew increasingly ambitious with their ideas.  The Mo-Sys team of engineers and technicians were invaluable for providing expertise and enthusiasm for the show.

David Newton, Managing Director of Potion Pictures

Read more about the Strictly Case Study here.

Space Race: Infinite Possibilities of Virtual Production

Denis Villeneuve’s interpretation of the classic 1965 Frank Herbert novel will see Dune adapted to the big screen using ground-breaking artificial backdrops to bring Dune’s feudal interstellar society to life.

At the forefront of the film’s technology is DNEG, a London-based VFX house whose virtual production credits already included Interstellar and First Man – for which it received Academy Awards.

Driven by the impact of Covid-19, DNEG is continuing to develop through exciting partnerships. With Dimension, Unreal Engine, Arri, Mo-Sys, 80six, Roe and Brompton Technology, DNEG is shooting a proof-of-concept virtual production test, led by creative director Paul Franklin.

Click to read the full Definition article.

Head to DNEG here to learn more about the collaboration, go behind the scenes at the test shoot and even get a sneak peek at some final footage captured during the test.

The Faraway Nearby – Behind the Scenes

Head behind the scenes of The Faraway Nearby and find out how state-of-the-art virtual technology enables filmmakers to tell the story of Joseph Weber – the first scientist to explore the detection of gravitational waves. The Faraway Nearby team illustrate the power of virtual production, in its ability to bring the narrative to life in such a way that it becomes difficult to tell what is real and what is not.

The Mo-Sys StarTracker is the glue that pulls virtual production together. Accurate, reliable camera tracking is essential to virtual production. We set the Mo-Sys [StarTracker] up and it runs all day long with no problems. That’s what I want from a film shoot.

Todd Freese, Resolution Productions Group SVP/CTO

The Mo-Sys StarTracker camera tracking technology is relied upon to bring this production together, facilitating a level of creative and convincing storytelling that simply could not be conveyed through traditional documentary-making means.

Watch the full video below: