Mo-Sys Academy Virtual Production courses announced

Mo-Sys Academy announces new Virtual Production courses and aims to close the skills gap in the virtual production sector as it faces surging demand for trained technicians.

Mo-Sys Engineering today announces a new line-up of guided Virtual Production training courses. The expanded programme has been carefully developed and will be delivered by Mo-Sys Academy at its London HQ through summer 2022.

With limited availability, demand is expected to be exceptionally high from broadcast and film industry professionals wishing to gain valuable Virtual Production experience, from university lecturers looking to upskill, and from students, for what is set to be the most comprehensive practical Virtual Production training on the planet.

Multiple courses for all levels have been released, ranging from a three-day introduction to Virtual Production to an intensive ten-day Virtual Production foundation course. Delivered by skilled on-set technicians, the summer courses run from 15th June until 15th August 2022. Mo-Sys Academy training covers the entire Virtual Production spectrum: green screen, augmented reality (AR), computer-generated imagery (CGI), motion capture, XR and LED. Learning takes place in a supportive and friendly environment with small-group creative exercises throughout.
 
Course attendees will gain hands-on access to the latest Virtual Production tools and techniques, including working with the world’s leading camera tracking system, Mo-Sys StarTracker; understanding lighting requirements for green screen and LED production; and discovering how to run virtual productions using Unreal Engine as part of a workflow leveraging LED volumes for in-camera visual effects (ICVFX).
 
Demand for Virtual Production has exploded in recent years, and the industry requirement for experienced VP talent has grown in equal measure. Mo-Sys Academy has the unrivalled experience and knowledge to guide students to the forefront of the broadcast and film industry.

“There has been a boom in Virtual Production, and the greatest challenge facing the industry is finding people who understand LED volumes, on-set pre-visualization and XR shooting. These are relatively new techniques and there is a shortage of trained technicians who understand the unique challenges that come with this new and exciting way of creating content,” commented Michael Geissler, CEO of Mo-Sys. “Mo-Sys Academy was created to address the skills bottleneck the industry is facing, and to transfer the knowledge Mo-Sys has gained over the last 25 years.” 

Mo-Sys is also working with universities, such as the University of Sunderland, which recently announced a major £1.4m technology refresh. Mo-Sys partner CJP Broadcast Services has installed state-of-the-art Virtual Production technology, making Sunderland a powerhouse with standout media courses that will benefit students for years to come. In support of this upgrade to the latest LED volume production technology and tools, Mo-Sys Academy provided Virtual Production training for university staff.
 
Nicholas Glean, Senior Lecturer in Video and New Media at the University of Sunderland, added: “This two-week course was brilliant! From the first day to the last it was packed with information and fantastic knowledge delivered by welcoming and friendly tutors in Juliette and Dominic. This was supported by experts who came into our sessions and helped us reach another level of understanding. I cannot recommend this course enough to university departments thinking about installing or who already have Mo-Sys technology. The course takes Virtual Production from theory into practical reality. Before the course, I had no prior experience in Virtual Production and was extremely nervous. After the course, I feel incredibly confident about working in Virtual Production.”

For more information, please visit Mo-Sys Academy.

Creatives and Directors: What Can You Achieve Creatively in Virtual Production?

Mo-Sys Engineering’s commercial director Mike Grieve on how virtual production can elevate creativity and save resources

In the virtual world, possibilities are endless. Unlike real sets where you’re limited to the physical attributes of set design, virtual sets are built in Unreal Engine where you can be anywhere and have anything. Creatively, it breaks you free from budget, time and location limitations.

Stop Fixing in Post, Solve It in Pre

One of the key attributes of virtual production is the ability to pre-visualise everything before you even get to set. You can “walk” through your virtual scene and make decisions on where the best camera angles are, change lens types and adjust lighting. And with everyone from the director and producer to the cinematographer and VFX supervisor able to look at the same 3D scene together from anywhere in the world, decisions can be made far more quickly and easily. So when you turn up on the day, all you need to do is light and shoot.

You don’t get that level of foresight on a physical shoot. Virtual production swaps basic preparation and fixing things in post for high-level prep: solving things in pre-production.

Not only that, but now that talent can actually see the virtual set around them – using an LED volume – rather than imagining where they need to look and interact using a green screen, you can shoot far more accurately. This helps avoid errors on things like eyelines between talent and virtual elements.

When you look at the whole production process, from pre-production to the actual deliverable, virtual production shrinks the overall production time and costs by reducing post-production needs. The bottom line is, it’s better to solve problems in pre than try to fix them in post.

Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios

Shoot In-Camera Effects in Real-Time

The quality of the 3D scene created for a virtual production shoot is always very high. But when the scene is loaded into the computer stack running Unreal Engine and camera tracking is attached, the scene more often than not doesn’t play back in real time, because it can’t be processed fast enough.

When this happens, the scene needs to be ‘optimised’, which is a bit like video compression shrinking down a file size. As the processing load goes down, the frame rate comes up, allowing the scene to play back in real time so the real-time VFX shoot can happen.

The problem then is that the quality level of the Unreal scene is capped: if you try to add any more quality, the frame rate drops below real time and you can’t shoot in-camera effects. This is a well-known problem.
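
As a rough illustration of the constraint (our own arithmetic sketch, not Mo-Sys tooling): at a given shooting frame rate the renderer has a fixed time budget per frame, and any extra scene detail that pushes render time over that budget breaks real-time playback.

```python
def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame at the target frame rate."""
    return 1000.0 / fps

def is_real_time(render_time_ms: float, fps: float = 24.0) -> bool:
    """A scene only plays back in real time if every frame fits the budget."""
    return render_time_ms <= frame_budget_ms(fps)

# At 24 fps the budget is ~41.7 ms per frame. A scene averaging 35 ms
# plays back in real time; adding detail that pushes it to 50 ms drops
# playback below real time, and the scene must be optimised again.
print(frame_budget_ms(24.0))   # ~41.67
print(is_real_time(35.0))      # True
print(is_real_time(50.0))      # False
```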

What normally happens is that a director or producer then has to decide which shots will need to go to post-production for compositing, to increase the quality of the background. That takes time and money. It also goes against the whole principle of virtual production, which aims to cut compositing time as much as possible.

At Mo-Sys, we’ve patented a solution to this called NearTime. It’s a service that runs in parallel with a real-time VFX LED shoot, automatically re-rendering the background virtual scene at higher quality so it can be composited back together with the keyed talent. This lets you deliver a much higher quality product in the same delivery window.

So as soon as you start the camera for the first shot, all of the tracking and lens data from the camera is streamed to the cloud, where the same Unreal scene that you’re shooting exists on 50 to 100 servers. Then all the quality dials are wound up and each take is re-rendered sequentially as the real-time shoot goes on. It allows you to deliver higher resolution background graphics, faster and automatically, saving money and time.
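
The general pattern can be sketched as a parallel job queue (our illustration under assumed names such as Take and rerender_take, not Mo-Sys’s actual pipeline): each take’s recorded tracking and lens data becomes an independent re-render job that can be farmed out to cloud render nodes while the shoot carries on.

```python
import concurrent.futures
from dataclasses import dataclass, field

@dataclass
class Take:
    """Per-take data recorded during the real-time shoot."""
    take_id: str
    scene: str                                    # Unreal scene mirrored in the cloud
    tracking: list = field(default_factory=list)  # per-frame camera poses
    lens: list = field(default_factory=list)      # per-frame focus/iris/zoom data

def rerender_take(take: Take, quality: str = "high") -> str:
    """Stand-in for a cloud render job: replay the recorded camera path
    through the same scene with all the quality settings wound up."""
    return f"{take.take_id}_background_{quality}.exr"

# Dispatch each finished take to a pool of workers while shooting continues;
# a real deployment would use 50-100 cloud render nodes rather than threads.
takes = [Take("take_001", "stage_a"), Take("take_002", "stage_a")]
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    for result in pool.map(rerender_take, takes):
        print("re-rendered background ready:", result)
```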

Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios

Embrace Change and Dive In

As virtual production is still fairly new to most creatives and directors, there is an element of getting used to new ways of working. Lighting, for example, is handled differently on a virtual set. When you’ve got real talent lit by hard and soft lighting, and an LED wall with different lighting characteristics displaying the background scene, everything needs to match in order to look like part of the same set from the camera’s perspective. Fortunately, on-set colour grading is about to get a boost, which will be music to the ears of cinematographers who have already shot in an LED volume.

At the moment, the biggest challenge lies in the quality of the Unreal scene. On a virtual set, there are two types of video that you can display on the LED wall. The first is video plate playback, used for things like car scenes where the vehicle is moving quickly down a street: the car is static in the virtual studio but the video is moving. Those scenes are very high quality because they are shot with multiple high-quality cameras on a rig designed to capture a rolling 360-degree view.

But then you have the Unreal scene using virtual graphics. This is where you need camera tracking on the real camera to match it to the virtual scene displayed on the wall. The quality of these virtual graphics is very good, but it’s not quite as good as post-production compositing just yet. This is where our NearTime technology can help.

And finally, you’ve got the challenge of continuity when changing elements or editing Unreal scenes live on set. Imagine you’re on a virtual set and suddenly you decide to move one of the objects on the LED volume to the other side of the screen. When you change something, you need to log what you’ve changed, as it always has a downstream impact on the shoot, and problems arise if you later have to remember which other scenes need updating as a result. This is something Mo-Sys is working on solving very soon, with technology that allows on-set real-time editing of Unreal scenes and automatically captures and logs the revisions. Watch this space!
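
Conceptually, the fix is a change log: every live edit is recorded with enough context to find everything it affects. Here is a minimal sketch of the idea (our illustration only; the class and field names are hypothetical, as Mo-Sys has not published its implementation):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SceneEdit:
    """One live, on-set change to an element of an Unreal scene."""
    scene: str
    object_name: str
    prop: str
    old_value: object
    new_value: object
    timestamp: datetime

class ChangeLog:
    """Records every live edit so downstream impacts can be found later."""
    def __init__(self):
        self.edits = []

    def record(self, scene, object_name, prop, old, new):
        self.edits.append(
            SceneEdit(scene, object_name, prop, old, new,
                      datetime.now(timezone.utc)))

    def scenes_needing_review(self, all_scenes):
        """Return scenes that reference any object edited so far."""
        edited = {e.object_name for e in self.edits}
        return [s for s, objects in all_scenes.items() if edited & objects]

log = ChangeLog()
log.record("ballroom", "chandelier", "position_x", -2.0, 2.0)
print(log.scenes_needing_review({"ballroom": {"chandelier"},
                                 "finale": {"chandelier"},
                                 "intro": {"piano"}}))
# ['ballroom', 'finale'] - both scenes show the moved chandelier
```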

Mo-Sys at NAB Show 2022

Mo-Sys had a strong presence at this year’s NAB Show, highlighting our position as a pioneer in driving forward the emerging technologies of virtual and augmented production. During the show we showcased our leading-edge virtual production technology stack and underlined why some of the world’s most innovative content producers rely on our solutions.

Mo-Sys co-exhibited with APG and Fujifilm, integrating our technology with Fujifilm lenses and a state-of-the-art 1.5mm pixel pitch LED wall from APG to demonstrate our solutions in action. See the demonstration below, filmed by KitPlus:

Mo-Sys, Fujifilm and APG Media LED Virtual Production Showcase at NAB 2022

Among the solutions being highlighted:

  • NearTime® – our patent-pending and HPA Engineering Award-winning solution for increasing real-time VFX image quality, whilst removing Moiré completely
  • Multi-Cam Switching – our newly released VP Pro XR feature enabling seamless multi-camera switching to overcome the typical 5-6 frame refresh delay in LED volumes
  • Robotics for virtual production – the new Mo-Sys G30 gyro-stabilized head and the industry-standard Mo-Sys L40 cinematic remote head
  • AR for sports – the new heavy-duty Mo-Sys U50 remote head was shown with Fujifilm’s latest box lens using a Vinten 750i remote head with pan bars
Florian Gallier presents Mo-Sys solution for remote production and sports events

We joined forces with AOTO Electronics Co. Ltd. to showcase solutions aimed at broadcast news and factual programming. Visitors to the AOTO stand were able to see a live LED Virtual Production demonstration featuring VP Pro XR and StarTracker working with AOTO’s 2.3mm pixel pitch LED tiles. See the video below:

Mo-Sys and AOTO showcase broadcast LED Virtual Production Solutions

This video features the BBC’s Tokyo 2020 virtual set jointly designed by Lightwell & BK Design Projects with AV integration by Moov.

We illustrated how set extensions can be used to expand the studio space, and how the newly released Multi-Cam Switching feature can be used to seamlessly switch between live cameras at resolutions up to UHD4K, without the LED wall delay appearing in shot.
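
In principle, the timing fix can be sketched as follows (a simplified illustration of one plausible approach, not Mo-Sys’s published design; the function and field names are hypothetical): the wall is re-pointed to the incoming camera first, and the program cut is deferred by the wall’s refresh delay so the stale background never appears in shot.

```python
WALL_REFRESH_DELAY_FRAMES = 6   # typical 5-6 frame LED volume refresh delay

def schedule_camera_switch(current_frame: int, new_camera: str) -> dict:
    """Re-point the wall immediately, but defer the program cut until
    the LED wall is displaying the new camera's perspective."""
    return {
        "camera": new_camera,
        "retarget_wall_at": current_frame,  # render for the new camera now
        "cut_program_at": current_frame + WALL_REFRESH_DELAY_FRAMES,
    }

switch = schedule_camera_switch(current_frame=1000, new_camera="cam_2")
print(switch)
# By frame 1006 the wall shows cam_2's perspective, so the stale
# background rendered for cam_1 never appears in the cut shot.
```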

In a ground-breaking move, Mo-Sys also showed its integration with Erizos Studio, allowing Unreal Engine graphics to be used not just for the virtual studios, but for traditional on-screen graphics, such as lower thirds and data-driven graphics, either embedded in the Unreal scene or as AR objects.

We also had a presence at the new Vū Technologies studio in Las Vegas, where StarTracker forms a key part of its 140ft x 20ft LED volume soundstage.

Around and about at NAB 2022

Tom Shelburne gives an introduction to Mo-Sys VP Pro XR
Jim Rider from Final Pixel talks about the Mo-Sys StarTracker
Carl Bodeker introduces the Vinten FP-188 Robotic Pedestal with Mo-Sys StarTracker
Chris Tornow, CEO of Pfinix Creative Group
Gary Adcock talks about Virtual Production with Mo-Sys
Tim Moore from Vu Technologies talks about the ‘rock solid’ StarTracker

Mo-Sys Highlights The Benefits Of Virtual Production At ISE 2022

Mo-Sys Engineering will highlight its comprehensive Virtual Production technology stack at Integrated Systems Europe (ISE) 2022, where it is co-exhibiting with its partner, European LED screen manufacturer, ALFALITE, and its Spanish reseller, Tangram Solutions.

LED Virtual Production

Visitors to the show will see Mo-Sys’ flagship StarTracker precision camera and lens tracking technology and VP Pro XR in action. The Mo-Sys StarTracker tracking system is the technology of choice for high-end TV, broadcast and film Virtual Productions. The system will be shown working in tandem with an ALFALITE 4.5m x 4m, 1.9mm pixel pitch LED volume.

“As we saw from the recent Tokyo and Beijing games, live broadcast can draw huge benefits from integrating Virtual Production tools and techniques,” said Philippe, Director of Sales and Business Development for Mo-Sys. “Together with ALFALITE we deliver a complete, fully functional system from one supplier, making it easy for customers to implement the technology rapidly and immediately start leveraging the advantages. Our aim is to make extraordinarily complex processes simple and straightforward so that content producers can keep their focus on what is important – being creative.” 

Mo-Sys will present an immersive live demonstration of its end-to-end LED production workflow. The team will demonstrate StarTracker and highlight the benefits of VP Pro XR, Mo-Sys’ award-winning LED content server solutions, for broadcast Virtual Production, such as Multi-Cam, the unique ability to seamlessly switch cameras at full UHD4K resolution. By partnering with ALFALITE, Mo-Sys can now offer broadcasters and production companies ModularPix Pro LED tile modules in a comprehensive package for a complete LED volume, multi-camera production system. 

Luis Garrido Fuentes, Executive Director at ALFALITE, added: “Virtual Production can bring real innovation to the table for broadcasters and production companies. Working together with a pioneer such as Mo-Sys delivers enormous value to our customers, presenting them with a new and exciting way to create immersive and engaging content that is highly cost-effective.”

Victor Ceruelo, CEO, Tangram Solutions, commented: “There is a real buzz of excitement from broadcasters and production companies in Spain over what can be achieved with Virtual Production techniques, and we are proud to be able to offer them access to cutting-edge Virtual Production solutions from Mo-Sys.”

Visit Mo-Sys, ALFALITE and Tangram Solutions at stand 6K650 during ISE 2022. 

NAB Show Daily – Advances in Production

Over the last 18 months, the boom in virtual production has shaken up the way filmmakers, broadcasters, mobile video designers and more conduct their creative business. Those technologies enabling virtual production include augmented reality (AR), virtual reality (VR), extended reality (XR) and mixed reality (MR). Susan Ashworth of TV Tech reports for the NAB Show Daily Day1 edition.

Green Screen Comes to the Lecture Hall

University of the Netherlands

Virtual reality is enlivening the lecture hall. The University of the Netherlands worked with Zero Density to create a massive virtual lecture hall using the company’s Reality Engine real-time broadcast compositing system.

Using the virtual space, scientists and professors have access to interactive storytelling techniques and advanced visualization methods — complete with real-time realistic reflections and refractions of physical objects and people inside the green screen. Professors are able to explain complex matters on a large virtual screen or run physical experiments with immersive real-time graphics.

The space employs Grass Valley LDX 86N 4K cameras alongside Mo-Sys camera tracking technology while the green screen lecture space itself runs on two Reality Engines.

London Welcomes Virtual Production

Garden Studios, London

When Garden Studios in London finalized its virtual production stage in 2021, the result was a 4,800-square-foot space that could serve as a cost-effective filming option with unique creative opportunities. The new virtual production studio at Garden Studios allows filmmakers to shoot real-time virtual effects on set by using virtual graphics displayed on an LED volume (an enclosed space where motion capture and compositing can take place) to create photo-realistic backdrops.

Among the technologies in place are the VP Pro XR server and StarTracker camera and lens tracking systems from Mo-Sys Engineering. Features within VP Pro XR include an Unreal Engine editor interface and a capability known as Cinematic XR Focus, which allows a filmmaker to pull focus between talent or objects in the physical studio and virtual objects positioned in the LED volume.
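
One way to picture such a focus pull is as a handover at the wall (a simplified sketch based on the description above, not the actual Mo-Sys implementation; split_focus is a hypothetical helper): while the focus target sits in front of the LED wall the physical lens does the work, and once it passes the wall the lens holds at the wall plane while a matching defocus is applied to the virtual scene.

```python
def split_focus(target_m: float, wall_m: float):
    """Split one focus-pull value between the real lens and the virtual scene.

    target_m: where the operator wants focus, measured from the camera.
    wall_m:   distance from the camera to the LED wall.
    Returns (physical_lens_focus_m, virtual_focus_m or None).
    """
    if target_m <= wall_m:
        # Target is in the physical studio: the real lens focuses normally
        # and the virtual scene stays sharp at the wall plane.
        return target_m, None
    # Target sits "inside" the virtual world: hold the lens on the wall so
    # the screen itself stays sharp, and defocus the rendered scene as if
    # the camera were focused target_m into the combined space.
    return wall_m, target_m

print(split_focus(2.0, 4.0))    # (2.0, None)  focus on real talent
print(split_focus(12.0, 4.0))   # (4.0, 12.0)  focus on a distant virtual object
```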

According to the company, this means an LED volume can be used as more than just a backdrop, delivering better interactivity between real and virtual elements. Jillian Sanders, virtual production coordinator for Garden Studios, said the Mo-Sys team came to the studio for a multitude of tests, including displaying digital tracking markers on the studio’s LED ceiling.

“We’ve also been able to assist with testing and development of their new VP Pro XR,” she said. “This exciting new tool allows for features such as digital set extensions, the ability to focus past the LED wall into the digital world, and near time rendering.”

Read the full article here >

Background to the Lost StarTracker

On 26 August last year, StarTracker (serial no 32-050-417) was packaged and entrusted to FedEx for delivery to Chicago.

The StarTracker never got there and was last tracked by FedEx to Michigan. Mo-Sys had to send a replacement unit immediately so the client could start their project on time. At the same time, an investigation was started by FedEx that ultimately led nowhere!

Eight months later, a StarTracker was offered for sale by Jon Morris on various Facebook virtual production groups. We all assumed at Mo-Sys that it was the one lost by FedEx, and therefore stolen. As a result, we stated publicly that it would not be supported by Mo-Sys, which effectively stopped anyone from buying it.

We informed the local US police department, who got in touch with Jon and were told that the goods had been officially purchased as part of a bulk lot of lost shipments.

We then found ourselves in a ‘no-win’ situation, with legal action against FedEx the only option, knowing that if we went down that route there was a very strong chance FedEx would refer us to the small print in their T&Cs indicating that either they weren’t responsible for shipments failing to reach the intended recipients, or they weren’t responsible for contacting the sender, even with the sender’s address displayed both inside and outside the box.

So we reached out to Jon and proposed a way forward that would be mutually beneficial for all parties. The result is that Mo-Sys will now support Jon in selling the StarTracker.

Whoever purchases this StarTracker can be assured of full Mo-Sys support in setup and calibration. We will also provide Mo-Sys VP Pro software free of charge to complement the StarTracker, enabling it to work in a green screen studio or an LED volume.

The warranty will restart from the moment the new owner receives the StarTracker, and the Mo-Sys remote support team will upgrade it to the latest software and welcome the new owner to our Mo-Sys virtual production family.

The only condition we and Jon have is that the successful bidder makes two transfers: 50% of the winning bid to Jon Morris and the other 50% to Mo-Sys Engineering Ltd. (we’ll supply all bank transfer details).

We think this is the best solution for all concerned. Thanks once again for your help. Respond to Jon on social media groups, or email him directly (jonmorris1023@gmail.com).

Happy bidding!

Strictly Augmented Reality

It is one of the greatest global television hits of the last 15 years. First seen as Strictly Come Dancing, produced by the BBC, it is now licensed around the world as Dancing with the Stars, or its equivalent in local languages, and is massively popular.

In the UK, Strictly is the most popular entertainment program on television. For the 2020 season, the producers wanted to add even more glitz and glamour, and were considering upping the graphics game even before the full extent of the Covid-19 pandemic meant the production plan had to be completely rethought.

Graphics specialist company Potion Pictures was already a part of the Strictly production team, building excitement through the screens which are part of the set, and through the dramatic floor projections. Without the usual enthusiastic studio audience, it was clear that the graphics were going to play an even bigger part than usual in building the atmosphere.

Because the programme is broadcast live, there is a very short period between each dance in which to put specific furniture and props in place. Social distancing requirements meant a smaller crew on the studio floor, so physical props had to be cut to an absolute minimum. The producers turned to augmented reality (AR) to make the scenes sparkle.

From steam trains to elephants, the 2020 series of Strictly Come Dancing went all out to provide the visual wow factor and make up for the lack of a live audience. In this case study session for MPTS, Potion Pictures explained how they delivered their most exciting and extravagant graphics to date to bring the series to life using Mo-Sys StarTracker and VP Pro.

Mo-Sys had previously worked with Potion Pictures on a proof-of-concept (POC) in another studio at BBC Elstree. This gave the creative and technical team the opportunity to see exactly what could be achieved with live real-time AR graphics, and opened their eyes to what was possible.

There are layers of complexity in adding AR to Strictly. First, it is primarily a dance competition, so Mo-Sys could not get in the way. The digital objects had to move to match the choreography. More importantly, nothing could obscure the dancers: if the voting audience could not see what the partners were doing, it could have huge consequences for the fairness of the judging.

Second, the show was already full of shine and sparkle, and nothing could affect that. If the video processing to composite real and virtual elements together degraded the live pictures, that too would be unacceptable.

Third, whatever Mo-Sys did had to be in absolute real time, with the minimum of latency. The show depends on music, and the audience would not tolerate pictures out of time with the sound.

The system built for Strictly used the Mo-Sys VP Pro software, designed to integrate directly into Epic Games’ Unreal Engine editor interface. This system provided the real-time compositing and 3D graphics rendering of Potion Pictures’ creative vision.

Camera tracking was provided using Mo-Sys’ StarTracker 6-axis tracking system. The augmented reality images were linked to two of the live cameras: one on a crane, the other on a Steadicam. These are the most challenging camera mounts, which added to the excitement.

The way that the production decided to use AR was to introduce elements at the start and end of the dance. A whole range of virtual graphics was created for the series, from an ice castle and a Barbie house to a train and – bizarrely – an elephant.

For augmented reality to be completely convincing there has to be no visible join between the real and virtual worlds. The graphics have to be rooted firmly in position. Strictly was a very testing environment for the Mo-Sys technology; the combination of a dark studio ceiling (where the camera tracking markers are located), rapidly changing lighting and glitterball reflections meant that this was probably the sternest test of tracking accuracy and integrity possible. It was agreed that there were three key requirements if the virtual graphics illusion was going to work and be maintained:

  • ultra-precise camera tracking on all six axes of movement
  • extremely accurate lens calibration, so the virtual graphics are distorted to exactly match the camera lens attributes, particularly during zoom shots
  • excellent synchronisation – the camera frequently moves quickly to keep the dancers perfectly framed, and the virtual graphics had to keep up (a simplified sketch of this timing alignment follows below)
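
That third requirement can be sketched simply (our illustration only; a real system works at sub-frame precision with genlocked hardware and full 6-axis poses plus lens data): every tracking sample is timestamped, and the renderer looks up, or interpolates, the camera pose for the exact instant each video frame was exposed.

```python
from bisect import bisect_left

def pose_at(frame_time, samples):
    """Linearly interpolate a (one-axis, for brevity) camera pose at frame_time.

    samples: time-ordered (timestamp_s, pose_value) pairs from the tracker.
    """
    times = [t for t, _ in samples]
    i = bisect_left(times, frame_time)
    if i == 0:
        return samples[0][1]
    if i == len(samples):
        return samples[-1][1]
    (t0, p0), (t1, p1) = samples[i - 1], samples[i]
    a = (frame_time - t0) / (t1 - t0)
    return p0 + a * (p1 - p0)

# The tracker samples faster than video: the pose for a frame exposed at
# t = 0.021 s falls between the samples taken at t = 0.020 and t = 0.024.
track = [(0.000, 0.0), (0.004, 0.4), (0.008, 0.8), (0.012, 1.2),
         (0.016, 1.6), (0.020, 2.0), (0.024, 2.4)]
print(pose_at(0.021, track))    # ~2.1
```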

Mo-Sys delivered on all three. Indeed, it was the Mo-Sys synchronisation above all else that realised this ambition; without it, the results could not have been achieved.

Mo-Sys Technical Director James Uren supervised the system over the three studio days for every episode. That gave the company a remarkable insight into how real, creative production people want to use Mo-Sys equipment, pushing it to the limit. It meant continually refining what could be done, particularly in association with Unreal Engine.

The ice castle featured some incredible refractions, driving the development of ever-better translucency on particle effects. The models had dynamic shadows, again to help them sit naturally in the live studio. These and other continuing improvements came from an ongoing understanding of exactly what Strictly was trying to achieve, and from interfacing with Epic Games to make even better use of the very latest Unreal Engine features.

Besides the AR graphics, Potion Pictures also used a second type of tracked real-time graphics enhancement. In previous series of Strictly, viewers would have seen the floor displaying a ‘drop-away’ 3D graphic of the top of a building, on which the dancers would appear to perform their routines. However, the camera never moved, and the 3D visual impression was more symbolic than photo-realistic. This series, however, used an off-axis tracked camera, again utilising the Mo-Sys StarTracker system. This time the camera could move, giving the very realistic impression that the dancers were performing on top of a building.

This series of Strictly produced several key technological ‘firsts’ for virtual production use in a live broadcast:

  • The use of Epic Games’ Unreal Engine natively to produce the real-time graphics
  • The use of Mo-Sys’ VP Pro plug-in software to deliver the real-time compositing and synchronisation
  • The use of Unreal Engine’s image distortion features to show the real studio distorted through ice AR graphics

It is fair to say that AR on Strictly was a learning experience for everyone. But there is no doubt that it added yet more glamour and visual interest to this iconic programme, at a surprisingly low cost. With minimal addition to the programme budget, AR added a huge amount to the visual richness.