Mo-Sys Academy Virtual Production courses announced

Mo-Sys Academy announces new Virtual Production courses, aiming to close the skills gap as the virtual production sector faces surging demand for trained technicians.


Mo-Sys Engineering today announces a new line-up of guided Virtual Production training courses. The expanded programme has been carefully developed and will be delivered by Mo-Sys Academy at the company’s London HQ through summer 2022.

With limited availability, demand is expected to be exceptionally high from broadcast and film industry professionals wishing to gain valuable Virtual Production experience, university lecturers looking to upskill, and students alike, for what is set to be the most comprehensive practical Virtual Production training available.


Multiple courses for all levels have been released, ranging from a three-day introduction to Virtual Production to an intensive ten-day foundation course. Courses are delivered by skilled on-set technicians, with summer dates running from 15th June to 15th August 2022. Mo-Sys Academy training spans the entire Virtual Production spectrum: green screen, augmented reality (AR), computer-generated imagery (CGI), motion capture, XR and LED. Learning takes place in a supportive and friendly environment with small-group creative exercises throughout.
 
Course attendees will gain significant access to the latest Virtual Production tools and techniques, including working with the world’s leading camera tracking system, Mo-Sys StarTracker; understanding lighting requirements for green screen and LED production; and discovering how to run virtual productions using Unreal Engine as part of a workflow leveraging LED volumes for in-camera visual effects (ICVFX).
 
Demand for Virtual Production has exploded in recent years and, with it, the industry’s requirement for experienced VP talent has grown in equal measure. Mo-Sys Academy has the unrivalled experience and knowledge to guide students to the forefront of the broadcast and film industry.

“There has been a boom in Virtual Production, and the greatest challenge facing the industry is finding people who understand LED volumes, on-set pre-visualization and XR shooting. These are relatively new techniques and there is a shortage of trained technicians who understand the unique challenges that come with this new and exciting way of creating content,” commented Michael Geissler, CEO of Mo-Sys. “Mo-Sys Academy was created to address the skills bottleneck the industry is facing, and to transfer the knowledge Mo-Sys has gained over the last 25 years.” 

Mo-Sys is also working with universities, such as the University of Sunderland, which recently announced a major £1.4m technology refresh. Mo-Sys partner CJP Broadcast Services has installed state-of-the-art Virtual Production technology, making Sunderland a powerhouse with standout media courses that will benefit students for years to come. In support of this upgrade to the latest LED volume production technology and tools, Mo-Sys Academy provided Virtual Production training for university staff.
 
Nicholas Glean, Senior Lecturer in Video and New Media at the University of Sunderland, added: “This two-week course was brilliant! From the first day to the last it was packed with information and fantastic knowledge delivered by welcoming and friendly tutors in Juliette and Dominic. This was supported by experts who came into our sessions and helped us reach another level of understanding. I cannot recommend this course enough to university departments thinking about installing or who already have Mo-Sys technology. The course takes Virtual Production from theory into practical reality. Before the course, I had no prior experience in Virtual Production and was extremely nervous. After the course, I feel incredibly confident about working in Virtual Production.”

For more information, please visit Mo-Sys Academy.

Mo-Sys and GMS International ink partnership deal

The two companies will meet the surging demand for Virtual Production technology in South Korea.

Mo-Sys Engineering today announces that it has signed a partnership with South Korean dealer GMS International to make its cutting-edge augmented reality and Virtual Production solutions available to cinematographers and broadcasters in the country. The agreement will help to meet growing demand for Virtual Production in Korea’s dynamic media market, one of the leading early adopters of Virtual Production. 

GMS International has over 20 years’ experience of supporting broadcasters and media organisations in the South Korean market from its headquarters on the outskirts of Seoul. The GMS team has extensive knowledge of working with some of the major names in the Korean media world and delivers consultancy, installation, and system design services. 

Mo-Sys has been a driving force in the use of real-time augmented reality and virtual studios in broadcast, and in the growing use of LED walls and LED volumes for movie production and live events. Its unrivalled StarTracker system embodies Mo-Sys’ expertise in camera tracking, and the company delivers complete end-to-end systems through its VP Pro XR media server.

“South Korea is recognised as one of the markets driving forward Virtual Production adoption and as a pioneer in this area, Mo-Sys has the tools and technology to meet the rise in demand,” said Michael Geissler, CEO of Mo-Sys. “The combination of GMS’ knowledge of the market and our own expertise of Virtual Production makes for a strong synergy and together we can match the needs of customers to the right tools that allow them to create amazing content in new and exciting ways.” 

The agreement with GMS will see Mo-Sys StarTracker and the VP Pro XR system made available to customers in Korea. Mo-Sys technology integrates seamlessly with major LED volume providers, including ALFALITE, and addresses some of the major challenges that arise with live Virtual Production and LED volume integration.

Hyungjun Kim, CEO of GMS International commented: “Mo-Sys’ unique technology and solutions meet the Virtual Production needs of the high-end content producers here in South Korea better than any other provider, allowing customers to express their creativity with no constraints. We look forward to a successful and fruitful partnership with them”.

Creatives and Directors: What Can You Achieve Creatively in Virtual Production?

Mo-Sys Engineering’s commercial director Mike Grieve on how virtual production can elevate creativity and save resources

In the virtual world, possibilities are endless. Unlike real sets where you’re limited to the physical attributes of set design, virtual sets are built in Unreal Engine where you can be anywhere and have anything. Creatively, it breaks you free from budget, time and location limitations.

Stop Fixing in Post, Solve It in Pre

One of the key attributes of virtual production is the ability to pre-visualise everything before you even get to set. You can “walk” through your virtual scene and make decisions on where the best camera angles are, change lens types and adjust lighting. And with everyone from the director and producer to the cinematographer and VFX supervisor able to be together, looking at the same 3D scene from anywhere in the world, decisions can be made far more quickly and easily. So when you turn up on the day, all you need to do is light and shoot.

You don’t get that level of foresight on a physical shoot. Virtual production swaps basic preparation and fixing things in post for high-level prep that solves things in pre-production.

Not only that, but now that talent can actually see the virtual set around them – using an LED volume – rather than imagining where they need to look and interact using a green screen, you can shoot far more accurately. This helps avoid errors on things like eyelines between talent and virtual elements.

When you look at the whole production process, from pre-production to the actual deliverable, virtual production shrinks the overall production time and costs by reducing post-production needs. The bottom line is, it’s better to solve problems in pre than try to fix them in post.

Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios

Shoot In-Camera Effects in Real-Time

The quality of the 3D scene created for a virtual production shoot is always very high. But when the scene is loaded into the computer stack running Unreal Engine and camera tracking is attached, the scene more often than not doesn’t play back in real-time, because it can’t be processed fast enough.

When this happens, the scene needs to be ‘optimised’, which is a bit like video compression shrinking down a file size. As the processing load goes down, the frame rate comes up, allowing the scene to play back in real-time so the real-time VFX shoot can happen.

The problem then is that the quality level of the Unreal scene is capped: if you try to add any more quality, the frame rate drops below real-time and you can’t shoot in-camera effects. This is a well-known problem.
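
To put numbers on that trade-off, here is a minimal sketch of the frame-budget arithmetic; the 25 fps target and the per-frame render times are illustrative assumptions, not measurements from any real scene.

```python
# Minimal sketch of the real-time frame budget trade-off.
# The frame rate and render times below are illustrative, not measured.

def frame_budget_ms(fps: float) -> float:
    """Render budget per frame, in milliseconds, for a target frame rate."""
    return 1000.0 / fps

def plays_in_real_time(render_time_ms: float, fps: float = 25.0) -> bool:
    """A scene only plays back in real-time if every frame renders
    within the budget the target frame rate imposes."""
    return render_time_ms <= frame_budget_ms(fps)

print(frame_budget_ms(25.0))          # 40.0 ms per frame at 25 fps
print(plays_in_real_time(55.0))       # False -> scene must be optimised
print(plays_in_real_time(38.0))       # True  -> real-time ICVFX is possible
```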

What normally happens is that a director or producer then has to decide which shots must go to post-production for compositing to raise the quality of the background. That takes time and money. But not only that, it actually goes against the whole principle of virtual production, which aims to cut down compositing time as much as possible.

At Mo-Sys, we’ve patented a solution to this called NearTime. It’s a service that runs in parallel with a real-time VFX LED shoot, automatically re-rendering the background virtual scene at higher quality so it can be composited back together with the keyed talent, letting you deliver a much higher quality product in the same delivery window.

So as soon as you start the camera to do the first shot, all of the tracking and lens data from the camera is sent up into the cloud, where the same Unreal scene that you’re shooting exists on 50 to 100 servers. Then all the quality dials are wound up, and each take is re-rendered out sequentially as the real-time shoot goes on. It allows you to deliver higher resolution background graphics, faster and automatically, saving money and time.
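
As a rough illustration of how such a workflow hangs together, here is a hedged sketch; every name and data structure below is a hypothetical stand-in, not the actual NearTime service or its API.

```python
# A hedged sketch of a NearTime-style parallel re-render workflow.
# Every name below is a hypothetical illustration, not the Mo-Sys API:
# on a real shoot the pose and lens samples come from the tracking system
# and lens encoders, and the job goes to a cloud farm holding the scene.

import random
from dataclasses import dataclass, field

@dataclass
class FrameMetadata:
    frame: int
    camera_pose: tuple   # (x, y, z, pan, tilt, roll) from the tracking system
    focus_mm: float      # lens focus distance from the lens encoder

@dataclass
class Take:
    name: str
    scene_id: str
    frames: list = field(default_factory=list)

def record_frame(take: Take, frame: int) -> None:
    """During the live shoot, log tracking and lens data for every frame
    (simulated here with random values)."""
    pose = tuple(random.uniform(-1.0, 1.0) for _ in range(6))
    take.frames.append(FrameMetadata(frame, pose, random.uniform(1000, 8000)))

def submit_neartime_job(take: Take) -> dict:
    """Hand the recorded metadata to the render farm, with the scene's
    quality settings wound up past real-time limits. Returns a job
    descriptor; a real submission would be a network call."""
    return {"scene": take.scene_id, "take": take.name,
            "frames": len(take.frames), "quality": "maximum"}

take = Take(name="take_001", scene_id="desert_vista")
for f in range(240):                    # a 10-second take at 24 fps
    record_frame(take, f)
print(submit_neartime_job(take))
# The higher quality render is later composited with the keyed talent,
# inside the same delivery window as the real-time shoot.
```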

Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios

Embrace Change and Dive In

As virtual production is still fairly new for most creatives and directors, there is an element of getting used to new ways of working. Things like lighting are handled differently on a virtual set, for example. When you’ve got real talent lit by hard and soft lighting, and an LED wall with different lighting characteristics displaying the background scene, it all needs to match in order to look like part of the same set from the camera’s perspective. Fortunately, on-set colour grading is about to get a boost, which will be music to the ears of cinematographers who have already shot in an LED volume.

At the moment, the biggest challenge lies in the quality of the Unreal scene. When you go into a virtual set, there are two types of video that you display on the LED wall. One of them is video plate playback, which is used for things like car scenes where the vehicle is moving quickly down a street. The car is static in the virtual studio but the video is moving. Those scenes are very high quality because they are shot with multiple high-quality cameras on a rig designed to capture a rolling 360-degree view.

But then you have the Unreal scene using virtual graphics. This is where you need camera tracking on the real camera to match it to the virtual scene displayed on the wall. The quality of these virtual graphics is very good, but it’s not quite as good as post-production compositing just yet. This is where our NearTime technology can help.

And finally, you’ve got the challenge of continuity when changing elements or editing Unreal scenes live on set. Imagine you’re on a virtual set and suddenly you decide that you want to move one of the objects on the LED volume to the other side of the screen. When you change something, you need to log the change, as it always has a downstream impact on the shoot, and problems arise if you then have to remember which other scenes need updating as a result. This is something Mo-Sys is working to solve very soon, with technology that allows on-set real-time editing of Unreal scenes and automatically captures and logs the revisions. Watch this space!

CJP Commences Technology Refresh for University of Sunderland Media School

CJP Broadcast Service Solutions, a leading provider of products and services for the broadcast and wider television industry, announces the start of a major project for the University of Sunderland.

Mo-Sys StarTracker

“This latest venture comprises four elements,” comments CJP Broadcast founder and Managing Director Chris Phillips. “Each element will enable the Faculty of Arts & Creative Industries to teach the very latest techniques in virtual production, TV production and outside broadcasting.

“The first project element will be the provision of a curved LED volume incorporating the latest Mo-Sys LED technology and Bendac InfiLED 1.9 mm LED panels. The faculty will use a Mo-Sys VP Pro XR LED content server to drive the LED volume, featuring seamless Multi-Camera switching, Cinematic XR Focus for pulling focus between real and virtual objects, and Set Extensions. Tracking will be provided by a Mo-Sys StarTracker.

“The second project will be the upgrade of an existing TV studio. It was important for the faculty to upgrade its legacy production infrastructure to full broadcast quality. We recommended and will be implementing a full Ross Video workflow, ready for 4K-UHD expansion.

“Third will be the integration of a motion capture and virtual camera system into the 4K-UHD chromakey studio which we installed last year. This presents an opportunity to incorporate new solutions that further enhance the film and television course modules. It will include areas such as virtual cinematography linked with the Mo-Sys VP Pro Unreal Engine plugin.

“The fourth element of this group will be a mobile production kit based on a StreamStar X4 capable of accommodating up to four camera channels plus a wireless camera link, LiveU LU300 field unit and Hollyland wireless intercom. This will enable students to capture live events on the fly, with integration back to studio.”

“CJP did a great job of the 4K-UHD virtual studio which we commissioned in 2021 for the David Puttnam Media Centre on the Sir Tom Cowie Campus, St Peter’s,” adds Craig Moore, Senior TV Technician. “The CJP team were the logical choice for this next phase in the modernisation of our creative and technical resources. Chromakey has long been one of the most powerful tools available to film and television producers. We are also investing in one of the largest EDU LED stages in the UK. This will incorporate the very latest technology and workflows for virtual production, enabling our students to gain the knowledge and practical experience of new concepts that will become industry standard. The system CJP has recommended and is providing will equip DPMC students with a true state-of-the-art solution which will open creative opportunities limited only by their imagination.”


“It is important in the creative industries to ensure that our students get access to the technologies that are current and the technological changes that will influence the sector into the future,” states Professor Arabella Plouviez SFHEA, Academic Dean, Faculty of Arts & Creative Industries. “With this further investment in our virtual production studios, we will be able to ensure that our students have hands-on experience and also get to use their creative skills to challenge and push the technology. This investment provides exciting opportunities to bring together students with different skillsets – from TV, film, sound, photography, animation, performance and design as well as students from sciences such as technology and computing.”

“Through virtual production, the converging worlds of film, TV, games, animation and VFX are changing traditional film and television practices,” says Nicholas Glean, University of Sunderland Senior Lecturer in Video & New Media. “The new technological tools and skills needed for virtual production are also challenging traditional film and media production pedagogy. CJP is collaborating with us to navigate and integrate these new skills and tools into our programmes so that we can instruct a new generation of filmmakers. We are happy and excited to be working with them.”

“In addition to the major investment in virtual production, further investment in outside broadcasting equipment, studio cameras and an extensive refit of the vision gallery is fantastic news for our students as it enables them to use industry-standard equipment and learn a variety of new production processes that will place them at the forefront of a number of cutting-edge technologies which are now being used in high-profile productions such as The Mandalorian and the forthcoming series of Doctor Who,” summarises Sue Perryman SFHEA, Senior Lecturer. “This additional investment in outside broadcast technologies also means that our students can work on live, real-time productions, both inside and outside the TV studio, such as music, sport, dance, and performance. These opportunities will further develop students’ creativity as they gain the vital skills needed to work with new state-of-the-art production processes that are revolutionising TV production around the world. I, for one, cannot wait!”

The project is scheduled for completion in Q3 2022.

Mo-Sys demonstrates LED Virtual Production at MPTS 2022 

Mo-Sys Engineering will highlight how its comprehensive Virtual Production technology stack can benefit broadcasters and news providers at the Media & Production Technology Show (MPTS) 2022, where it is co-exhibiting with its partner, CJP Broadcast Service Solutions. Specialising in Virtual Production system integration, CJP Broadcast provides turnkey solutions backed by industry-leading technical support. 


The Mo-Sys StarTracker camera/lens tracking system, now the technology of choice for leading-edge high-end TV, broadcast, and film Virtual Productions, will be shown working in tandem with Bendac Group’s InfiLED LED displays and LED video processing technology from Brompton Technology.

“The benefits that come from integrating Virtual Production tools and techniques into broadcast environments are huge and we are only just scratching the surface,” said Michael Geissler, CEO of Mo-Sys. “We are delighted to join forces with CJP, Bendac and Brompton Technology to show how our solutions can make otherwise complex processes and workflows simple and straightforward. Our focus is to allow broadcasters and production companies to fully express their creativity without constraints so that they can deliver flawless, cinematic quality images to content-hungry audiences.”

Mo-Sys will demonstrate StarTracker and highlight the specific benefits of VP Pro XR, its award-winning LED content server solution for Virtual Production. The team will be showcasing VP Pro XR’s immersive toolset, such as Cinematic XR Focus, the ability to seamlessly pull focus between talent in the real-world foreground and virtual objects placed behind the physical plane of the LED wall, deep into the scene. Content creators and broadcasters can learn about Mo-Sys’ unique capability to seamlessly switch multiple cameras within Virtual Production at full UHD4K resolution, XR set extensions and Augmented Reality for a virtual world beyond the boundary of the LED wall. 

In addition, the team will be showing Mo-Sys’ ground-breaking integration with Erizos Studio, which enables data-driven graphics and MOS integration in native Unreal Engine. This development means that broadcasters can now use a single platform to deliver all their graphics and utilise high quality LED wall backgrounds instead of a green screen to eliminate green spill issues completely. 
 
With its synergistic integration with LED volume providers, such as Bendac, Mo-Sys gives broadcasters and production companies an elegant all-in-one LED volume, multi-camera production system.

Mo-Sys at NAB Show 2022

Mo-Sys had a strong presence at this year’s NAB Show, highlighting our position as a pioneer in driving forward the emerging technologies of virtual and augmented production. During the show we showcased our leading-edge virtual production technology stack and underlined why some of the world’s most innovative content producers rely on our solutions.

Mo-Sys co-exhibited with APG and Fujifilm, integrating our technology with Fujifilm lenses and a state-of-the-art 1.5mm pixel pitch LED wall from APG to demonstrate our solutions in action. See the demonstration below, filmed by KitPlus:

Mo-Sys, Fujifilm and APG Media LED Virtual Production Showcase at NAB 2022

Among the solutions highlighted:

  • NearTime® – our patent-pending and HPA Engineering Award-winning solution for increasing real-time VFX image quality, whilst removing Moiré completely
  • Multi-Cam Switching – our newly released VP Pro XR feature enabling seamless multi-camera switching to overcome the typical 5-6 frame refresh delay in LED volumes
  • Robotics for virtual production – the new Mo-Sys G30 gyro-stabilized head and the industry-standard Mo-Sys L40 cinematic remote head
  • AR for sports – the new heavy-duty Mo-Sys U50 remote head was shown with Fujifilm’s latest box lens using a Vinten 750i remote head with pan bars
Florian Gallier presents Mo-Sys solution for remote production and sports events

We joined forces with AOTO Electronics Co. Ltd. to showcase solutions aimed at broadcast news and factual programming. Visitors at the AOTO stand were able to see a live LED Virtual Production demonstration featuring VP Pro XR and StarTracker working with AOTO’s 2.3mm pitch LED tiles. See the video below:

Mo-Sys and AOTO showcase broadcast LED Virtual Production Solutions

This video features the BBC’s Tokyo 2020 virtual set jointly designed by Lightwell & BK Design Projects with AV integration by Moov.

We illustrated how set extensions can be used to expand the studio space, and how the newly released Multi-Cam Switching feature can be used to seamlessly switch between live cameras at resolutions up to UHD4K, and without the LED wall delay appearing in shot.

In a ground-breaking move, Mo-Sys also showed its integration with Erizos Studio, allowing Unreal Engine graphics to be used not just for the virtual studios, but for traditional on-screen graphics, such as lower thirds and data-driven graphics, either embedded in the Unreal scene or as AR objects.

We also had a presence at the new Vū Technologies studio in Las Vegas, where StarTracker forms a key part of its 140ft x 20ft LED volume soundstage.

Around and about at NAB 2022

Tom Shelburne gives an introduction to Mo-Sys VP Pro XR
Jim Rider from Final Pixel talks about the Mo-Sys StarTracker
Carl Bodeker introduces the Vinten FP-188 Robotic Pedestal with Mo-Sys StarTracker
Chris Tornow, CEO of Pfinix Creative Group
Gary Adcock talks about Virtual Production with Mo-Sys
Tim Moore from Vu Technologies talks about the ‘rock solid’ StarTracker

Mo-Sys Highlights The Benefits Of Virtual Production At ISE 2022

Mo-Sys Engineering will highlight its comprehensive Virtual Production technology stack at Integrated Systems Europe (ISE) 2022, where it is co-exhibiting with its partner, European LED screen manufacturer ALFALITE, and its Spanish reseller, Tangram Solutions.


Visitors to the show will see Mo-Sys’ flagship StarTracker precision camera and lens tracking technology and VP Pro XR in action. The Mo-Sys StarTracker tracking system is the technology of choice for high-end TV, broadcast and film Virtual Productions. The system will be shown working in tandem with an ALFALITE 4.5m x 4m, 1.9mm pixel pitch LED volume.

“As we saw from the recent Tokyo and Beijing games, live broadcast can draw huge benefits from integrating Virtual Production tools and techniques,” said Philippe Vignal, Director of Sales and Business Development for Mo-Sys. “Together with ALFALITE we deliver a complete, fully functional system from one supplier, making it easy for customers to implement the technology rapidly and immediately start leveraging the advantages. Our aim is to make extraordinarily complex processes simple and straightforward so that content producers can keep their focus on what is important – being creative.”

Mo-Sys will present an immersive live demonstration of its end-to-end LED production workflow. The team will demonstrate StarTracker and highlight the benefits of VP Pro XR, Mo-Sys’ award-winning LED content server solution for broadcast Virtual Production, such as Multi-Cam, the unique ability to seamlessly switch cameras at full UHD4K resolution. By partnering with ALFALITE, Mo-Sys can now offer broadcasters and production companies ModularPix Pro LED tile modules in a comprehensive package for a complete LED volume, multi-camera production system.

Luis Garrido Fuentes, Executive Director at ALFALITE, added: “Virtual Production can bring real innovation to the table for broadcasters and production companies. Working together with a pioneer such as Mo-Sys delivers enormous value to both our customers, presenting them with a new and exciting way to create immersive and engaging content that is highly cost-effective.”

Victor Ceruelo, CEO, Tangram Solutions, commented: “There is a real buzz of excitement from broadcasters and production companies in Spain over what can be achieved with Virtual Production techniques, and we are proud to be able to offer them access to cutting-edge Virtual Production solutions from Mo-Sys.”

Visit Mo-Sys, ALFALITE and Tangram Solutions at stand 6K650 during ISE 2022. 

PLAZAMEDIA selects Mo-Sys as its primary XR solutions provider

Mo-Sys Engineering today announces that it has extended its long-standing relationship with PLAZAMEDIA GmbH, a subsidiary of Sport1 Medien AG, an established content solution provider for the entire spectrum of media platforms, and one of the leading producers of sports TV for German-speaking audiences. The content solutions provider, which aims to raise the bar for Virtual Production with its new LED Studio, has chosen Mo-Sys as the primary XR solution provider for the implementation of its LED initiatives.

At the heart of PLAZAMEDIA’s decision is the unmatched capability and functionality of Mo-Sys’ technology and features such as multi-camera switching, Cinematic XR Focus and its latest innovation, the NearTime® on-set re-rendering workflow for ICVFX.

Jens Friedrichs, Chairman of the Management Board of PLAZAMEDIA GmbH, commented: “Mo-Sys delivers all the performance we need to create a leading-edge LED volume studio. They understand the importance of delivering cinematic quality from end to end, even for broadcast applications and productions for corporate clients – especially with regard to our clear focus on sustainable green production. This, combined with their innovative toolset and the collaborative approach of the Mo-Sys team, made them the unbeatable choice of partner for us.”

The flagship Mo-Sys StarTracker precision camera/lens tracking system is now the technology of choice for leading-edge Virtual Productions. The advanced tools in Mo-Sys’ VP Pro XR content server include set extensions, color grading, and the ability to pull focus seamlessly between real and virtual objects, made possible by the unique Cinematic XR Focus feature.

NearTime® is Mo-Sys’ patent-pending and award-winning answer to the image quality limits of real-time VFX. The solution comprises a cloud-based auto-re-rendering system utilizing Smart Green, a method of separating talent from the LED background without introducing copious green spill, and together these deliver higher quality real-time VFX content. NearTime® also removes Moiré patterning completely and enables the use of lower cost LED panels while delivering an image quality far closer to post-production compositing.

“As pioneers in Virtual Production technology, we are driven by our passion to create tools that help our customers create immersive and engaging content without limiting their creativity,” said Philippe Vignal, Mo-Sys Director of Sales and Business Development, EMEA & APAC. “We are extremely proud that an innovator like PLAZAMEDIA has chosen to place Mo-Sys technology at the heart of its new Virtual Production capabilities.”

Mo-Sys and Ideal Systems Group partner for pan-Asian excellence

Mo-Sys Engineering is working with Ideal Systems to drive forward the adoption of virtual and augmented reality production in broadcast across Asia and the Middle East.

Ideal Systems has been a leader in media solutions for more than 30 years. From its head office in Hong Kong, it brings a wealth of experience together with a huge presence in 10 countries and a proven reach across the whole region. It provides consultancy, design, integration, installation and continuing support for many major names in broadcasting, media production and technology.


Mo-Sys has driven forward the emerging technologies of virtual and augmented production for 25 years. A pioneer in camera tracking with its unrivalled StarTracker system, and in camera robotics, the company can now deliver turnkey systems through its VP Pro graphics products. Mo-Sys has been instrumental in the extensive use of real-time augmented reality and virtual studios in broadcast, and in the growing use of LED walls and LED volumes for movie production and live events.

“The production of outstanding content is a global business,” said Philippe Vignal, Director of Sales and Business Development for Mo-Sys. “It is vital for us that we work with the best possible sales partners, to ensure our ground-breaking innovations are seen by all the key players. Ideal Systems is the perfect partner for us, as they have broad reach across broadcast throughout EMEA and Asia.”

Jim Butler, CEO of Ideal Systems, added: “Mo-Sys is a very impressive company. Its camera tracking and virtual production technologies are recognised by the biggest names out there, and they have very interesting cutting-edge developments in image robotics and remote production. The whole team is excited to be working with Mo-Sys and to be able to offer our customers revolutionary new solutions.”

Broadcasters and content producers can visit Mo-Sys at CABSAT (17 – 19 May, Dubai World Trade Centre, stand E6-12B), where it will have a rich demonstration of its latest VP technologies, featuring StarTracker, its high-precision optical camera tracking system, and VP Pro XR, its cinematic content server solution for LED production. The Mo-Sys VP Pro integrated augmented reality production system will also be on show on the Ideal Systems stand A5-1.

Virtual Production and the importance of having the right tools for the job

“For decades we have been shooting talent or products in green screen studios and adding (compositing) photo-realistic 3D graphics in post-production to finalize the shot. This is the foundation of a visual effects (VFX) shot. It was, and still is, very much a two-stage process.” Michael Geissler, CEO of Mo-Sys, talks to InBroadcast about the evolution of Virtual Production.


Virtual production in most cases makes this a one-stage process, by enabling the VFX shot to be captured on set, in camera, in real-time. The virtual production process saves time and money and is much more efficient, but it requires a greater level of pre-production preparation, virtual production knowledge and experience, together with the right tools for the job.

The concept of Virtual Production is not new. Tracked virtual graphics arrived in the mid-1990s, when they were used predominantly for broadcast, albeit with limited tracking and no lens calibration, typically producing poor virtual image quality.

Thankfully, the story doesn’t end there. In what Mo-Sys has named Virtual Production 1.0 (VP 1.0), circa 2016 we saw the introduction of game engines capable of producing photo-realistic on-set pre-viz. The next logical step would have been real-time VFX in green/blue screen studios, but we jumped straight to VP 2.0, where the production of The Mandalorian illustrated the exciting new capabilities of real-time VFX using LED volumes.

Hype, coupled with a surge in demand from the locked-down live events industry, which cleverly pivoted into virtual production for survival, meant that many early LED VP stages used lower quality live event LED tiles, live event-oriented media servers and mo-cap tracking systems, rather than cinematic LED tiles, dedicated cinematic content servers, and precision camera tracking systems such as Mo-Sys StarTracker.

Right now, virtual production is in its third stage of evolution – VP 3.0. It has matured over the last five years: in 2020 the global market was reported to be worth in excess of £2 billion, and it is expected to grow beyond £5 billion by 2028. Mo-Sys’ experience gained from 25 years in the broadcast and film industry, together with bleeding-edge innovation, is driving VP from what it was in early 2020 to what it needs to be in order to deliver on many of the benefits originally promised.

Within broadcast and film, there is now a greater understanding of the limitations of live event LED tiles and events media servers, which were developed for entertainment applications such as projection mapping and timeline lighting shows.

Cinematographers and VFX Supervisors are demanding a dedicated, focused toolset that meets their needs and enables a closer content quality match to traditional post-production compositing quality.

Mo-Sys VP Pro XR

Mo-Sys is delivering exactly that with VP Pro XR. Designed by in-house cinematic innovators for XR creatives, Mo-Sys VP Pro XR is an award-winning, dedicated XR server solution that takes a radical new approach to delivering cinematic standards to on-set, real-time virtual production using LED volumes. Designed for LED stages with or without set extensions, it can also be used with blue/green screens and enables traditional shooting techniques within an LED volume, with a focus on composite image quality.

Typically, directors and cinematographers must make continuous calculations to make the real and virtual elements match in post-production. This is often a costly and repetitive process. A pioneer in providing a platform for graphics rendering as well as camera and lens tracking to create higher quality virtual productions, Mo-Sys has made it even easier to produce seamless, high-end productions with VP Pro XR. Designed to enable the use of traditional shooting techniques within virtual productions, it removes the limitations on the ability to tell stories that are imposed by current XR stage designs.

Mo-Sys Multi-Cam Switching

Another major hurdle in VP and the use of LED volumes comes when switching between multiple cameras pointing at the same volume. Whilst the camera outputs can be switched in one frame, the virtual scene displayed on the LED volume typically takes 5-6 frames to update. This means that on every camera switch there will be 5-6 frames of the previous camera’s virtual scene displayed on the LED volume before it updates with the correct perspective view for the new camera. As a result, you get a background flash on every camera switch, which is unusable in production.

The latest iteration of VP Pro XR addresses this by orchestrating the delay between the operator switching cameras and the LED volume updating with the correct perspective view of the live camera. More importantly, it does this at up to the full UHD4K resolution of the LED processor input, whereas previous workarounds would reduce the content to HD resolution to achieve the same outcome.
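
The orchestration idea can be pictured with a short sketch; the class names, methods and frame counts below are illustrative assumptions, not the actual VP Pro XR implementation.

```python
# A minimal sketch of multi-cam switch orchestration for an LED volume.
# Class names, methods and timing constants are illustrative assumptions.

WALL_UPDATE_DELAY = 6   # frames the LED wall needs to show a new perspective
CUT_DELAY = 1           # frames needed to cut the camera outputs

class LedWall:
    def __init__(self):
        self.active_perspective = "cam_A"
    def render_perspective(self, camera: str) -> None:
        # In reality the new view takes WALL_UPDATE_DELAY frames to appear.
        self.active_perspective = camera

class Switcher:
    def cut_after(self, camera: str, delay_frames: int) -> None:
        print(f"cutting to {camera} in {delay_frames} frames")

def orchestrate_switch(wall: LedWall, switcher: Switcher, new_camera: str) -> None:
    """Start rendering the new camera's perspective immediately, and delay
    the program cut until the wall has had time to update. This hides the
    5-6 frame wall latency instead of showing it as a background flash."""
    wall.render_perspective(new_camera)          # wall starts updating now
    switcher.cut_after(new_camera, delay_frames=WALL_UPDATE_DELAY - CUT_DELAY)

orchestrate_switch(LedWall(), Switcher(), "cam_B")
# Without orchestration the cut lands several frames before the wall updates,
# so the new camera briefly sees the old camera's perspective behind it.
```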

Mo-Sys Cinematic XR Focus

Along with full resolution multi-cam switching, Mo-Sys Cinematic XR Focus enables seamless interaction between the virtual and real worlds. The feature ensures that an LED wall can be used as more than just a backdrop, allowing it to integrate with the real stage. It gives cinematographers the means to seamlessly rack focus deep into the virtual world and create immersive shots that enhance their visual storytelling.
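
One way to picture the hand-off such a feature implies: the physical lens can only focus as far as the LED wall, so focus targets beyond the wall must be realised as defocus rendered into the virtual scene. The sketch below illustrates that split; the wall distance, names and logic are illustrative assumptions, not the actual Cinematic XR Focus implementation.

```python
# A hedged sketch of handing focus off between the physical lens and the
# virtual scene. The wall distance and split logic are illustrative only.

WALL_DISTANCE_M = 4.0   # distance from camera to the LED wall (assumed)

def split_focus(target_distance_m: float) -> dict:
    """Decide how a requested focus distance is realised on set."""
    if target_distance_m <= WALL_DISTANCE_M:
        # Real-world target: drive the physical lens, keep the wall sharp.
        return {"lens_focus_m": target_distance_m, "virtual_defocus_m": 0.0}
    # Target deep in the virtual scene: park the lens on the wall so the
    # displayed pixels stay sharp, and blur the rendered scene to match
    # the depth-of-field the lens would produce at that distance.
    return {"lens_focus_m": WALL_DISTANCE_M,
            "virtual_defocus_m": target_distance_m - WALL_DISTANCE_M}

print(split_focus(2.5))    # talent in the real foreground
print(split_focus(30.0))   # virtual object deep behind the wall
```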

NearTime® is Mo-Sys’ HPA Engineering Award-winning answer to the image quality limits of real-time VFX virtual production. The solution comprises a cloud-based auto-re-rendering system utilizing Smart Green, a method of separating talent from the LED background without introducing copious green spill; combined, the two deliver higher quality real-time VFX content.

Read the full article in InBroadcast (page 34).