Mo-Sys Academy Virtual Production courses announced

Mo-Sys Academy announces new Virtual Production courses and aims to close the skills gap in the virtual production sector as it faces surging demand for trained technicians.

Mo-Sys Academy

Mo-Sys Engineering today announces that it has released a new line-up of guided Virtual Production training courses. This extensive programme has been carefully developed and will be delivered by the Mo-Sys Academy at its London HQ throughout summer 2022.

With limited availability, demand is expected to be exceptionally high from broadcast and film industry professionals seeking valuable Virtual Production experience, university lecturers looking to upskill, and students alike, for what is set to be the most comprehensive practical Virtual Production training on the planet.

Mo-Sys Academy Virtual Production courses

Multiple courses for all levels have been released, ranging from a 3-day introduction to Virtual Production to an intensive 10-day Virtual Production foundation course. Delivered by skilled on-set technicians, summer courses run from 15th June to 15th August 2022. Mo-Sys Academy training covers the entire Virtual Production spectrum: green screen, augmented reality (AR), computer-generated imagery (CGI), motion capture, XR and LED. Learning takes place in a supportive and friendly environment with small group creative exercises throughout.
 
Course attendees will gain significant access to the latest Virtual Production tools and techniques, including working with the world’s leading camera tracking system, Mo-Sys StarTracker, understanding lighting requirements for green screen and LED production and discovering how to run virtual productions using Unreal Engine as part of a workflow leveraging LED volumes for in-camera visual effects (ICVFX). 
 
Demand for Virtual Production has exploded in recent years and, with it, the industry's requirement for experienced VP talent has grown in equal measure. Mo-Sys Academy has the unrivalled experience and knowledge to guide students to the forefront of the broadcast and film industry.

“There has been a boom in Virtual Production, and the greatest challenge facing the industry is finding people who understand LED volumes, on-set pre-visualization and XR shooting. These are relatively new techniques and there is a shortage of trained technicians who understand the unique challenges that come with this new and exciting way of creating content,” commented Michael Geissler, CEO of Mo-Sys. “Mo-Sys Academy was created to address the skills bottleneck the industry is facing, and to transfer the knowledge Mo-Sys has gained over the last 25 years.” 

Mo-Sys is also working with universities, such as the University of Sunderland, which recently announced a major £1.4m technology refresh. Mo-Sys partner, CJP Broadcast Services, has installed state-of-the-art Virtual Production technology, making Sunderland a powerhouse with standout media courses which will benefit students for years to come. In support of this upgrade to the latest LED volume production technology and tools, Mo-Sys Academy provided Virtual Production training for university staff.
 
Nicholas Glean, Senior Lecturer in Video and New Media at the University of Sunderland added “This two-week course was brilliant! From the first day to the last it was packed with information and fantastic knowledge delivered by welcoming and friendly tutors in Juliette and Dominic. This was supported by experts who came into our sessions and helped us reach another level of understanding. I cannot recommend this course enough to university departments thinking about installing or who already have Mo-Sys technology. The course takes Virtual Production from theory into practical reality. Before the course, I had no prior experience in Virtual Production and was extremely nervous. After the course, I feel incredibly confident about working in Virtual Production.” 

For more information, please visit Mo-Sys Academy.

Mo-Sys and GMS International ink partnership deal

The two companies will meet the surging demand for Virtual Production technology in South Korea.

Mo-Sys Engineering today announces that it has signed a partnership with South Korean dealer GMS International to make its cutting-edge augmented reality and Virtual Production solutions available to cinematographers and broadcasters in the country. The agreement will help to meet growing demand for Virtual Production in Korea’s dynamic media market, one of the leading early adopters of Virtual Production. 

GMS International has over 20 years’ experience of supporting broadcasters and media organisations in the South Korean market from its headquarters on the outskirts of Seoul. The GMS team has extensive knowledge of working with some of the major names in the Korean media world and delivers consultancy, installation, and system design services. 

Mo-Sys has been a driving force in the use of real-time augmented reality and virtual studios in broadcast, and in the growing use of LED walls and LED volumes for movie production and for live events. Its unrivalled StarTracker system embodies Mo-Sys’ expertise in camera robotics and the company delivers complete end-to-end systems through its VP Pro XR media server.

“South Korea is recognised as one of the markets driving forward Virtual Production adoption and as a pioneer in this area, Mo-Sys has the tools and technology to meet the rise in demand,” said Michael Geissler, CEO of Mo-Sys. “The combination of GMS’ knowledge of the market and our own expertise of Virtual Production makes for a strong synergy and together we can match the needs of customers to the right tools that allow them to create amazing content in new and exciting ways.” 

The agreement with GMS will see Mo-Sys StarTracker and the VP Pro XR system made available to customers in Korea. Mo-Sys technology integrates seamlessly with major LED volume providers, including ALFALITE and addresses some of the major challenges that arise with live Virtual Production and LED volume integration. 

Hyungjun Kim, CEO of GMS International commented: “Mo-Sys’ unique technology and solutions meet the Virtual Production needs of the high-end content producers here in South Korea better than any other provider, allowing customers to express their creativity with no constraints. We look forward to a successful and fruitful partnership with them”.

Creatives and Directors: What Can You Achieve with Virtual Production?

What you can achieve with Virtual Production – Mo-Sys Engineering’s commercial director Mike Grieve on how virtual production can elevate creativity and save resources

What you can achieve with Virtual Production

In the virtual world, possibilities are endless. Unlike real sets where you’re limited to the physical attributes of set design, virtual sets are built in Unreal Engine where you can be anywhere and have anything. Creatively, it breaks you free from budget, time and location limitations.

Stop Fixing in Post, Solve It in Pre

One of the key attributes of virtual production is the ability to pre-visualise everything before you even get to set. You can “walk” through your virtual scene and make decisions on where the best camera angles are, change lens types and adjust lighting. And with everyone from the director and producer, to the cinematographer and VFX supervisor having the ability to be together, looking at the same 3D scene from anywhere in the world, decisions can be made far more quickly and easily. So when you turn up on the day, all you need to do is light and shoot.

You don’t get that level of foresight on a physical shoot. Virtual production swaps basic preparation and fixing things in post for high-level prep that solves things in pre-production.

Not only that, but now that talent can actually see the virtual set around them – using an LED volume – rather than imagining where they need to look and interact using a green screen, you can shoot far more accurately. This helps avoid errors on things like eyelines between talent and virtual elements.

When you look at the whole production process, from pre-production to the actual deliverable, virtual production shrinks the overall production time and costs by reducing post-production needs. The bottom line is, it’s better to solve problems in pre than try to fix them in post.

Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios

Shoot In-Camera Effects in Real-Time

The quality of the 3D scene created for a virtual production shoot is always very high. But when the scene is loaded into the computer stack running Unreal Engine, and camera tracking is attached, the scene more often than not doesn’t play back in real-time, because it can’t be processed fast enough.

When this happens, the scene needs to be ‘optimised’, which is a bit like video compression that shrinks down the file size. When the processing load goes down, the frame rate comes up allowing the scene to play back in real-time and the real-time VFX shoot to happen.
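The trade-off can be illustrated with some simple frame-budget arithmetic (the render times below are hypothetical examples, not Mo-Sys or Unreal Engine figures): a scene only plays back in real-time when each frame renders within the time budget set by the shooting frame rate.

```python
# Illustrative frame-budget check for real-time playback.
# Render times are hypothetical examples, not measured figures.

def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame at the given frame rate."""
    return 1000.0 / fps

def plays_back_in_real_time(render_ms_per_frame: float, fps: float = 24.0) -> bool:
    """A scene holds real-time only if every frame renders within budget."""
    return render_ms_per_frame <= frame_budget_ms(fps)

# A 24 fps shoot allows roughly 41.7 ms per frame.
print(round(frame_budget_ms(24.0), 1))          # 41.7
print(plays_back_in_real_time(55.0, fps=24.0))  # False: needs optimising
print(plays_back_in_real_time(35.0, fps=24.0))  # True: real-time playback
```

Optimising a scene is, in these terms, simply pushing its per-frame render time back under the budget.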

The problem then is that the quality level of Unreal scenes is effectively capped: if you try to add any more quality, the frame rate drops below real-time and you can’t shoot in-camera effects. This is a well-known problem.

What normally happens is that a director or producer then has to decide which shots will go to post-production for compositing to increase the quality of the background. That takes time and money. Worse, it goes against the whole principle of virtual production, which aims to cut down compositing time as much as possible.

At Mo-Sys, we’ve patented a solution to this called NearTime. It’s a service that runs in parallel with a real-time VFX LED shoot, automatically re-rendering the background virtual scene at higher quality so it can be composited back together with the keyed talent, letting you deliver a much higher quality product in the same delivery window.

So as soon as you start the camera to do the first shot, all of the tracking and lens data from the camera is thrown up into the Cloud, where that same Unreal scene that you’re shooting exists on 50 to 100 servers. Then, all the quality dials are wound up and each take is re-rendered out sequentially as the real-time shoot goes on. It allows you to deliver higher resolution background graphics, faster and automatically, to save money and time.
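A minimal sketch of how such a parallel re-render pipeline might be organised (the class and field names here are hypothetical, for illustration only; the actual NearTime service is proprietary): each take's tracking and lens data is captured during the live shoot and queued for a higher-quality cloud render.

```python
# Hypothetical sketch of a NearTime-style parallel re-render queue.
# Names and structure are illustrative, not the Mo-Sys implementation.
from dataclasses import dataclass
from queue import Queue

@dataclass
class Take:
    take_id: int
    tracking_data: list  # per-frame camera position/rotation samples
    lens_data: list      # per-frame focus/iris/zoom samples

@dataclass
class RenderJob:
    take: Take
    quality: str  # "realtime" on set, "final" in the cloud

class ReRenderQueue:
    """Records takes during the live shoot and queues cloud re-renders."""
    def __init__(self):
        self.jobs = Queue()

    def record_take(self, take: Take) -> None:
        # As each take wraps, its tracking/lens data goes to the cloud,
        # where the same Unreal scene is re-rendered with the quality
        # settings wound up, in parallel with the ongoing shoot.
        self.jobs.put(RenderJob(take=take, quality="final"))

    def pending(self) -> int:
        return self.jobs.qsize()

q = ReRenderQueue()
q.record_take(Take(1, tracking_data=[(0.0, 0.0, 0.0)], lens_data=[(50.0,)]))
```

The key point the sketch captures is that re-rendering never blocks the live shoot: takes accumulate in the queue while shooting continues.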

Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios

Embrace Change and Dive In

As virtual production is still fairly new for most creatives and directors, there is an element of getting used to new ways of working. Things like lighting are handled differently on a virtual set, for example. When you’ve got real talent lit by hard and soft lighting, and the LED wall with different lighting characteristics displaying the background scene, it all needs to match in order to look like part of the same set viewed from the camera perspective. Fortunately, on-set colour grading is about to get a boost, which will be music to the ears of cinematographers who have already shot in an LED volume.

At the moment, the biggest challenge lies in the quality of the Unreal scene. When you go into a virtual set, there are two types of video that you display on the LED wall. One of them is video plate playback which is used for things like car scenes where the vehicle is moving quickly down a street. The car is static in the virtual studio but the video is moving. Those scenes are very high quality because they are shot with multiple high quality cameras on a rig designed to capture a rolling 360 degree view.

But then you have the Unreal scene using virtual graphics. This is where you need camera tracking on the real camera to match it to the virtual scene displayed on the wall. The quality of these virtual graphics is very good, but it’s not quite as good as post-production compositing just yet. This is where our NearTime technology can help.

And finally, you’ve got the challenge of continuity when changing elements or editing Unreal scenes live on set. Imagine you’re on a virtual set and suddenly you decide to move one of the objects on the LED volume to the other side of the screen. When you change something, you need to log what you’ve changed, as it always has a downstream impact on the shoot, and it can cause issues if you then have to remember which other scenes need updating as a result. This is something Mo-Sys is working to solve very soon, with technology that allows on-set real-time editing of Unreal scenes and automatically captures and logs the revisions. Watch this space!

CJP Commences Technology Refresh for University of Sunderland

CJP Broadcast Service Solutions, a leading provider of products and services for the broadcast and wider television industry, announces the start of a major project for the University of Sunderland.

CJP Mo-Sys
Mo-Sys StarTracker

“This latest venture comprises four elements,” comments CJP Broadcast founder and Managing Director Chris Phillips. “Each element will enable the Faculty of Arts & Creative Industries to teach the very latest techniques in virtual production, TV production and outside broadcasting.

“The first project element will be the provision of a curved LED volume incorporating the latest Mo-Sys LED technology and Bendac InfiLED 1.9 mm LED panels. The faculty will use a Mo-Sys VP Pro XR LED content server to drive the LED volume, featuring seamless Multi-Camera switching, Cinematic XR Focus for pulling focus between real and virtual objects, and Set Extensions. Tracking will be provided by a Mo-Sys StarTracker.

“The second project will be the upgrade of an existing TV studio. It was important for the faculty to upgrade its legacy production infrastructure to full broadcast quality. We recommended and will be implementing a full Ross Video workflow, ready for 4K-UHD expansion.

“Third will be the integration of a motion capture and virtual camera system into the 4K-UHD chromakey studio which we installed last year. This presents an opportunity to incorporate new solutions that further enhance the film and television course modules. It will include areas such as virtual cinematography linked with the Mo-Sys VP Pro Unreal Engine plugin.

“The fourth element of this group will be a mobile production kit based on a StreamStar X4 capable of accommodating up to four camera channels plus a wireless camera link, LiveU LU300 field unit and Hollyland wireless intercom. This will enable students to capture live events on the fly, with integration back to studio.”

“CJP made a great job of the 4K-UHD virtual studio which we commissioned in 2021 for the David Puttnam Media Centre on the Sir Tom Cowie Campus, St Peter’s,” adds Craig Moore, Senior TV Technician. “The CJP team were the logical choice for this next phase in the modernisation of our creative and technical resources. Chromakey has long been one of the most powerful tools available to film and television producers. We are also investing in one of the largest EDU LED stages in the UK. This will incorporate the very latest technology and workflows for virtual production, enabling our students to gain the knowledge and practical experience of new concepts that will become industry standard. The system CJP has recommended and is providing will equip DPMC students with a true state-of-the-art solution which will open creative opportunities limited only by their imagination.”

Kieran Phillips tells us more about this new project below:

“It is important in the creative industries to ensure that our students get access to the technologies that are current and the technological changes that will influence the sector into the future,” states Professor Arabella Plouviez SFHEA, Academic Dean, Faculty of Arts & Creative Industries. “With this further investment in our virtual production studios, we will be able to ensure that our students have hands-on experience and also get to use their creative skills to challenge and push the technology. This investment provides exciting opportunities to bring together students with different skillsets – from TV, film, sound, photography, animation, performance and design as well as students from sciences such as technology and computing.”

“Through virtual production, the converging worlds of film, TV, games, animation and VFX are changing traditional film and television practices,” says Nicholas Glean, University of Sunderland Senior Lecturer in Video & New Media. “The new technological tools and skills needed for virtual production are also challenging traditional film and media production pedagogy. CJP is collaborating with us to navigate and integrate these new skills and tools into our programmes so that we can instruct a new generation of filmmakers. We are happy and excited to be working with them.”

“In addition to the major investment in virtual production, further investment in outside broadcasting equipment, studio cameras and an extensive refit of the vision gallery is fantastic news for our students as it enables them to use industry-standard equipment and learn a variety of new production processes that will place them at the forefront of a number of cutting-edge technologies which are now being used in high-profile productions such as The Mandalorian and the forthcoming series of Doctor Who,” summarises Sue Perryman SFHEA, Senior Lecturer. “This additional investment in outside broadcast technologies also means that our students can work on live, real-time productions, both inside and outside the TV studio, such as music, sport, dance, and performance. These opportunities will further develop students’ creativity as they gain the vital skills needed to work with new state-of-the-art production processes that are revolutionising TV production around the world. I, for one, cannot wait!”

The project is scheduled for completion in Q3 2022.

Mo-Sys demonstrates LED Virtual Production at MPTS 2022 

Mo-Sys Engineering will highlight how its comprehensive Virtual Production technology stack can benefit broadcasters and news providers at the Media & Production Technology Show (MPTS) 2022, where it is co-exhibiting with its partner, CJP Broadcast Service Solutions. Specialising in Virtual Production system integration, CJP Broadcast provides turnkey solutions backed by industry-leading technical support. 

Mo-Sys at MPTS 2022
LED Virtual Production

The Mo-Sys StarTracker camera/lens tracking system, now the technology of choice for leading-edge high-end TV, broadcast, and film Virtual Productions, will be shown working in tandem with Bendac Group’s InfiLED LED displays and LED video processing technology from Brompton Technology.

“The benefits that come from integrating Virtual Production tools and techniques into broadcast environments are huge and we are only just scratching the surface,” said Michael Geissler, CEO of Mo-Sys. “We are delighted to join forces with CJP, Bendac and Brompton Technology to show how our solutions can make otherwise complex processes and workflows simple and straightforward. Our focus is to allow broadcasters and production companies to fully express their creativity without constraints so that they can deliver flawless, cinematic quality images to content-hungry audiences.”

Mo-Sys will demonstrate StarTracker and highlight the specific benefits of VP Pro XR, its award-winning LED content server solution for Virtual Production. The team will be showcasing VP Pro XR’s immersive toolset, such as Cinematic XR Focus, the ability to seamlessly pull focus between talent in the real-world foreground and virtual objects placed behind the physical plane of the LED wall, deep into the scene. Content creators and broadcasters can learn about Mo-Sys’ unique capability to seamlessly switch multiple cameras within Virtual Production at full UHD4K resolution, XR set extensions and Augmented Reality for a virtual world beyond the boundary of the LED wall. 

In addition, the team will be showing Mo-Sys’ ground-breaking integration with Erizos Studio, which enables data-driven graphics and MOS integration in native Unreal Engine. This development means that broadcasters can now use a single platform to deliver all their graphics and utilise high quality LED wall backgrounds instead of a green screen to eliminate green spill issues completely. 
 
With its synergistic integration with LED volume providers, such as Bendac, Mo-Sys gives broadcasters and production companies an elegant all-in-one LED volume, multi-camera production system.

Mo-Sys LED Virtual Production on show at NAB 2022

Mo-Sys LED Virtual Production solutions took centre stage at NAB Show 2022, highlighting Mo-Sys’ position as a pioneer in driving forward the emerging technologies of virtual and augmented production. During the show Mo-Sys showcased its leading-edge virtual production technology stack and underlined why some of the world’s most innovative content producers rely on Mo-Sys solutions.

Mo-Sys co-exhibited with APG and Fujifilm, integrating Mo-Sys technology with Fujifilm lenses and a state-of-the-art 1.5mm pixel pitch LED wall from APG to demonstrate virtual production solutions in action.

Mo-Sys, Fujifilm and APG Media LED Virtual Production Showcase at NAB 2022

Among the solutions being highlighted:

  • NearTime® – patent-pending and HPA Engineering Award-winning solution for increasing real-time VFX image quality, whilst removing Moiré completely
  • Multi-Cam Switching – newly released VP Pro XR feature enabling seamless multi-camera switching to overcome the typical 5-6 frame refresh delay in LED volumes
  • Robotics for virtual production – the new Mo-Sys G30 gyro-stabilized head and the industry-standard Mo-Sys L40 cinematic remote head
  • AR for sports with UE5 – the new heavy-duty Mo-Sys U50 remote head was shown with Fujifilm’s latest box lens using a Vinten 750i remote head with pan bars
Florian Gallier presents Mo-Sys solution for remote production and sports events

Mo-Sys also joined forces with AOTO Electronics Co. Ltd. to showcase solutions for broadcast news and factual programming. Visitors at the AOTO stand were able to see a live LED Virtual Production demonstration featuring VP Pro XR and StarTracker working with AOTO’s 2.3 pitch LED tiles. See the video below:

Mo-Sys and AOTO showcase broadcast LED Virtual Production Solutions

This video features the BBC’s Tokyo 2020 virtual set jointly designed by Lightwell & BK Design Projects with AV integration by Moov.

Mo-Sys illustrated how its set extensions can be used to virtually expand the real-world studio space, and how the newly released Multi-Cam Switching feature can be used to seamlessly switch between live cameras at resolutions up to UHD4K, and without the LED wall delay appearing in shot.

In a ground-breaking move, Mo-Sys also showed its integration with Erizos Studio, allowing Unreal Engine graphics to be used not just for the virtual studios, but for traditional on-screen graphics, such as lower thirds and data-driven graphics, either embedded in the Unreal scene or as AR objects.

Mo-Sys’ VP solutions also had a presence at the new Vū Technologies studio in Las Vegas, where StarTracker forms a key part of its 140ft x 20ft LED volume soundstage.

Mo-Sys LED Virtual Production at NAB 2022

Tom Shelburne gives an introduction to Mo-Sys VP Pro XR
Jim Rider from Final Pixel talks about the Mo-Sys StarTracker
Carl Bodeker introduces the Vinten FP-188 Robotic Pedestal with Mo-Sys StarTracker
Chris Tornow, CEO of Pfinix Creative Group explains how dependable Mo-Sys StarTracker is
Gary Adcock talks about Virtual Production with Mo-Sys
Tim Moore from Vu Technologies talks about the ‘rock solid’ StarTracker

Mo-Sys highlights the benefits of Virtual Production at ISE 2022

Mo-Sys Engineering will highlight its comprehensive Virtual Production technology stack at Integrated Systems Europe (ISE) 2022, where it is co-exhibiting with its partner, European LED screen manufacturer, ALFALITE, and its Spanish reseller, Tangram Solutions.

LED Virtual Production

Visitors to the show will see Mo-Sys flagship StarTracker precision camera and lens tracking technology and VP Pro XR in action. The Mo-Sys StarTracker tracking system is the technology of choice for high-end TV, broadcast and film Virtual Productions. The system will be shown working in tandem with an ALFALITE 4.5m x 4m, 1.9mm pixel pitch LED volume.

“As we saw from the recent Tokyo and Beijing games, live broadcast can draw huge benefits from integrating Virtual Production tools and techniques,” said Philippe, Director of Sales and Business Development for Mo-Sys. “Together with ALFALITE we deliver a complete, fully functional system from one supplier, making it easy for customers to implement the technology rapidly and immediately start leveraging the advantages. Our aim is to make extraordinarily complex processes simple and straightforward so that content producers can keep their focus on what is important – being creative.” 

Mo-Sys will present an immersive live demonstration of its end-to-end LED production workflow. The team will demonstrate StarTracker and highlight the benefits of VP Pro XR, Mo-Sys’ award-winning LED content server solutions, for broadcast Virtual Production, such as Multi-Cam, the unique ability to seamlessly switch cameras at full UHD4K resolution. By partnering with ALFALITE, Mo-Sys can now offer broadcasters and production companies ModularPix Pro LED tile modules in a comprehensive package for a complete LED volume, multi-camera production system. 

Luis Garrido Fuentes, Executive Director at ALFALITE added, “Virtual Production can bring real innovation to the table for broadcasters and production companies. Working together with a pioneer such as Mo-Sys delivers enormous value to our customers, presenting them with a new and exciting way to create immersive and engaging content that is highly cost effective.”

Victor Ceruelo, CEO, Tangram Solutions, commented: “There is a real buzz of excitement from broadcasters and production companies in Spain over what can be achieved with Virtual Production techniques, and we are proud to be able to offer them access to cutting-edge Virtual Production solutions from Mo-Sys.”

Visit Mo-Sys, ALFALITE and Tangram Solutions at stand 6K650 during ISE 2022.