Mo-Sys Academy Students Set for Promising Careers

Mo-Sys Academy students set for promising careers as short film wins multiple awards

Three talented Mo-Sys Academy students look set for promising careers as their short film, Balance, wins multiple awards at the University of Greenwich’s BAFTA-style film and television GRAFTAs.

Mo-Sys Academy invited students from universities to attend the Virtual Production Practical Summer School 2021, an intensive six-week course designed to introduce students to virtual production through hands-on practical learning with the latest technology.

Set on the Mars Desert Research Station in Utah, Balance follows the journey of trainee astronaut Ben, who is working to complete his training and follow in his father’s footsteps. The production used a mix of traditional and virtual production, with the team utilising the Mo-Sys Academy studios to film the virtual scenes in just two days.

Mo-Sys Academy students set for promising careers as Balance wins multiple GRAFTA Awards from the University of Greenwich

I massively enjoyed working on Balance since I was able to explore the process of virtual production in detail with a crew I trusted. As a cinematographer there are a lot of new things you may need to know to prepare yourself for the shoot. Virtual production allows you to set up a scene in the virtual environment even before getting to the studio. All that’s left during the production process is trying to imitate that light in front of the green screen. The experience I gained during the Mo-Sys Academy training was crucial when working on Balance. When making the shot list I already knew what things would look like and which scenes would be more complicated to film than others, saving time on set.

Director of Photography, Emils Lisovskis

Led by Juliette Thymi, a senior VP technician and experienced virtual production producer who has worked on projects for Netflix, ITV and the BBC, Mo-Sys Academy aims to build students’ confidence and provide valuable experience, priding itself on a friendly, collaborative learning environment for all skill levels.

During their time with Mo-Sys Academy the team worked through a practical exercise set on a desert alien planet, which would become the forerunner to ‘Balance’. Judged by a panel of recognised industry professionals, Balance picked up nominations in every category, winning four: Best Director, Best Producer, Best Production Design and Best Sound Design.

We are incredibly proud to see three former Academy students following a path of Virtual Production and winning awards so quickly. I am confident the team have fantastic careers ahead of them.

Juliette Thymi, Mo-Sys Academy

Since attending the Academy, the team have stayed in touch, not just for ‘Balance’ but also bringing new ideas and collaborating with Mo-Sys on projects such as the TEDx University of Greenwich VP event.

Mo-Sys Academy Students collect GRAFTA Awards for Balance

When I got accepted at Mo-Sys Academy 2021, I knew there and then that I would use Virtual Production to create my film. In the two-week course I learned all I could so that I would be able to tell a story that could take place in an exotic environment; furthermore, the short film we made at the end of the Academy served as a great proof of concept for how Balance could turn out.

We had little experience in VP apart from the Mo-Sys Academy; however, when I created the story I had in mind the fact that I would like to mix traditional filmmaking with virtual production, so I could benefit from the best of both worlds. I planned on using VP to recreate the desert, while for the interiors I planned on using real locations. This created the basis and offered us familiar territory to play with, considering that in the Academy exercise our story took place on a desert alien planet; we were basically using the exercise and what we learned in our favour so that this time we could make something bigger and better.

It was an overwhelming feeling finding out that our film was nominated for all categories at the GRAFTA awards. I have to give credit where credit is due, I wouldn’t have been able to achieve the film Balance or produce the TEDx University of Greenwich VP event without the help and knowledge of my crew members Emils Lisovskis and Eduard Fadgyas. Also, none of these projects would have been possible without the teachings from the Mo-Sys Academy, led by Juliette Thymi and Dom Smith. Thanks guys!

Director/VP Supervisor & Producer, Jean Ciuntu

Mo-Sys Academy has announced new course dates in the UK and Los Angeles. Spaces are limited and demand is expected to be high. Visit Mo-Sys Academy for more information and to book your place.

Mo-Sys Hosts Regional Virtual Production Learning Zone at MediaCity

Mo-Sys ran the successful Virtual Production Learning Zone at the KitPlus Show, MediaCity, on Thursday 23 June 2022. The event underscored Mo-Sys’ deep commitment to sharing knowledge, collaborating with universities and training the next generation of industry professionals.

Led by Juliette Thymi, Mo-Sys Academy’s senior VP technician and experienced virtual production producer who has worked on projects for Netflix, ITV and the BBC, such as Strictly Come Dancing, the free taster sessions gave visitors insight into the technology and techniques used in Virtual Production, while highlighting a proven development path for those who’d like to learn more.

The full range of innovative Mo-Sys VP solutions was on display, including StarTracker, the industry standard for precision camera tracking; VP Pro XR, a dedicated XR server built specifically for Cinematic XR on-set real-time production; the unique Cinematic XR Focus feature, which enables pulling focus from talent to virtual objects deep within an LED volume; and Multi-Cam for seamless multi-camera switching.

Nicholas Glean, Senior Lecturer in Video and New Media at the University of Sunderland, recently completed the Mo-Sys Academy 10-day Foundation Course and commented on the importance of engagement between industry and the education sector:

Nicholas Glean – University of Sunderland – The importance of Virtual Production

Mo-Sys is actively driving engagement with universities such as Sunderland, Greenwich and Ravensbourne to ensure we have the right skills coming through to meet the surging demand for VP, but also offer the next generation access to exciting career opportunities.

Mo-Sys Academy with Virtual Production Training at KitPlus Show MediaCity 2022

During the show, Mo-Sys hosted a panel discussion titled The Future of Virtual Production Training, featuring Kieran Phillips of CJP Broadcast Service Solutions, Nicholas Glean from the University of Sunderland, and Adam Soan of Bendac. The session provided insight into overcoming the skills gap and maximising the opportunities of LED Virtual Production for broadcast. Watch the seminar in full below:

The Future of Virtual Production Training – KitPlus Show MediaCity 2022

Alistair Davidson and a team from Scan Computers attended the workshop and said it was “a really fantastic insight for anyone who’s interested in Virtual Production.”

Scan Computers talks about the Virtual Production training workshop at KitPlus Show

Rizwan Wadan from Pixeleyed Pictures summarised the KitPlus event, adding: “These sorts of events are amazing. They help you understand what is going on, and I’d highly recommend attending whether you’re a student, lecturer or professional.”

Rizwan Wadan from Pixeleyed Pictures gives some KitPlus Show feedback!

Mo-Sys Academy has announced new course dates in the UK and Los Angeles. Spaces are limited and demand is expected to be high. Visit Mo-Sys Academy for more information and to book your place.

Mo-Sys Academy Virtual Production training achieves top marks from leading University

Mo-Sys Virtual Production training achieves top marks from the latest graduates to have successfully completed their intensive hands-on practical VP training.

Virtual Production Training
Mo-Sys Academy

Mo-Sys Academy collaborates with universities with the aim of transferring the company’s unrivalled knowledge of virtual production for broadcast and film. This unique approach is geared towards nurturing direct and meaningful two-way partnerships between the world-leading technology manufacturer and university teaching staff.

This successfully guides universities beyond the all-too-familiar media and broadcast courses, where staff may have some green screen experience, and rapidly develops an in-house virtual production knowledge base, empowering teaching staff with the expertise and support to deliver outstanding, high-demand virtual production modules.

“We have seen a boom in Virtual Production, and the greatest challenge facing the industry is finding people who understand the technology, with hands-on experience and knowledge of how to maximise its effectiveness,” commented Michael Geissler, CEO of Mo-Sys. “Our partnerships with education are vitally important. We are supporting universities and helping them fill this gap.” 

Curated and delivered by Mo-Sys Academy’s skilled team of virtual production on-set technicians, with an emphasis on small group practical learning in a supportive and friendly atmosphere, Mo-Sys training builds confidence and delivers valuable real-world experience. 
 
A series of modules is available, ranging from an introduction to virtual production to a full 10-day virtual production foundation course. Staff from the University of Sunderland recently completed the foundation course.

Nicholas Glean, Senior Lecturer in Video and New Media at the University of Sunderland, added: “This two-week course was brilliant! From the first day to the last it was packed with information and fantastic knowledge delivered by welcoming and friendly tutors in Juliette and Dominic. This was supported by experts who came into our sessions and helped us reach another level of understanding.

“I cannot recommend this course enough to university departments thinking about installing Mo-Sys technology or who already have it. The course takes Virtual Production from theory into practical reality. Before the course, I had no prior experience in Virtual Production and was extremely nervous. After the course, I feel incredibly confident about working in Virtual Production.”

Mo-Sys Academy
Nicholas Glean and other Mo-Sys Academy students receiving their certificates of completion

Hear from other attendees at one of the Mo-Sys Academy courses about their experiences in the video below:

For more information about Mo-Sys Virtual Production training, please visit Mo-Sys Academy.

Mo-Sys Academy Virtual Production courses announced

Mo-Sys Academy announces new Virtual Production courses and aims to close the skills gap in the virtual production sector as it faces surging demand for trained technicians.

Mo-Sys Academy

Mo-Sys Engineering today announces that it has released a new line-up of guided Virtual Production training courses. This improved, extensive programme has been carefully developed and will be delivered by Mo-Sys Academy at its London HQ through summer 2022.

With limited availability, demand is expected to be exceptionally high from broadcast and film industry professionals wishing to gain valuable Virtual Production experience, from university lecturers looking to upskill, and from students, for what is set to be the most comprehensive practical Virtual Production training on the planet.

Mo-Sys Academy Virtual Production courses
Mo-Sys Academy

Multiple courses for all levels have been released, ranging from a 3-day introduction to Virtual Production to an intensive 10-day Virtual Production foundation course. Delivered by skilled on-set technicians, summer course dates run from 15th June to 15th August 2022. Mo-Sys Academy training covers the entire Virtual Production spectrum, from green screen, augmented reality (AR), computer generated imagery (CGI) and motion capture to XR and LED. Learning takes place in a supportive and friendly environment with small group creative exercises throughout.
 
Course attendees will gain significant access to the latest Virtual Production tools and techniques, including working with the world’s leading camera tracking system, Mo-Sys StarTracker; understanding lighting requirements for green screen and LED production; and discovering how to run virtual productions using Unreal Engine as part of a workflow leveraging LED volumes for in-camera visual effects (ICVFX).
 
Demand for Virtual Production has exploded in recent years and with that, the industry requirement for experienced VP talent has grown in equal measure. Mo-Sys’ Academy has the unrivalled experience and knowledge to guide students to the forefront of the broadcast and film industry.  

“There has been a boom in Virtual Production, and the greatest challenge facing the industry is finding people who understand LED volumes, on-set pre-visualization and XR shooting. These are relatively new techniques and there is a shortage of trained technicians who understand the unique challenges that come with this new and exciting way of creating content,” commented Michael Geissler, CEO of Mo-Sys. “Mo-Sys Academy was created to address the skills bottleneck the industry is facing, and to transfer the knowledge Mo-Sys has gained over the last 25 years.” 

Mo-Sys is also working with universities such as the University of Sunderland, which recently announced a major £1.4m technology refresh. Mo-Sys partner CJP Broadcast Services has installed state-of-the-art Virtual Production technology, making Sunderland a powerhouse with standout media courses that will benefit students for years to come. In support of this upgrade to the latest LED volume production technology and tools, Mo-Sys Academy provided Virtual Production training for university staff.
 
Nicholas Glean, Senior Lecturer in Video and New Media at the University of Sunderland, added: “This two-week course was brilliant! From the first day to the last it was packed with information and fantastic knowledge delivered by welcoming and friendly tutors in Juliette and Dominic. This was supported by experts who came into our sessions and helped us reach another level of understanding. I cannot recommend this course enough to university departments thinking about installing Mo-Sys technology or who already have it. The course takes Virtual Production from theory into practical reality. Before the course, I had no prior experience in Virtual Production and was extremely nervous. After the course, I feel incredibly confident about working in Virtual Production.”

For more information, please visit Mo-Sys Academy.

Creatives and Directors: What Can You Achieve with Virtual Production?

What you can achieve with Virtual Production – Mo-Sys Engineering’s commercial director Mike Grieve on how virtual production can elevate creativity and save resources

What you can achieve with Virtual Production

In the virtual world, possibilities are endless. Unlike real sets where you’re limited to the physical attributes of set design, virtual sets are built in Unreal Engine where you can be anywhere and have anything. Creatively, it breaks you free from budget, time and location limitations.

Stop Fixing in Post, Solve It in Pre

One of the key attributes of virtual production is the ability to pre-visualise everything before you even get to set. You can “walk” through your virtual scene and make decisions on where the best camera angles are, change lens types and adjust lighting. And with everyone from the director and producer, to the cinematographer and VFX supervisor having the ability to be together, looking at the same 3D scene from anywhere in the world, decisions can be made far more quickly and easily. So when you turn up on the day, all you need to do is light and shoot.

You don’t get that level of foresight on a physical shoot. Virtual production swaps basic preparation and fixing things in post, for high level prep by solving things in pre-production.

Not only that, but now that talent can actually see the virtual set around them – using an LED volume – rather than imagining where they need to look and interact using a green screen, you can shoot far more accurately. This helps avoid errors on things like eyelines between talent and virtual elements.

When you look at the whole production process, from pre-production to the actual deliverable, virtual production shrinks the overall production time and costs by reducing post-production needs. The bottom line is, it’s better to solve problems in pre than try to fix them in post.

Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios

Shoot In-Camera Effects in Real-Time

The quality of the 3D scene created for a virtual production shoot is always very, very good. But when the scene is loaded into the computer stack running Unreal Engine, and camera tracking is attached, the scene more often than not doesn’t play back in real-time, because it can’t be processed fast enough.

When this happens, the scene needs to be ‘optimised’, which is a bit like video compression shrinking down a file size. As the processing load goes down, the frame rate comes up, allowing the scene to play back in real-time and the real-time VFX shoot to happen.

The problem then is that the quality level of the Unreal scene is capped: if you try to add any more detail, the frame rate drops below real-time and you can’t shoot in-camera effects. This is a well-known problem.

What normally happens is that a director or producer will then need to decide which shots go to post-production for compositing to increase the quality of the background. That takes time and money. Not only that, it actually goes against the whole principle of virtual production, which aims to cut down compositing time as much as possible.

At Mo-Sys, we’ve patented a solution to this called NearTime. It’s a service that runs in parallel with a real-time VFX LED shoot and automatically re-renders the background virtual scene at higher quality, enabling it to be composited back together with the keyed talent, so you can deliver a much higher quality product in the same delivery window.

So as soon as you roll the camera on the first shot, all of the tracking and lens data from the camera is sent up to the cloud, where the same Unreal scene that you’re shooting exists on 50 to 100 servers. Then all the quality dials are wound up and each take is re-rendered out sequentially as the real-time shoot goes on. It allows you to deliver higher resolution background graphics, faster and automatically, saving money and time.
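To make the idea concrete, here is a minimal, purely illustrative Python sketch of that kind of workflow. The names used (TrackedFrame, Take, submit_for_rerender, the render_farm job interface) are hypothetical and are not Mo-Sys’ actual NearTime API; the sketch only shows the shape of the per-frame tracking and lens data captured on set and the hand-off to a higher-quality cloud re-render.

    from dataclasses import dataclass, field

    @dataclass
    class TrackedFrame:
        timecode: str
        position: tuple          # camera translation for this frame
        rotation: tuple          # camera orientation (pan, tilt, roll)
        focal_length_mm: float   # encoded lens zoom
        focus_distance_m: float  # encoded lens focus

    @dataclass
    class Take:
        scene_id: str                          # identifies the Unreal scene already held in the cloud
        frames: list = field(default_factory=list)

    def record_frame(take: Take, frame: TrackedFrame) -> None:
        # Called every video frame during the live LED shoot.
        take.frames.append(frame)

    def submit_for_rerender(take: Take, render_farm) -> None:
        # Hand the take's camera/lens track to cloud renderers with the quality
        # settings raised; the result is later composited with the keyed talent.
        render_farm.enqueue(scene=take.scene_id,
                            camera_track=take.frames,
                            quality="final")   # hypothetical job interface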

Production company: Made Brave in partnership with Quite Brilliant and Arts Alliance at Garden Studios

Embrace Change and Dive In

As virtual production is still fairly new for most creatives and directors, there is an element of getting used to new ways of working. Things like lighting are handled differently on a virtual set, for example. When you’ve got real talent lit by hard and soft lighting, and an LED wall with different lighting characteristics displaying the background scene, it all needs to match in order to look like part of the same set from the camera’s perspective. Fortunately, on-set colour grading is about to get a boost, which will be music to the ears of cinematographers who have already shot in an LED volume.

At the moment, the biggest challenge lies in the quality of the Unreal scene. When you go into a virtual set, there are two types of video that you display on the LED wall. One of them is video plate playback, which is used for things like car scenes where the vehicle is moving quickly down a street. The car is static in the virtual studio but the video is moving. Those scenes are very high quality because they are shot with multiple high-quality cameras on a rig designed to capture a rolling 360-degree view.

But then you have the Unreal scene using virtual graphics. This is where you need camera tracking on the real camera to match it to the virtual scene displayed on the wall. The quality of these virtual graphics is very good, but it’s not quite as good as post-production compositing just yet. This is where our NearTime technology can help.

And finally, you’ve got the challenge of continuity when changing elements or editing Unreal scenes live on set. Imagine you’re on a virtual set and suddenly you decide that you want to move one of the objects on the LED volume to the other side of the screen. When you change something, you need to log what you’ve changed, as it always has a downstream impact on the shoot, and it can cause issues if you then need to remember which other scenes require updating as a result. This is something Mo-Sys is working on solving very soon, with technology that allows on-set real-time editing of Unreal scenes and automatically captures and logs the revisions. Watch this space!

What is Motion Capture Acting?

When the unmistakably quirky Gollum first appeared on our screens back in 2001, audiences were captivated by a unique character conveyed purely through CGI. Although mocap had already been around for a while, this moment marked the shift of motion capture acting into the mainstream.

Andy Serkis’ take on Tolkien’s now-iconic Lord of the Rings character gave the movie industry a true taste of the potential of motion capture acting. The result was a complete revolution, not just for the film industry, but for creating realistic, cinematic gaming experiences, as well as for use in sports therapy, farming and healthcare.

From Middle Earth to Pandora: How motion capture works

Over the last two decades we’ve seen numerous behind-the-scenes images of Serkis jumping around in a mocap suit equipped with retroreflective markers. During the filming of Peter Jackson’s The Lord of the Rings (2001), these retroreflective optical markers allowed motion capture technology to accurately record his facial and body movements through a series of motion tracking cameras. This data was then transferred to a graphics engine to create a ‘skeleton’, which acted as a base for the animated body of Gollum.

This early mocap technique, which is still used today, is known as an ‘outside-in’ system; the cameras look from a perspective ‘outside’ of the mocap environment in towards the movement of the actor. The second, more recent technique (which we’ll explain further below) involves the use of Inertial Measurement Units (IMUs) to capture an actor’s movement regardless of space or location (Xsens’ MVN system is an example of this type of mocap setup).

Performance capture  

Seeing the potential of the technology to enhance productions, a number of companies have since invested in technology that more accurately records facial, hand and finger motion. Known informally as ‘performance capture’ tech, these more targeted systems give motion capture actors a greater degree of creative freedom whilst improving the credibility of their CGI digital characters.

And their increased use in film production has not gone unnoticed. James Cameron’s Avatar (2009), for instance, was highlighted by critics for its innovative use of performance capture tech when creating the ethereal Na’vi. Matt Reeves’s War for the Planet of the Apes (2017), furthermore, was praised for its use of facial motion capture; Andy Serkis, who played the leading role, wore 132 mini retroreflective markers on his face in order to record the exact movement of his facial muscles.

Free-environment motion capture 

In 2011, mocap was upgraded again so it could be taken out of the studio and used on location. This began with Rise of the Planet of the Apes (2011), and allowed actors to fully immerse themselves in mixed reality environments whilst giving producers unparalleled creative freedom. Shooting outside also meant adapting the technology to varying climates. Motion capture suits were made much more robust, with the reflective markers upgraded to pulsed infrared markers so directors could film in bright light (an example of active optical motion capture).

The production teams of Dawn of the Planet of the Apes (2014) and War for the Planet of the Apes (2017) took the tech even further, filming in humid conditions and at night. This came alongside advancements in rendering the textures of fur, skin and eyes, allowing audiences to enjoy some of the most cinematically gripping, photo-realistic visuals yet.

The magic of Motion Capture Acting

The beauty of motion capture as a form of acting lies in the emphasis on physicality and embodied movement and expression. In an interview with WIRED, Serkis explains: “It’s not just about mimicking behaviour. This is about creating a character.” This includes developing the psychological and emotional journeys to pour into the character. “It’s not just about the physical build of the character, but also the internalisation,” explains the actor. “That’s why the misconception that performance capture acting is a genre or type of acting is completely wrong. It’s no different to any process you go through to create a role… the actor’s performance is the actor’s performance.”

The combination of professional acting skills with advancements in mocap technology has led to the development of many memorable CGI characters in recent years. From Serkis’s portrayal of Caesar in Planet of the Apes to Benedict Cumberbatch’s unique take on the dragon Smaug in The Hobbit, motion capture is giving actors more powerful tools to portray characters and, ultimately, enhance storytelling.


If you’re interested in learning more about the future of the film industry, check back regularly for more articles from Mo-Sys Academy. Drawing from years of experience in virtual production for film and TV, and as one of the UK’s leading camera tracking suppliers, we’re aiming to educate the next generation of producers.

What is a jib camera?

A jib camera is simply a camera mounted on a jib, which is a boom or crane device. On the other end of the jib, there’s a counterweight and either manual or automatic controls to direct the position of the camera. The main benefit of using a jib is that it allows camera operators to get shots that would otherwise be difficult to obtain due to obstacles or awkward positions. 

Different types of jib 

Jibs come in many sizes, from small jibs for handheld cameras to enormous booms that can pan over the heads of huge festival crowds. Regardless of the size, however, the purpose remains the same: a camera jib is there to provide producers with stable crane shots.

What is a crane shot? 

A crane shot is a shot taken from a jib camera. Whilst most jibs can move in all directions, they’re valued primarily for their ability to move on a vertical plane. This gives producers the opportunity to emphasise the scale of a set, and is often used either to introduce or close a setting in a film.

Crane shot examples 

La La Land (2016)

The opening scene of Damien Chazelle’s Oscar-nominated La La Land was shot with the use of a camera jib. The scene presented various challenges to camera technicians as the shot weaves around stationary cars and dancers. An added complication was that the freeway it was filmed on was actually slanted, creating panning perspective problems. Regardless, the end result was a success – the scene set the tone for the rest of the film whilst introducing Los Angeles, the central location of the narrative.

Once Upon a Time in Hollywood (2019)

Quentin Tarantino is well known for his use of jibs for panoramic and tracking shots. Most recently, he used them in Once Upon a Time in Hollywood (2019) to place characters in context and add atmosphere. At the end of the ‘Rick’s house’ scene, a large jib camera slowly pulls back across the top of a Hollywood home to reveal the neighbourhood’s quiet night-time roads.

Camera heads for jibs 

To achieve shots like these, operators need to be able to move and adjust the camera at the end of the jib. This can be done either manually through a series of pull-wheels, or automatically with a controller. Either way allows operators to pan, tilt, and zoom the camera.

Camera jibs for virtual production 

Jibs used for virtual production either need to have all axes encoded, or have a tracking system attached to them. This is required to capture camera movement data so that the virtual elements of a shot can be made to move in exactly the same way as the real camera. When it comes to virtual production, which jib you decide to use is extremely important, because any unintended movement (i.e. any unencoded or untracked movement) caused by the jib can cause virtual images to ‘float’ and break the illusion. To counter this, VP jibs need to be heavier, sturdier and more rigid. Mo-Sys’s e-Crane and Robojib were designed specifically with these needs in mind – catering to a growing trend in Virtual Production (VP), Extended Reality (XR), and Augmented Reality (AR).
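As a purely illustrative Python sketch (the names below are hypothetical, not any specific product’s API), this shows the principle: every frame, the encoded or tracked pose and lens state of the real jib-mounted camera is pushed onto the virtual camera, so the virtual elements follow the physical move exactly. Anything the jib does that is not encoded or tracked never reaches the virtual camera, which is exactly what makes graphics appear to ‘float’.

    from dataclasses import dataclass

    @dataclass
    class CameraPose:
        position: tuple          # studio-space translation (x, y, z) in metres
        rotation: tuple          # pan, tilt, roll in degrees
        focal_length_mm: float   # encoded lens zoom
        focus_distance_m: float  # encoded lens focus

    def read_tracking_sample(tracker) -> CameraPose:
        # Placeholder for reading the jib's axis encoders or an attached
        # tracking system (hypothetical interface).
        return tracker.latest_pose()

    def update_virtual_camera(engine_camera, pose: CameraPose) -> None:
        # Drive the render engine's virtual camera with the real camera's
        # pose and lens state so real and virtual movement stay locked.
        engine_camera.set_transform(pose.position, pose.rotation)
        engine_camera.set_lens(pose.focal_length_mm, pose.focus_distance_m)

    # Conceptual per-frame loop while shooting:
    # update_virtual_camera(engine_camera, read_tracking_sample(tracker))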

Mo-Sys Academy is committed to sharing the latest developments in film and broadcasting technology with those looking to enter the field. If you’re interested in learning more, check out our previous articles explaining what a film crew does, and the difference between AR and VR broadcasting.

What is extended reality (XR)?

Extended reality (XR) is a term that is commonly used to describe all environments and interactions that combine real and virtual elements. Whilst XR usually encompasses AR (augmented reality), MR (mixed reality) and VR (virtual reality), it has a more specific meaning when used in relation to film, broadcast, and live entertainment production. In this article, we explain how, and why it’s on course to become a studio staple. 

Extended Reality meaning 

When used as an umbrella term, XR denotes all AR, MR and VR technologies – it’s the overarching label given to systems that integrate virtual and real worlds. In this sense, XR can be applied equally to certain motion capture techniques, augmented reality applications, or VR gaming.

In the production world, however, it means something much more specific. XR production refers to a workflow that combines LED screens, camera tracking systems and powerful graphics engines. Often the LED wall operates with set extensions: tracked AR masks that allow the virtual scene to extend seamlessly beyond the LED wall.

How does XR production work? 

In XR production, a pre-configured 3D virtual environment generated by the graphics engine is displayed on one (or across multiple) high-quality LED screens that form the background to live-action, real-world events. When combined with a precision motion tracking system, cameras are able to move in and around the virtual environment, with the real and virtual elements seamlessly merged and locked together to create the combined illusion.
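The per-frame logic can be sketched as below. This is a conceptual Python outline only; the object names (tracker, engine, led_wall, compositor) are hypothetical stand-ins for the tracking system, graphics engine, LED processor and keyer, not a real product API.

    def xr_frame(tracker, engine, led_wall, compositor, camera_feed):
        # One frame of a simplified XR workflow, as described above.
        pose = tracker.latest_pose()      # precision camera tracking
        scene = engine.render(pose)       # virtual environment from the camera's viewpoint

        # The part of the virtual scene that falls on the physical LED surface
        # is sent to the wall, forming the in-camera background.
        wall_region = led_wall.screen_region(pose)
        led_wall.display(scene.crop(wall_region))

        # Set extension: virtual imagery beyond the wall's edges is rendered as a
        # tracked AR layer and composited over the live camera feed.
        extension_layer = scene.mask(outside=wall_region)
        return compositor.over(camera_feed.frame(), extension_layer)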

The benefits of XR production 

Immersive production and real time edits 

Immersive technology enables actors, hosts, and producers to see the virtual environments whilst shooting. This means that they can adapt their performances or make edits live on set, which reduces time (and budget) spent in post-production.  

Lighting 

Lighting is provided by the LED screens on an XR set. Common illumination helps real-world people and objects blend seamlessly into virtual environments, and further reduces time on set adjusting lighting. 

No Colour Spill or Chroma Key Compositing 

On certain green screen setups, colour spill and the need for chroma key compositing can increase the time spent in post production. Neither is required for XR screens which, again, reduces any time needed in post-production. 

Rapid calibration 

The calibration of camera tracking systems on XR sets takes minutes rather than hours (as can happen with green screen sets). This allows scenes to be shot across multiple sessions with minimal disruption and preparation. However, where set extensions are used or AR objects are added, more precise calibration is still required.

Examples of XR in production 

The Mandalorian 

Having experimented with XR sets in the making of The Lion King (2019), producer Jon Favreau used them to complete half of all scenes for his production of Disney’s The Mandalorian (2019). With a twenty-foot tall LED screen wall that spanned 270°, Favreau’s team filmed scenes across a range of environments, from frozen planets to barren deserts and the insides of intergalactic spaceships. Apart from giving the cast and crew a photo-realistic backdrop to work against, the XR set saved a significant amount of time in production. Rather than continually changing location and studio setup, the design team could rapidly switch out props and partial sets from inside the 75-foot diameter of the volume. 

Dave at the BRIT Awards 

At the 2020 BRIT Awards, an XR setup was used to enhance Dave’s performance of his single ‘Black’. In a collaboration between Mo-Sys, Notch and Disguise, a 3D animation was projected onto Dave’s piano, giving live audiences around the world an engaging visual experience. With effective camera tracking provided by Mo-Sys StarTracker, camera operators could move freely around the stage without any drift or disturbance in the piano’s moving images.

HP OMEN Challenge 2019 

With the rise of eSports, developers are exploring new ways to enhance gaming and bring immersive experiences to ever larger audiences. In 2019, HP did this by broadcasting their OMEN eSports tournament live from an XR stage. There were two main benefits of using extended reality: firstly, audiences around the world could immerse themselves in the virtual environments of the game; secondly, gamers in the studio could review their gameplay from ‘within’ the game. The end result was an interactive, immersive experience that blurred the lines between the real and virtual world. 

Mo-Sys Academy is committed to sharing the latest developments in film and broadcasting technology with those looking to enter the field. If you’re interested in learning more, check out our previous articles explaining what a film crew does, and the difference between AR and VR broadcasting.

What does a film crew do?

As film-making has become more technologically advanced, film crews have changed in composition. Whilst the core remains the same (directors, producers, technicians and camera operators), there are many more roles which have opened up as a result of virtual production (VP). If you’re considering a career in the film industry and are wondering, ‘what does a film crew do?’, read on to learn about the main positions, and how they’re changing as VP progresses.

Film crew roles explained 

Producer 

Producers are the driving force behind productions. They oversee projects from start to finish, making decisions about key concepts, creative processes, and finances. Whilst they’re mostly focused on organisation and operational functions, they also hold sway over script selection, directing, and editing. 

Director 

Directors, on the other hand, are much more involved in the creative side of filmmaking. They set the artistic direction of the production, and guide the technicians on how to achieve it. Alongside deciding on shots and angles, directors oversee casting, set design, and musical score. 

Scriptwriter 

Scriptwriters are often the starting point of film productions. They provide the initial idea of a story, and craft it into a compelling narrative. They shape characters, give them a voice, and make them believable. Although scriptwriters are often in the shadow of producers and directors, they’re a fundamental part of the filmmaking process. 

Production designer 

Working closely with producers and directors, production designers are responsible for the visual concept of a film. This covers location spotting, set design, costume design, lighting, and visual effects. As VP becomes more popular, moreover, production designers are collaborating more with graphic designers and VFX artists in the pre-production phase – ultimately, shortening time and money spent in post-production. 

Production manager 

Reporting to the producer, production managers deal with the day-to-day running of film projects. They’re responsible for hiring film crew members, managing running budgets, and organising shooting schedules. You might consider them the on-the-ground project managers of the filmmaking process. 

Cinematographer / Director of Photography (DP)

The DP transforms the director’s vision into reality through their technical knowledge. Traditionally, they advise on which cameras, lenses, filters, and stock to use to achieve the desired shots. Similarly to production designers, their role is increasingly influenced by VP. They need to be aware of how virtual production systems integrate with standard studio equipment such as lighting and rigging, and how to set up and calibrate VP gear. 

Focus puller 

In traditional filmmaking, a focus puller works alongside a camera operator to manually bring actors and objects into focus at the right time. Moving with the camera, they adjust the lens according to the distance between the camera and the object they want to focus on. They may put markers down before filming (like sellotape on the floor of the set) to help them, or they may just rely on their own spatial awareness as the camera is rolling. Whilst their role is likely to change with the increasing use of VP methods, they’ll remain incredibly important on set. The production team of Jon Favreau’s The Lion King (2019) showed how focus pullers can be integrated into new multi-track filming methods.

Director of Virtual Production (DVP)

As VP progresses, however, it’s becoming more common to hire a separate Director of Virtual Production (DVP), who concentrates solely on VP during a project. Although the role is developing along with the tech, DVPs are generally responsible for managing virtual props, dealing with VFX vendors, overseeing the pre-production phase, and transferring assets to graphics engines.

Digital Image Technician (DIT)

In a broad sense, the DIT ensures that the highest technical standards are maintained when filming. As experts in the latest camera technology and associated software, they advise the DP and DVP on any issues relating to digital (rather than film) recording, including contrast, exposure, framing and focus. They’re also responsible for ensuring that all footage data is stored and managed correctly, making regular backups and transferring files into formats that are accessible to other departments in post-production.

Films crews and changing technology 

As VP technology advances, the traditional roles within film crews are adapting. Whilst directors, producers, and DPs are becoming more aware of the advantages of VP – including greater creative freedom, fewer resources required, and less time needed – new roles are emerging to operate highly specialised VP tech. Ultimately, the days of large film crews moving between locations like travelling circuses are gone – replaced with remote, more agile and technically-enabled teams. 

Want to learn more about virtual production and the film industry? Take a look at articles from Mo-Sys Academy. We’re here to help educate and inform those looking to make their first steps into the world of filmmaking.

What is motion capture and how does it work?

Motion capture (mocap) refers to a group of technologies that record the movements of people and objects and transfer the corresponding data to another application. It’s been used for many purposes, from sports therapy, farming and healthcare to film and gaming. By mapping real-world movement onto computer-generated characters, motion capture allows for photorealistic dynamics in a virtual environment. Here’s how it developed and how it works.

The birth of mo-cap 

The first major step in the development of mocap was taken by American animator Lee Harrison III in the 1960s. Using a series of analogue circuits, cathode ray tubes and adjustable resistors, Harrison devised a system that could record and animate a person’s movement in real-time.

Harrison’s Animac and Scanimate technology was developed in the late 1960s and allowed real-time animations to be created and processed by a computer. With Animac, actors would wear what was described as an ‘electrical harness’ or ‘data suit’ wired up to a computer. Using potentiometers attached to the suit that picked up movements, an actor’s motion could be translated into crude animations on a monitor.

Though the result was fairly basic, it was soon being utilised in various TV shows and advertisements across the States. The abstract images that could be produced with the rudimentary mocap technology of Animac and Scanimate, however, just weren’t good enough to attract mainstream attention.

The development of mo-cap 

The following decades saw improvements on Harrison’s designs, with bodysuits more accurately recording movement. They were also helped by the development of large tracking cameras; as useful as they were, however, each was about the size of a fridge. 

While mocap had been used sparingly in the 1980s and 1990s in films like American Pop (1981) and Cool World (1992), the first film to be made entirely using the technology was Sinbad: Beyond the Veil of Mists (2000). The film was a flop, but its use of mocap was picked up and expanded on by Peter Jackson in the making of The Lord of the Rings trilogy in the early 2000s.

For the first time ever, actors wearing their bodysuits (complete with retroreflective ping-pong balls) could perform alongside their non-animated colleagues in the same scene. Among CG-created characters, The Lord of the Rings’ Gollum is recognised as one of the most impressive Hollywood has ever produced. The combination of the character’s voice and intricate facial expressions performed by Andy Serkis resulted in an unforgettable motion-capture performance. The character and technology were created on the fly by Weta Digital’s Bay Raitt.

Facial capture 

With an increasing awareness of how motion capture techniques can enhance productions, more attention has been given specifically to facial capture. A number of companies have developed highly accurate systems which, when paired with powerful graphics engines, result in life-like, photo-realistic facial images. Cubic Motion (now partnered with Unreal Engine) is one of these, and has worked on a number of high-profile games and virtual experiences, including Spider-Man, League of Legends and Apex Legends.

Motion capture techniques 

Nowadays, there are four main motion capture techniques: 

  • Optical (passive) – With this technique, retroreflective markers are attached to bodies or objects, and reflect light generated from near the camera lens. Once reflected, the light is used to calculate the position of the markers within a three-dimensional space, and recorded. 
  • Optical (active) – This technique works in the same way, but the markers emit light rather than reflect it. The markers therefore require a power source. 
  • Marker-less – This technique doesn’t require markers of any sort. It relies on depth-sensitive cameras and specialised software in order to track and record moving people and objects. Whilst more convenient in some ways, it’s generally considered less accurate than its optical or mechanical-tracking alternatives. 
  • Inertial – This technique doesn’t necessarily need cameras to operate. It records movement through IMUs (inertial measurement units), which contain sensors to measure rotational rates. The most common sensors used in IMUs are gyroscopes, magnetometers, and accelerometers (see the sketch after this list). 
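
As a minimal illustration of the inertial approach (an assumed, simplified Python example rather than how any particular suit actually works), the sketch below fuses a gyroscope rate with an accelerometer’s gravity reading using a complementary filter to estimate the pitch of a single body segment:

    import math

    def complementary_filter(pitch_deg, gyro_rate_dps, accel_x, accel_y, accel_z,
                             dt=0.01, alpha=0.98):
        # Fuse one gyroscope axis with an accelerometer tilt estimate.
        # gyro_rate_dps : angular rate around the pitch axis, degrees per second
        # accel_*       : accelerometer reading in g, used as a gravity reference
        # alpha         : how much we trust the integrated gyro vs. the accelerometer

        # Integrate the gyroscope: accurate short-term, but drifts over time.
        gyro_pitch = pitch_deg + gyro_rate_dps * dt
        # The accelerometer gives an absolute (but noisy) tilt from gravity.
        accel_pitch = math.degrees(math.atan2(accel_x,
                                              math.sqrt(accel_y**2 + accel_z**2)))
        # Blend the two: gyro for responsiveness, accelerometer to cancel drift.
        return alpha * gyro_pitch + (1 - alpha) * accel_pitch

    # Example: a body segment pitched slightly, gyro reporting 5 deg/s of rotation.
    pitch = 0.0
    pitch = complementary_filter(pitch, gyro_rate_dps=5.0,
                                 accel_x=0.10, accel_y=0.0, accel_z=0.99)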

What is motion tracking used for? 

Motion tracking and capture has a broad range of uses across various industries, including: 

  • Film and Gaming – Motion capture is used to record the movement of actors and transfer them onto virtual or computer-augmented characters.
  • Sports Therapy and Healthcare – Health professionals can use motion capture to analyse the movement of patients and diagnose problems, e.g. gait analysis. 
  • Military – When combined with virtual reality, motion capture technologies have been used to enhance military training experiences. 

If you’re interested in learning more about the future of the film industry, check back regularly for more articles from Mo-Sys Academy. Drawing from years of experience in virtual production for film and TV, and as one of the UK’s leading camera tracking suppliers, we’re aiming to educate the next generation of producers.