Mo-Sys Enables Spectacle for Live Music Extravaganza

Little Mix put into virtual stadium setting thanks to precision camera tracking

Mo-Sys took a central role in the Covid-safe production of a live online entertainment awards show. Normally held in a stadium venue, the show this year was produced virtually, with production company and media technologists Bild Studios using virtual reality to retain the scope and excitement of the physical shows of previous years.

Under the lead of Creative Director Paul Caslin and Production Designer Julio Himede from Yellow Studio, Bild created an enormous 360° virtual stadium clad with LED screens all around – a design that wouldn’t have been achievable in the real world but was more than possible in a virtual environment. Bild’s unique content workflows allowed the show content applied to the virtual LED screens to be operated live, on a cue-by-cue basis – just as in a physical environment. Hosts Little Mix were shot in a large green screen studio, which, thanks to Mo-Sys VP Pro virtual studio software combined with Unreal Engine graphics, became filled with cheering music fans. The links were shot with three Arri cameras: one on a crane, one on a rail and one on a tripod.

Each camera was fitted with a Mo-Sys StarTracker camera tracking system. StarTracker captures a camera’s precise position and orientation in six axes thanks to infrared cameras tracking reflective dots – stars – on the ceiling. By also tracking the lens focus, zoom, and distortion, the live talent can be perfectly and seamlessly combined with the virtual world, irrespective of camera movement or lens adjustment. All composited camera outputs were created in real time, ready for the director to live-cut to, making the camera work for the virtual production environment not too dissimilar to that of any other live broadcast show.
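
For readers newer to camera tracking, the sketch below illustrates the kind of per-frame data such a system produces and how a renderer might apply it to a virtual camera. It is a simplified, hypothetical Python example; the field names and structure are assumptions for illustration, not Mo-Sys’s actual data format or API.

```python
# A simplified, hypothetical sketch (not Mo-Sys's actual data format or API):
# one per-frame tracking sample, and how a renderer could apply it to a
# virtual camera so the CG world moves exactly like the physical camera.
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """Hypothetical per-frame packet from an optical camera tracking system."""
    timecode: str            # e.g. "10:23:14:05", used to sync with the video frame
    position: tuple          # (x, y, z) studio-space position in metres
    rotation: tuple          # (pan, tilt, roll) in degrees
    focus: float             # normalised lens focus encoder value, 0.0-1.0
    zoom: float              # normalised lens zoom encoder value, 0.0-1.0
    distortion_k1: float     # first-order lens distortion coefficient

def apply_to_virtual_camera(sample: TrackingSample, virtual_cam: dict) -> dict:
    """Copy the tracked pose and lens state onto the virtual camera."""
    virtual_cam["position"] = sample.position
    virtual_cam["rotation"] = sample.rotation
    virtual_cam["focus"] = sample.focus
    virtual_cam["zoom"] = sample.zoom
    virtual_cam["distortion_k1"] = sample.distortion_k1
    return virtual_cam

sample = TrackingSample("10:23:14:05", (1.2, 0.4, 1.8), (15.0, -3.5, 0.0), 0.62, 0.30, -0.05)
print(apply_to_virtual_camera(sample, {}))
```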

“Producing the real-time VFX content and supplying the technical engineering for this event was a very great pleasure,” said David Bajt, Co-Founder & Director at Bild Studios. “Special thanks go to Mo-Sys Engineering for the amazing camera tracking and virtual production technology.”

Mo-Sys CTO James Uren, who acted as technical consultant on the production, explained, “This was a very fast turnaround production – just five days for any post-production work required – so we needed to get all the virtual studio materials in camera. Because we capture all the component video elements (green screen, key, and graphics for each camera) plus all camera movement and lens data for each camera, for maximum flexibility we could offer Bild the option of three workflows.

“First, we had the combined real and virtual elements from the three cameras, with the option of cleaning up the greenscreen in post,” he explained. “Second, if there had been a problem with the graphics, we could keep the live shots and replace the virtual background. And third, the post team could just go crazy and change everything, whilst still keeping the same camera/lens movements around Little Mix.”

Mo-Sys VP Pro is powerful software that plugs directly into the Unreal Engine Editor. It receives the camera and lens tracking data from StarTracker, synchronises this with the camera video and the Unreal Engine virtual graphics, and frame-accurately merges the two image streams to provide a real-time, high-quality composite output. VP Pro’s uncompromised graphics power was designed with the specific needs of live virtual production in mind. For the event, each Arri camera had its own VP Pro and Unreal Engine instance. This gave the producers the output quality they demanded, with very low latency. “This is not pre-visualisation,” Uren explained. “This is premium production quality.”
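
As a rough illustration of the frame-accurate merge described above, the sketch below pairs video frames with tracking samples by a shared timecode, renders the virtual view for that pose, and composites the two. This is a hypothetical Python outline, not the VP Pro implementation; the function names and data shapes are invented for clarity.

```python
# A hypothetical outline of a frame-accurate merge (not the VP Pro implementation):
# pair each video frame with the tracking sample carrying the same timecode,
# render the virtual background from that pose, then composite the two.
def composite(frame, graphics):
    # Stand-in for a keyer: in practice the green screen is keyed out and the
    # talent is layered over the rendered background.
    return {"background": graphics, "foreground": frame}

def frame_accurate_merge(video_frames, tracking_samples, render_graphics):
    """Merge live video with rendered graphics, matched frame by frame on timecode."""
    tracking_by_tc = {s["timecode"]: s for s in tracking_samples}
    output = []
    for frame in video_frames:
        sample = tracking_by_tc.get(frame["timecode"])
        if sample is None:
            continue  # skip (or hold) frames with no matching tracking data
        graphics = render_graphics(sample)   # virtual view for this exact camera pose
        output.append(composite(frame, graphics))
    return output

# Tiny usage example with stand-in data:
frames = [{"timecode": "01:00:00:01", "pixels": "..."}]
samples = [{"timecode": "01:00:00:01", "position": (0.0, 0.0, 1.8)}]
print(frame_accurate_merge(frames, samples, lambda s: f"render@{s['position']}"))
```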

Virtual Production Demonstration

Rising Sun Pictures recently took part in a proof-of-concept demonstration showcasing leading-edge virtual production technology including Mo-Sys camera tracking software.

The South Australian film and television production industry is defying time, space and budget blowouts, all without leaving the studio. South Australian business Novatech, in partnership with Intraware and South Australian post-production and visual effects company Rising Sun Pictures, recently gave a hands-on demonstration showcasing the most recent innovations in virtual production technology. The South Australian Film Corporation hosted the event at its Adelaide Studios facilities, featuring a purpose-built 12m x 3.6m curved LED wall, with real-time virtual set generation by disguise and Mo-Sys camera tracking. Participants gained first-hand insight into technology that integrates real-time computer-generated backgrounds with live performance and integrated foreground lighting.

Read more on the Novatech website and on Shoot Publicity Wire.

Top banner photo credits: David Solm & Ian Cope

The Reality of the Situation

“It means that a studio can be readily converted from a product launch to a dealer webinar to a news broadcast with just a press of a button.” CEO Michael Geissler talks to TVB Europe about our innovative StarTracker camera tracking technology.

Even before the changes in production that were needed to cope with the effects of Covid-19, the family of realities – augmented reality, virtual reality and mixed reality – was becoming more and more common in the broadcast landscape. Recent events have created new opportunities for the technology. Philip Stevens takes a look at a few of the recent developments in the ‘realities’ field and gets immersed in a virtual environment in this article in the latest edition of TVB Europe.

The article takes an in-depth look at the type of equipment required by a virtual production studio. It describes how Mo-Sys StarTracker revolutionised VR and AR production, and details the components of the new StarTracker Studio – the first preassembled production package, launched in July this year, that brings 4K Ultra HD virtual production within the reach of just about any studio.

“It means that a studio can be readily converted from a product launch to a dealer webinar to a news broadcast with just a press of a button,” explains Michael Geissler, CEO of Mo-Sys.

“The system is for film as well as TV workflows, allowing filming of commercials and instant social media responses. Like the original system, it tracks the position of each camera in three-dimensional space across all six axes of movement. But beyond that, StarTracker Studio bundles the tracking with cameras and mounts and a high-performance virtual graphics system based on the latest version of the Unreal Engine and the Ultimatte keyer. The unique plug-in interfaces directly between the camera tracking and the Unreal Engine, for extreme precision with virtually no latency.”

Read the article in full in the TVB Europe November/December issue below:

TVB Europe 78, November/December 2020 – Mo-Sys StarTracker Studio

Top Banner Image: The On-Set Facilities virtual production crew on a film set in Cardiff, real-time compositing with Unreal Engine.

Extended Reality in the Virtual Environment

EVOKE Studios talks to TVB Europe about how they used our StarTracker “set and forget” solution at the AIM Awards 2020 to provide precise camera tracking and build complex extended reality environments.

EVOKE Studios is a new venture that brings together established designers and engineers to extend what is possible on stage. Its founder, Vincent Steenhoek, talks to TVB Europe about how the EVOKE team used the Mo-Sys StarTracker ‘set and forget’ camera tracking solution at this year’s AIM Awards, from the Association for Independent Music, on 12 August 2020.

“Planning a project like the AIM Awards means we need to do considerable testing and plan with enough contingency. It also means that we need the supporting technology – like camera tracking – to just work. It has to be completely stable, no questions asked.

“That is why we used StarTracker from Mo-Sys. It is a ‘set and forget’ solution which uses dots on the ceiling that light up under ultraviolet light (that’s why they call it StarTracker); once the star pattern is captured, it is there forever.

“We made the AIM videos at the new Deep Space studios at Creative Technology in Crawley. They have StarTracker set up on a jib, with all the connections to the lens and grip. What StarTracker gives us is a constant stream of extremely precise camera locations in three-dimensional space, along with camera rotation and – from lens sensors – zoom and aperture (for depth of field). We used its translation, rotation and lens data inside the graphics servers (from disguise) to drive virtual cameras in Notch.

“Our experience with StarTracker is that it gives us ultra-reliable, highly accurate positional data with no encoder drift to speak of, and ultra-low latency, so we can create virtual elements and match them to the scene captured live by the camera. As XR consists of a lot of moving parts and separate calibrations, it helps a lot if there are parts of the process that are a constant.

“For EVOKE as a creative studio and systems integrator, we are enabled to do what we do by companies like Mo-Sys. In turn, building on technologies like StarTracker enables awards shows like the AIMs to be presented in broadcast-quality virtual environments.”

Read the article in full in the TVB Europe November/December issue below:

TVB Europe 78, November/December 2020

Top Banner Image: Opening beach environment for Little Simz performing Pressure

US Election Coverage by TV2 Denmark

TV2 Denmark created an immersive experience for viewers of the 2020 US elections using a greenscreen UE4 studio set with data-driven graphics spanning several virtual locations, giving viewers a ‘sense of being there’.

The virtual reality graphics were created using Mo-Sys StarTracker technology and the Unreal Engine plugin, Mo-Sys VP Pro.

Check out the short video featuring highlights from the 24-hour broadcast.

What is Motion Capture Acting?

When the unmistakably quirky Gollum first appeared on our screens back in 2001, audiences were captivated by his unique character conveyed purely through CGI. Although MoCap had already been around for a while, this moment marked the shift of motion capture acting into the mainstream.

Andy Serkis’ take on Tolkien’s now iconic Lord of the Rings character gave the movie industry a true taste of the potential of motion capture acting. The result was a complete revolution, not just for the film industry but also for creating realistic and cinematic gaming experiences, and for use in sports therapy, farming and healthcare.

From Middle Earth to Pandora: How motion capture works

In the last two decades we’ve seen numerous ‘behind the scenes’ images of Serkis jumping around in a MoCap suit equipped with retro-reflective markers. During the filming of Peter Jackson’s Lord of the Rings (2001), these retro-reflective optical markers allowed motion capture technology to accurately record his facial and body movements through a series of motion tracking cameras. This data was then transferred to a graphics engine to create a ‘skeleton’ which acted as a base for the animated body of Gollum.

This early MoCap technique, which is still used today by some, is known as an ‘outside-in’ system: the cameras look from a perspective ‘outside’ of the MoCap environment, in towards the movement of the actor. The second, more recent technique (which we’ll explain further below) involves the use of Inertial Measurement Units (IMUs) to capture an actor’s movement regardless of space or location (Xsens’ MVN system is an example of this type of MoCap setup).
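
To make the ‘inside-out’ idea concrete, the sketch below shows the core of inertial capture in miniature: integrating a gyroscope’s angular velocity over time to recover a joint angle, with no external cameras involved. It is a deliberately simplified, hypothetical Python example; real suits such as Xsens’ fuse accelerometer and magnetometer data and correct for drift, which this omits.

```python
# A deliberately simplified, hypothetical sketch of inertial ("inside-out") capture:
# integrating a gyroscope's angular velocity over time recovers a joint angle
# without any external cameras. Real suits also fuse accelerometer and
# magnetometer data and correct for drift, which this example omits.
def integrate_gyro(angular_velocities_dps, dt, start_deg=0.0):
    """Accumulate gyro readings (degrees per second) into a joint angle over time."""
    angle = start_deg
    angles = []
    for omega in angular_velocities_dps:
        angle += omega * dt          # simple Euler integration per sample
        angles.append(angle)
    return angles

# Five samples of an elbow rotating at 90 deg/s, sampled at 100 Hz:
print(integrate_gyro([90.0] * 5, dt=0.01))   # angle grows by 0.9 degrees per sample
```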

Performance capture  

Seeing the potential of the technology to enhance productions, a number of companies have since invested in technology that more accurately records facial, hand, and finger motion. Known informally as ‘performance capture’ tech, these more targeted systems give motion capture actors a greater degree of creative freedom whilst improving the credibility of their CGI digital characters.

And their increased use in film production has not gone unnoticed. James Cameron’s Avatar (2009), for instance, was highlighted by critics for its innovative use of performance capture tech in creating the ethereal Na’vi. Matt Reeves’s War for the Planet of the Apes (2017), furthermore, was praised for its use of facial motion capture; Andy Serkis, who played the leading role, wore 132 mini retro-reflective markers on his face in order to record the exact movement of his facial muscles.

Free-environment motion capture 

In 2011, MoCap was upgraded again so it could be taken out of the studio and used on location. This began with Rise of the Planet of the Apes (2011), and allowed actors to fully immerse themselves in mixed reality environments whilst giving producers unparalleled creative freedom. Shooting outside also meant adapting the technology to varying climates. Motion capture suits were made much more robust, with the reflective markers replaced by infrared-pulsing active markers so directors could film in bright light (an example of active optical motion capture).

The production teams of Dawn of the Planet of the Apes (2014) and War for the Planet of the Apes (2017) took the tech even further, filming in humid conditions and at night. This came alongside advancements in the rendering of fur, skin and eye textures, allowing audiences to enjoy some of the most cinematically gripping photo-realistic visuals yet seen.

The magic of Motion Capture Acting

The beauty of motion capture as a form of acting lies in its emphasis on physicality, embodied movement and expression. In an interview with WIRED, Serkis explains, “It’s not just about mimicking behaviour. This is about creating a character.” This includes developing the psychological and emotional journeys to pour into the character. “It’s not just about the physical build of the character, but also the internalisation,” explains the actor. “That’s why the misconception that performance capture acting is a genre or type of acting is completely wrong. It’s no different to any process you go through to create a role… the actor’s performance is the actor’s performance.”

The combination of professional acting skills with advancements in MoCap technology has led to the development of many memorable CGI characters in recent years. From Serkis’s portrayal of Caesar in Planet of the Apes to Benedict Cumberbatch’s unique take on the dragon Smaug in The Hobbit, motion capture is giving actors more powerful tools to portray characters and, ultimately, enhance storytelling.


If you’re interested in learning more about the future of the film industry, check back regularly for more articles from Mo-Sys Academy. Drawing on years of experience in virtual production for film and TV, and as one of the UK’s leading camera tracking suppliers, we aim to educate the next generation of producers.

Visualising the AR Roadmap

Commercial Director Mike Grieve talks to BroadcastPro Middle East about our StarTracker technology and explores the impact augmented reality is having on the broadcast industry.

Augmented Reality (AR) has brought new life to traditional screens by enabling broadcasters to extend their storytelling, thereby changing how consumers interact with content. BroadcastProME asks industry professionals to share their vision of how they foresee AR changing the face of broadcasting in various scenarios and how their own respective solutions can enhance the viewing experience.

Mike Grieve
Commercial Director, Mo-Sys

AR graphics are already used extensively in news, sports and weather storytelling. Children’s TV and mainstream drama – content that can be sold multiple times over to other broadcasters – are probably where AR graphics provide the greatest return, both by enabling content that otherwise wouldn’t be possible and by making content that would otherwise be cost-prohibitive.

Mo-Sys manufactures StarTracker, a precision camera and lens tracking system, which is used to blend AR graphics with the real world. The system is also used for virtual studios, mixed reality, and extended reality. Mo-Sys also manufactures an Unreal Engine plugin called VP Pro, which enables all types of virtual production, and a range of encoded remote camera heads for wider, more varied AR shots.

AR graphics primarily require all the cameras used to be tracked, and all lenses on the cameras to have been profiled. Once this is done, one can choose which virtual studio software to use to synchronise and composite the AR graphics with the real world: either a traditional virtual studio software package with automation and playout triggering or, where this isn’t required, an Unreal Engine plugin.
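
As a concrete illustration of what ‘profiling a lens’ can involve, the hypothetical sketch below stores field of view and distortion measured at a few zoom positions and interpolates between them at run time, so the virtual camera can match the real lens. The table values and function are invented for illustration and are not taken from any real lens or Mo-Sys tool.

```python
# A hypothetical illustration of lens profiling: field of view and distortion
# are measured at a few zoom positions, then interpolated at run time so the
# virtual camera can match the real lens. Values are invented for illustration.
import bisect

# (zoom encoder value, horizontal field of view in degrees, k1 distortion)
LENS_PROFILE = [
    (0.0, 65.0, -0.12),
    (0.5, 32.0, -0.05),
    (1.0, 11.0, -0.01),
]

def lens_state(zoom):
    """Linearly interpolate (FOV, k1) for the current zoom encoder value."""
    zooms = [row[0] for row in LENS_PROFILE]
    i = bisect.bisect_left(zooms, zoom)
    if i == 0:
        return LENS_PROFILE[0][1:]
    if i >= len(LENS_PROFILE):
        return LENS_PROFILE[-1][1:]
    (z0, fov0, k0), (z1, fov1, k1) = LENS_PROFILE[i - 1], LENS_PROFILE[i]
    t = (zoom - z0) / (z1 - z0)
    return (fov0 + t * (fov1 - fov0), k0 + t * (k1 - k0))

print(lens_state(0.25))   # halfway between the first two rows: (48.5, -0.085)
```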

The biggest decision to make is whether the graphics operations team should be experts in Unreal, or experts in a traditional broadcast virtual studio software. This will determine the type of software that can be used to deliver the AR graphics. Choosing the camera, lens tracking system, and the camera grip comes after.

In terms of where AR is headed, greater photo-realism using technologies such as ray tracing is the obvious one. We will also begin to see more photo-realistic avatars, human or otherwise, driven by actors in motion capture suits with facial expression headsets, interacting with real actors.

The aim of broadcasters deploying AR is to create highly immersive content that’s visually appealing, which is ‘sticky’ in terms of viewer numbers and viewer duration, whilst also providing differentiation from the competition. The longer-term goal is for broadcasters to use AR to create increasingly sophisticated photo-realistic content that wouldn’t otherwise be possible.

BroadcastPro Middle East 124 November 2020

What is a jib camera?

A jib camera is simply a camera mounted on a jib, which is a boom or crane device. On the other end of the jib, there’s a counterweight and either manual or automatic controls to direct the position of the camera. The main benefit of using a jib is that it allows camera operators to get shots that would otherwise be difficult to obtain due to obstacles or awkward positions. 

Different types of jib 

Jibs come in many sizes, from small jibs for handheld cameras to enormous booms that can pan over the heads of huge festival crowds. Regardless of the size, however, the purpose remains the same: a camera jib is there to provide producers with stable crane shots.

What is a crane shot? 

A crane shot is a shot taken from a jib camera. Whilst most jibs can move in all directions, they’re valued primarily for their ability to move on a vertical plane. This gives producers the opportunity to emphasise the scale of a set, and is often used either to introduce or to close a setting in a film.

Crane shot examples 

La La Land (2016)

The opening scene of Damien Chazelle’s Oscar-nominated La La Land was shot using a camera jib. The scene presented various challenges to camera technicians as the shot weaves around stationary cars and dancers. An added complication was that the freeway it was filmed on was actually slanted, creating panning perspective problems. Regardless, the end result was a success – the scene set the tone for the rest of the film whilst introducing Los Angeles, the central location of the narrative.

Once Upon a Time in Hollywood (2019)

Quentin Tarantino is well known for his use of jibs for panoramic and tracking shots. Most recently, he used them in Once Upon a Time in Hollywood (2019) to place characters in context and add atmosphere. At the end of the ‘Rick’s house’ scene, a large jib camera slowly pulls back across the top of a Hollywood home to reveal the neighbourhood’s quiet night-time roads.

Camera heads for jibs 

To achieve shots like these, operators need to be able to move and adjust the camera at the end of the jib. This can be done either manually through a series of pull-wheels, or automatically with a controller. Either way allows operators to pan, tilt, and zoom the camera.

Camera jibs for virtual production 

Jibs used for virtual production either need to have all axes encoded, or need a tracking system attached to them. This is required to capture camera movement data so that the virtual elements of a shot can be made to move in exactly the same way as the real camera. When it comes to virtual production, which jib you decide to use is extremely important, because any unintended movement (i.e. any unencoded or untracked movement) caused by the jib can cause virtual images to ‘float’ and break the illusion. To counter this, VP jibs need to be heavier, sturdier, and more rigid. Mo-Sys’s e-Crane and Robojib were designed specifically with these needs in mind – catering to a growing trend in virtual production (VP), extended reality (XR), and augmented reality (AR).
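
The short, hypothetical Python sketch below illustrates why untracked movement matters: the virtual camera can only follow the motion that the encoders or tracking system actually report, so any real movement that never reaches the data shows up as the background lagging behind the live image, or ‘floating’. The numbers are made up purely for illustration.

```python
# A hypothetical, deliberately tiny example of the point above: the renderer can
# only follow the motion that the encoders or tracking system actually report.
# Any real movement that never reaches the data shows up as the virtual
# background lagging behind the live image, i.e. "floating". Numbers are made up.
def virtual_camera_pan(tracked_pan_deg):
    """The virtual camera is driven purely by the reported tracking data."""
    return tracked_pan_deg

real_pan = 10.0      # the jib actually panned 10 degrees
tracked_pan = 9.2    # flex or slippage in an untracked joint loses 0.8 degrees

error = real_pan - virtual_camera_pan(tracked_pan)
print(f"Background lags the real camera by {error:.1f} degrees -> visible float")
```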

Mo-Sys Academy is committed to sharing the latest developments in film and broadcasting technology with those looking to enter the field. If you’re interested in learning more, check out our previous articles explaining what a film crew does, and the difference between AR and VR broadcasting.