Extended Reality: Mo-Sys and Netick Group Evolution breaking down the fourth wall

Whether used in film, live broadcast, or corporate conferences, Mo-Sys’s technology helps production teams create an immersive experience for the audience. In the latest edition of our ongoing LiveLab series, we spoke to Netick Group Evolution and disguise about how they’re using our StarTracker system as part of an engaging LED wall workflow. In case you missed it, here are the main takeaways.

xR – The future of production? 

The Italy-based company Netick Group Evolution uses Mo-Sys’ StarTracker system as part of their xR (extended reality) stage, which can be used across film, broadcast, and in-person live events.

Their LED wall workflow draws on three separate systems to produce an augmented set that’s as enveloping for the audience as it is usable for hosts, presenters, and actors. The three systems that overlap to provide this solution are camera tracking (by Mo-Sys), a real-time graphics engine (by Notch), and media server playback (by disguise).

By seamlessly linking these three components and pairing them with a high-quality LED wall, Netick offers producers an incredibly versatile and reliable tool with which to create virtually enhanced productions.
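To make the division of labour concrete, here is a minimal, purely illustrative sketch of that per-frame pipeline in Python. Every name below is a hypothetical placeholder rather than a real vendor API: the tracking stage supplies a camera pose, the render stage draws the virtual background from that pose, and the playback stage maps the result onto the LED wall.

```python
# Purely illustrative sketch of the three-stage xR pipeline; the function
# names are hypothetical placeholders, not real Mo-Sys/Notch/disguise APIs.
from dataclasses import dataclass

@dataclass
class CameraPose:
    """One tracking sample: position in metres, rotation in degrees."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

def read_tracking_sample() -> CameraPose:
    # Stage 1 -- camera tracking (e.g. StarTracker): one absolute pose per frame.
    return CameraPose(0.0, 1.6, -2.0, 12.5, -3.0, 0.0)

def render_background(pose: CameraPose) -> bytes:
    # Stage 2 -- graphics engine (e.g. Notch): render the virtual scene from
    # the tracked camera position so the background perspective stays correct.
    return f"frame rendered at pan={pose.pan}".encode()

def play_out(frame: bytes) -> None:
    # Stage 3 -- media server (e.g. disguise): map the rendered frame
    # onto the physical LED wall.
    print(f"LED wall <- {len(frame)} bytes")

# Per-frame loop; in a real system all three stages are genlocked.
for _ in range(3):
    play_out(render_background(read_tracking_sample()))
```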

An alternative to green screen…

An xR set is a viable alternative to using a green screen for virtual production. In essence, such a set simply comprises three LED walls at right angles, forming a corner space. This area then serves as the central position for the live presenters or actors, with the background displayed on the LED screens behind them and extended beyond the physical walls using virtual set extension.

To reliably track the absolute position of cameras in the space, the Mo-Sys StarTracker system is used. Retro-reflective ‘stars’ are attached to the studio ceiling (or floor) and a sensor is attached to the top of the camera. Within a matter of minutes, the cameras are calibrated to the LED walls and virtual environment, and shooting can commence.
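Downstream, the render engine and media server consume this tracking data as a stream of per-frame pose packets. One widely used wire format for camera tracking, which Mo-Sys systems can output, is the FreeD protocol: a 29-byte UDP packet carrying pan, tilt, roll, position, zoom, and focus. As a rough sketch, assuming the published FreeD ‘type D1’ layout (check your own system’s documentation for the exact scaling), a parser might look like this:

```python
def _s24(b: bytes) -> int:
    """Sign-extend a big-endian 24-bit integer."""
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & (1 << 23) else v

def parse_freed_d1(pkt: bytes) -> dict:
    """Parse one 29-byte FreeD 'type D1' camera pose packet."""
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    # Checksum: 0x40 minus the sum of the first 28 bytes, modulo 256.
    if (0x40 - sum(pkt[:28])) % 256 != pkt[28]:
        raise ValueError("bad checksum")
    return {
        "camera_id": pkt[1],
        # Angles: signed 24-bit with 15 fractional bits -> degrees.
        "pan":  _s24(pkt[2:5]) / 32768.0,
        "tilt": _s24(pkt[5:8]) / 32768.0,
        "roll": _s24(pkt[8:11]) / 32768.0,
        # Positions: signed 24-bit in units of 1/64 mm -> millimetres.
        "x": _s24(pkt[11:14]) / 64.0,
        "y": _s24(pkt[14:17]) / 64.0,
        "z": _s24(pkt[17:20]) / 64.0,
        # Zoom and focus are raw lens encoder counts (scaling is lens-specific).
        "zoom":  int.from_bytes(pkt[20:23], "big"),
        "focus": int.from_bytes(pkt[23:26], "big"),
    }
```

In practice such packets arrive over UDP at the video frame rate, one per frame, so the virtual background always matches the live camera move.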

While StarTracker can work with both green screens and LED walls, the latter offers some interesting benefits:

  • Actors and presenters can actually see and respond to virtual environment elements displayed on the LED screens.
  • The lights, shadows, and reflections rendered on the LED screens are real.
  • There’s no ‘green spill’ around actors or presenters.
  • There’s no need for ‘comfort monitors’ in studios, reducing costs and increasing space.

The uses of xR sets 

Aside from representing a viable alternative to green screens in professional studios, xR sets may also prove particularly useful to a range of other sectors in the coming months. With social distancing rules likely to be maintained for the foreseeable future, they could be utilised in everything from live university lectures to corporate presentations and religious ceremonies. By allowing presenters to visualise and interact with a computer-generated environment, xR sets provide a convincing, engaging way of communicating with different types of audiences. 

To watch the webinar in full, please click here.

If you’re interested in the potential of xR sets for your production, please get in contact with one of our expert technicians. They’ll be able to advise on your specific needs and provide a bespoke solution that allows you to create engaging, computer-augmented broadcasts. 

Tracking camera movement with Panasonic’s AW-UE150: A Mo-Sys collaboration

Mo-Sys and Panasonic have combined forces to produce the first PTZ camera head with reliable absolute tracking data. The AW-UE150 brings together Mo-Sys’ innovative StarTracker system with Panasonic’s latest PTZ developments to provide producers with a highly versatile camera head for virtual production, in both broadcast and film.

As part of its ongoing LiveLab series, Mo-Sys and Panasonic jointly demonstrated the capabilities of the AW-UE150 and fielded questions from global attendees across the broadcast, film, and corporate video sectors. For those who missed it, here are the main takeaways.

A creative collaboration 

Mo-Sys’ partnership with Panasonic goes back quite some way. Sixteen years ago, the teams paired up to create the AW-PH300 – a pan and tilt system with camera tracking capabilities. Moving with the times whilst pushing the boundaries of production technology, our latest efforts have culminated in one of the most agile VP-enabled camera heads on the market.

As the latest iteration in Panasonic’s PTZ series, the AW-UE150’s specifications are impressive. Able to shoot in 4K or HD, it has a 20x optical zoom and a 75.1-degree viewing angle, making it ideal for shooting high-quality video from small spaces.

Perhaps its most innovative aspect, however, is how it integrates the Mo-Sys StarTracker system. Rather than simply linking together separate pieces of technology, the new head combines the various units into one – ultimately providing producers with an incredibly simple yet high-quality virtual production system.

The Benefits

There are numerous advantages to incorporating the StarTracker system within the camera head. Here are the main ones: 

  • Movement – Traditionally, a camera head used for virtual production is fixed in place, able only to pan and tilt across a scene. With the StarTracker system, Panasonic’s camera head can move freely around a virtual or augmented environment, providing producers with unparalleled creative freedom in the studio.

  • Calibration – There’s no need for lengthy calibration processes. Once set up, the AW-UE150 stores its calibration data, meaning that broadcasters are able to use their virtual set without continual delays for recalibrating and homing the system.

  • Simplicity – Whether mounted on a jib, crane, or dolly, the camera head can be controlled entirely by a single operator (particularly useful for studios encountering post-COVID-19 safety restrictions), or operated remotely.

The potential of Panasonic’s camera head

Mo-Sys’ technology has been used across a range of formats to give broadcasters greater creative freedom and drive engagement with viewers. In the States, The Weather Channel has used StarTracker to produce visually spectacular, immersive weather reports, whilst in the Netherlands, NEP is utilising our tech to enhance live sports and entertainment programmes.

Panasonic’s AW-UE150 is expected to contribute to this push towards virtual set usage, representing an affordable option for small and large studios alike.

Jaume Miró, Project Manager for Panasonic Business, reflects on the potential of the AW-UE150, saying:

“By having a camera tracking system like StarTracker, you do not have to worry about recalibrating the camera; you can move the camera around, either on a tripod, on a Panapod, on a dolly or on a jib. And the result is immersive content that gets the most out of the virtual graphics… StarTracker preserves the simplicity of the AW-UE150 for AR/VR, combining the tracking data from the camera with the absolute spatial position in a single stream of data. And it does so in a form factor that fits perfectly underneath the camera, matching the camera’s color and shape.”


If you’re currently considering virtual production, for either broadcast or film, please don’t hesitate to get in touch. Our expert technicians will be on hand to answer any questions you may have about virtual production, and offer the most viable solution for your project. 

LiveLab Stories: Netick Group and disguise

Exploring LED-Wall Technology for Real-Time Production with Netick Group and disguise

In this LiveLab Stories webinar we explore real-time immersive video environments and how they can transform your live productions.

With insights from Netick Group, disguise and Mo-Sys, we take a closer look at the entire workflow, understanding how our camera tracking, Notch’s real-time rendering engine and disguise’s media server all come together for LED-wall workflows.

Virtual Production for the next generation of filmmakers

Located in the heart of Universal Studios, Florida, DAVE School develops the skills of budding Visual Effects, Production, and Game Design professionals. It draws on the experience of its award-winning alumni as well as the latest developments in production technology to give students a head start once they enter the industry.

A recent project put together by DAVE School and On-Set Facilities saw students from the Game and VFX programmes collaborate to produce a sci-fi-inspired movie trailer, titled Deception. Shot exclusively with the Mo-Sys VP workflow, the 2-minute trailer depicts a high-energy intergalactic battle scene in which the protagonists are ultimately stranded on a deserted, barren planet.

DECEPTION Trailer:

Trailer release of the first Virtual Production by our GAME and VFX students filmed in real-time alongside On-Set Facilities on our new sound stage at Universal Studios Florida. More to come. #onlyatDAVE #UE4 #talent

Posted by DAVE School on Wednesday, 15 April 2020

Real-time virtual production

DAVE School students employed real-time virtual production methods in the making of Deception to reduce time spent in post-production and improve the overall quality of the trailer. The entire studio set-up and Mo-Sys VP system were put together by virtual production specialists On-Set Facilities and Mo-Sys’ own virtual production engineer, Marcus Masdammer.

Image Credit: On-Set Facilities

Real-time virtual production relies on various technologies to give filmmakers the ability to shoot and edit virtual or augmented reality scenes whilst on location. Effects, graphics, and animations can all be created before shooting takes place, meaning that producers and actors can visualise a near-finished scene during filming, as soon as each take is completed on set. This not only gives a greater level of creative freedom to all involved, but also reduces production time and costs, as post-production editing is minimised.
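To see why compositing during the shoot saves a post-production pass, consider the simplest possible version of the idea: keying the live foreground over a pre-rendered background, frame by frame. The toy sketch below (a deliberately naive green-screen matte, not the actual Mo-Sys VP pipeline) shows the per-frame operation that real-time systems perform at broadcast quality:

```python
import numpy as np

def chroma_key_composite(fg: np.ndarray, bg: np.ndarray,
                         threshold: float = 0.4) -> np.ndarray:
    """Naive green-screen composite: fg and bg are float RGB arrays
    in [0, 1] with identical HxWx3 shapes."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Treat a pixel as 'screen' when green clearly dominates red and blue.
    matte = ((g - np.maximum(r, b)) > threshold)[..., None]
    return np.where(matte, bg, fg)

# Toy frames: a green screen with a red 2x2 'actor', over a blue background.
fg = np.zeros((4, 4, 3)); fg[..., 1] = 1.0; fg[1:3, 1:3] = [1.0, 0.0, 0.0]
bg = np.zeros((4, 4, 3)); bg[..., 2] = 1.0
out = chroma_key_composite(fg, bg)
print(out[0, 0], out[1, 1])  # background where green was; actor preserved
```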

Among the technologies required for real-time virtual production are camera tracking and graphics plug-ins.

Virtual Production – Unreal Engine

The graphics for DAVE School’s Deception trailer were rendered in Unreal Engine (UE). As one of the world’s most advanced graphics engines, UE is regarded as the gold standard across the film and gaming industries.

Image Credit: DAVE School

When used in films, however, UE traditionally only enters the picture during post-production; many virtual studios use lower-quality, proprietary graphics systems during filming before adding UE later on.

Mo-Sys VP helps productions cut out this later editing phase by acting as a direct interface between camera tracking systems and UE. With this technology, producers are able to integrate the power of Unreal Engine into their real-time virtual shots, gaining a better impression of the final product and, again, saving time and money.

In the case of DAVE School’s production of Deception, Mo-Sys VP was paired with our own StarTracker system. Using retro-reflective ‘stars’ attached to the studio ceiling and camera-mounted sensors, StarTracker allows cameras to calculate their absolute position in the studio, which is then mapped into the computer-generated environment. Syncing seamlessly with Mo-Sys VP, StarTracker further enables real-time production techniques and reduces the resources spent on the post-production phase.
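Under the hood, each tracked sample (pan, tilt, roll plus a position) has to become a transform for the engine’s virtual camera every frame. The sketch below shows the basic maths under assumed axis conventions; real engines, Unreal included, each define their own axes and handedness, so treat it as illustrative rather than as the Mo-Sys VP implementation:

```python
import numpy as np

def camera_matrix(pan_deg, tilt_deg, roll_deg, pos_mm):
    """Build a 4x4 camera-to-world transform from tracked pan/tilt/roll
    (degrees) and position (millimetres). Axis order and handedness here
    are illustrative; each engine defines its own conventions."""
    p, t, r = np.radians([pan_deg, tilt_deg, roll_deg])
    # Pan: rotation about the vertical (Y) axis.
    Rp = np.array([[ np.cos(p), 0, np.sin(p)],
                   [ 0,         1, 0        ],
                   [-np.sin(p), 0, np.cos(p)]])
    # Tilt: rotation about the lateral (X) axis.
    Rt = np.array([[1, 0,          0         ],
                   [0, np.cos(t), -np.sin(t)],
                   [0, np.sin(t),  np.cos(t)]])
    # Roll: rotation about the optical (Z) axis.
    Rr = np.array([[np.cos(r), -np.sin(r), 0],
                   [np.sin(r),  np.cos(r), 0],
                   [0,          0,         1]])
    M = np.eye(4)
    M[:3, :3] = Rp @ Rt @ Rr                 # applied roll-first, pan-last
    M[:3, 3] = np.asarray(pos_mm) / 1000.0   # mm -> metres
    return M

print(camera_matrix(90, 0, 0, [0, 1600, -2000]).round(3))
```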

Next-gen virtual film production

DAVE School is dedicated to providing its students with the best preparation to enter the film industry. Whilst drawing on the expertise of its veteran tutors, it also integrates the latest production technologies into its curricula. Nowhere has this been more evident than in the making of Deception. Relying on StarTracker and Mo-Sys VP, students learnt that real-time virtual production not only provides producers with unparalleled creative freedom whilst on set, but also significantly reduces costs and time spent in post-production. 


Mo-Sys is leading the way in camera tracking and virtual production techniques. If your organisation is looking to integrate VR production into its curricula, or you’d like to explore the potential of this technology more generally, please get in touch with our team. 

Product Launch: StarTracker enhanced Panasonic AW-UE150

We are proud to announce the launch of the StarTracker enhanced Panasonic AW-UE150. Rewatch our LiveLab Product Launch below as we discuss the benefits of the enhanced PTZ head.

  • 4K camera, wide-angle lens, PTZ head with absolute camera tracking for AR and virtual studio application
  • No longer restricted to a fixed camera position
  • Be more creative and use the PTZ head on a jib, crane or a dolly for unlimited camera motion
  • Create engaging content with natural depth and changing perspective of the virtual background
  • Supported by all render engines

With insights from Michael Geissler (CEO), Martin Parsley (Lead StarTracker Developer) and Panasonic’s Jaume Miró (Project Manager), the webinar discussed how this innovative collaboration can help you produce immersive and engaging content in any AR or virtual studio application.

Originally planned as an NAB launch, the webinar was instead co-hosted online by Panasonic and Mo-Sys, and showcased the new Panasonic AW-UE150 – a 4K, wide-angle-lens PTZ head with absolute camera tracking for AR and virtual studio applications. With Mo-Sys StarTracker, a venue or event is no longer restricted to a fixed camera position and can be more creative, using the PTZ head on a jib, crane or dolly for unlimited camera motion.

Mo-Sys CEO Michael Geissler said, “Although disappointed that we could not launch at NAB, these webinars are a fantastic opportunity to demonstrate this innovative collaboration, helping a wide range of customers generate immersive and engaging content for AR and virtual studio applications. We are proud of our partnership with Panasonic and excited to showcase what is possible.”

Mo-Sys VP Free and VP Pro Features

With Unreal Engine 4.25 released and available for download, we’ve updated our UE4 plugins to make creating virtual production content easier than ever before.

We now have two plugins available: Mo-Sys VP Free and Mo-Sys VP Pro. Both plugins use LiveLink to stream data from all Mo-Sys robotics and tracking systems directly into the engine. Below you will find the full range of features our plugins offer, so you can choose the most suitable system for your virtual production.

Mo-Sys VP Free is now available on the Unreal Engine Marketplace.

To download a free trial of Mo-Sys VP Pro or for more information please contact sales@mo-sys.com.