VP Pro 5.3.2.2 Minor Release

VP Pro 5.3.2.2 Released

Mo-Sys has today announced the minor release of VP Pro 5.3.2.2.

James Uren, Mo-Sys Technical Director, said: “I’m very excited in this minor release to be opening up our lens library to our VFX and XR VP Pro customers. I hope this will enable VP teams who want to use Unreal native to tweak pretty much any lens and achieve great quality lens distortion. If you can’t find a lens in the library (or one that is similar enough to tweak), do reach out to support@mo-sys.com and we’ll do our best to create a base file for everyone to use.”

Release notes:

  • Entire base lens file library made available to all users (VFX and XR license)
  • Machine-locked licensing (internet not required) (VFX and XR license)
  • External recorder and camera manager improvements (VFX license)
  • Lens Tweaker usability improvements
  • MultiviewXR multi-camera improvements (XR license)
  • Distortion and undistortion ST maps in EXRs (VFX license)
  • NearTime rendering support for UE 5.3 (VFX license)

Download VP Pro 5.3.2.2 now

Mo-Sys Engineering Democratises Virtual Production with Free Lens File Library 

London, UK – January 26, 2024 – Ahead of ISE 2024, Europe’s largest AV and integration show, Mo-Sys Engineering announces a paradigm shift in virtual production (VP) accessibility: the company is transforming its Lens File Library, the foundation for achieving photorealistic results. This move sees lens files becoming freely available to Mo-Sys’ customers and sets the stage for a pay-it-forward VP community in which creators trade tweaked files back in, empowering them with a dynamic knowledge base of free lens files and fostering collaborative advancements.
 
“Accurate lens simulation is the cornerstone of stunning VP,” explains Michael Geissler, CEO of Mo-Sys. “Each lens has its own unique properties, and replicating those within the virtual camera is crucial for believable results. Traditionally, this required starting with a paid library lens file and then painstakingly tweaking it to match the real-world lens being used. Now, we’re removing that barrier to entry.”

By making its Lens File Library freely accessible, Mo-Sys is actively sparking a collaborative VP ecosystem. Creators can now contribute to, and tap into, a growing pool of lens files, eliminating the need to start from scratch. They are also encouraged to return newly tweaked files to the library, nurturing a shared resource and accelerating the democratisation of VP across the industry.

This commitment to community empowerment extends beyond the Lens File Library. The availability of Mo-Sys’ training, remote and on-site support services remains unchanged, ensuring creators have the knowledge and tools to navigate the VP landscape. The company is also further developing its Lens Tweaker tool to drastically streamline the lens-matching process, making it even more accessible and intuitive. 

“ISE 2024 is the perfect platform to unveil this transformative initiative,” adds Geissler. “We’ll be showcasing cutting-edge VP solutions that harness the power of our lens files. We invite everyone to join us at ISE and experience the future of VP, together.” 

To make an appointment to visit Mo-Sys at ISE or to access the Lens File Library, contact your Mo-Sys Sales Manager or make an enquiry at info@mo-sys.com.

ISE 2024 Virtual Production Panel

Don’t miss our ISE panel discussion on the AOTO booth (Hall 3, 3D200) at 2:30pm on Thursday 1st February.


The session will uncover the key players behind a successful Virtual Production, with industry experts exploring the essential workflow components.

Panel host & location:
AOTO, Hall 3 Booth 3D200

Speakers:
Florian Gallier – Mo-Sys
Patrick Goodden – Brompton Technology
Tommi Rosnell – Mediatrade
Rory Fraser-Mackenzie – ETC

Coffeezilla takes a look at StarTracker Max

Coffeezilla is a successful investigative journalist who has amassed an impressive 3.24 million YouTube subscribers. Fans regularly tune in to watch him uncover scams, fraudsters and fake gurus.

Having tried other tracking solutions, Coffeezilla recently turned to Mo-Sys. Take a look at his testimonial to learn about his journey and hear his thoughts about StarTracker Max.

Coffeezilla – Investigative journalist

Check out Coffeezilla on YouTube, and if you are building a YouTube channel and are curious to learn more about Virtual Production and Camera Tracking, please drop us a line at info@mo-sys.com.

Mo-Sys NearTime® delivers cost-efficient VFX for Dr Who

Mo-Sys’ NearTime® delivers cost-efficient VFX solution for Dr Who 60th Anniversary Special

London, UK – 8th December 2023 – Mo-Sys Engineering, a world leader in image robotics, virtual production and remote production solutions, today announced that NearTime, its automated re-rendering service for on-set Virtual Production, has been used to deliver stunning VFX scenes for the Dr Who 60th Anniversary Special at a fraction of the cost of traditional VFX.

Mo-Sys NearTime® delivers cost-efficient VFX solution for Dr Who

Painting Practice, an award-winning boutique studio known for giving filmmakers the freedom to develop their ideas and extend creative possibilities, together with leading CGI, animation and VFX specialists RealTime, approached Mo-Sys in 2022 to jointly develop a new cost-efficient workflow for a particularly VFX-heavy special episode of Dr Who.

James Uren, Mo-Sys Technical Director said: “This ambitious diamond anniversary Dr Who special was set to have more than 250 VFX shots, meaning traditional VFX approaches would be cost-prohibitive. So, we had to think differently.”

Painting Practice use Unreal Engine to create animated pre-visualisations of complex VFX sequences. Could this be extended to whole scenes or a whole episode? Could we take those pre-visualisations and bring them onto set with the real camera? Could we use Unreal Engine right up to the ‘Final Pixel’, automatically re-rendering what had been filmed on set?

RealTime and Mo-Sys had recently collaborated on the Netflix production Dance Monsters, where six cameras and eight monster characters were combined in real time, also using Unreal Engine and Mo-Sys VP Pro, to film it as a live light-entertainment show. As part of this, Mo-Sys and RealTime developed a pipeline for transferring the precision camera and lens tracking data from Mo-Sys StarTracker through to post-production, to help automate and dramatically speed up the VFX workload.

In parallel, and in partnership with AWS, Mo-Sys had built and patented its NearTime solution. NearTime offers a dual workflow that enables automated Unreal re-rendering in the cloud: tracking data is re-rendered with background plates and delivered back at increased quality and/or resolution within the same VFX delivery window, while on-set renders remain available for real-time feedback.

Putting pre-visualisation, on-set camera tracking, real-time pre-viz, NearTime rendering and automated VFX pipelines together meant this VFX-heavy special could be completed with stunning visuals throughout at a fraction of the cost of traditional VFX.

About Mo-Sys Engineering
Mo-Sys Engineering is a world leader in image robotics, virtual production and remote production solutions. The company’s products are used by leading broadcasters, filmmakers, and live event producers around the world. Mo-Sys is headquartered in the UK, with offices in the US, Europe, and Asia.

Mo-Sys and Erizos combine to power NBC Sports broadcast

Mo-Sys and Erizos combine to power interactive virtual table for major NBC Sports broadcast

The Tour de France is one of the world’s most prestigious cycle races with its route crossing France and traversing neighbouring countries. In 2023, the competition attracted a total audience of 42.5 million on French television alone, and millions more around the globe.

NBC immersed its viewers with a new, innovative commentary solution from Erizos. The Virtual Table gave a new perspective, elevated with improved rider stats and analysis through real-time, data-driven graphic animations that conveyed the energy and excitement of the race.

Mo-Sys and Erizos combine to power NBC Sports broadcast

The solution showcased new capabilities of the Virtual Table, now able to deliver content with enhanced speed, precision, and realism while incorporating advanced shading and lighting techniques that parallel the capabilities of Unreal Engine.

Mo-Sys’ VP Pro integrated tracking data from StarTracker with lens distortion and provided Fill and Key signals for external compositing in the studio’s switcher.

Presenters in the studio using the Erizos Virtual Table during commentary on the Tour de France

Gaussian Splat: A New Era for Virtual Production?

Mo-Sys Engineering, a leader in virtual production innovation, is experimenting with Gaussian Splat, a revolutionary scene creation technique that has the potential to transform the way photorealistic content is produced.

Virtual production is entering an exciting new phase, where the creation of production-ready photorealistic 3D content is being radically simplified.

Previously, photogrammetry looked to have solved this problem but was ultimately found to be too complex. Then came NeRF, a smarter way of generating 3D environments from video footage with the help of AI. This technique did simplify the capture process but introduced impractical render delays.

Gaussian Splat enables a new, simple and rapid scene capture process without AI, and without the performance-sapping issues associated with NeRF.

Born out of a European Research Council-funded project and pioneered by Bernhard Kerbl and Georgios Kopanas of Université Côte d’Azur, Gaussian Splat is making it possible to create photorealistic scenes quickly from a variety of sources, including drone footage, handheld cameras, and even smartphones.
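
For context, the published 3D Gaussian Splatting technique (Kerbl et al., SIGGRAPH 2023) represents a captured scene as millions of coloured 3D Gaussians and renders each pixel by alpha-blending the Gaussians that overlap it, sorted front to back. A minimal sketch of that blending step, taken from the published paper rather than from any Mo-Sys implementation, is:

% Per-pixel colour C from depth-sorted Gaussians i = 1..N (front to back),
% where c_i is the Gaussian's view-dependent colour (from spherical harmonics)
% and \alpha_i is its learned opacity weighted by the projected 2D Gaussian at the pixel:
C = \sum_{i=1}^{N} c_i \, \alpha_i \prod_{j=1}^{i-1} \left(1 - \alpha_j\right)

Because rendering is a sort-and-composite over these Gaussians rather than a neural network evaluation, scenes can be displayed at real-time rates, which is what makes the technique attractive for bringing captured environments into engines such as Unreal.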

Mo-Sys is now experimenting with Gaussian Splat and has already found several ways of implementing it for use inside Unreal Engine. This is the beginning of an exciting journey, and Mo-Sys is committed to sharing details of its learning.

Take 2 Studios select StarTracker Max

Take 2 Studios select StarTracker Max for new, state-of-the-art virtual studios

Salzburg, Austria – 14th October 2023 – Take 2 Studios select StarTracker Max for new, state-of-the-art virtual studios. Located in the heart of Europe, Take 2 Studios’ objective is to provide clients with an agile, cost-effective, reliable and sustainable virtual production environment.

Mo-Sys supports Take 2 Studios during grand opening event, Salzburg, Austria

Founded by sibling team Viktoria and Felix Brandstetter, who share a deep passion for creative, innovative and collaborative working, Take 2 Studios has already completed its first feature-length TV commercial project for one of the world’s leading automotive brands. The team has also quickly established a strong pipeline of opportunities underlined by a successful grand opening which attracted global interest.

Viktoria Brandstetter explained: “I’ve always been involved with graphic design and marketing. Felix started out as an audio engineer before a love of technology and storytelling drew him to film, and ultimately Virtual Production. We regularly found ourselves collaborating on projects and always wanted to work together in a bigger way. We’re excited to combine our strengths, learn and grow together with our team, while delivering amazing projects for our clients.”

Take 2 Studios co-founder Viktoria Brandstetter speaks with Mo-Sys’ Marketing Director, Stephen Gallagher


StarTracker Max sits at the core of Take 2 and is instrumental in bringing clients’ virtual production ideas to fruition. Studio 1 features an 8m x 3m curved LED volume while Studio 2 offers a large cyclorama wall.

“We are committed to providing clients with the best possible experience,” added Felix Brandstetter, co-founder of Take 2 Studios. “An important step to delivering this was the careful selection of equipment. We simply couldn’t afford to make mistakes, so took our time in researching solutions and meeting manufacturers. These relationships matter, and we quickly felt a strong connection with Mo-Sys, so camera tracking became one of the easiest choices.

“With its new browser interface, StarTracker Max was super easy to set up and has performed brilliantly ever since. Our engineers love the rock-solid performance, compact design and the fact that there is no external PC.

“From a business perspective, we will be able to access extra features soon, and don’t need to re-home or regularly tweak calibration. That means maximum studio uptime, happy clients and a future-proof investment.”

Philippe Vignal shows StarTracker Max during Take 2 Studios opening event, Salzburg, Austria


There is also a profound commitment to sustainability running through Take 2 Studios. This has influenced everything from marketing activities to investment in smart power management solutions, the use of solar power and more.


Luiza Maddalozzo, Head of Sustainability, added: “We wanted to do more and go beyond the well-documented eco benefits of Virtual Production, such as cutting CO2 with reduced travel – although that is a major plus for VP. We’ve gone further and have, for example, created a partnership with a local relocation and second-hand goods organisation that can provide props, creating a reuse network. So, when we need a washing machine or a sofa, we bring it in temporarily, and afterwards it goes back to the retailer.”

Take 2 Studios select StarTracker Max

Mo-Sys Collaborates with NVIDIA

Mo-Sys Collaborates with NVIDIA to Reveal Next-Generation Broadcast Technologies at IBC 2023

LONDON  – 13th September 2023 – Mo-Sys today announced a collaboration with NVIDIA to drive the next generation of broadcast technologies for virtual production and extended reality (XR) applications.

The collaboration will strengthen Mo-Sys’ successful VP Pro XR virtual studio product offering with the introduction of MultiViewXR while accelerating the development of its worldwide remote production solution, TimeCam.

Using Mo-Sys’ patent-pending machine learning, MultiViewXR, powered by NVIDIA RTX™ 6000 Ada Generation GPUs, provides a complete multi-camera switching solution for LED virtual studios. It automatically keys, recomposites and generates real-time, AI-assisted director previews of off-air cameras.

NVIDIA Mo-Sys IBC 2023

The system has been designed to remove complexity from the technology and leave creatives free to focus on what they do best, with full control of shutter speed and exposure, as well as complete freedom of choice of camera, LED controller and panels.

Michael Geissler, CEO of Mo-Sys Engineering, said, “Our innovative approach and engineering expertise allow us to create a new generation of tools that will define the future of broadcasting and filmmaking with the help of real-time AI inference on NVIDIA GPUs.”

For the first time, virtual studios can access an affordable multi-camera switching solution that also provides directors with the correct perspective view from off-air cameras before switching within a limitless XR environment.

“NVIDIA helped us hugely with improving the pipeline and performance of MultiViewXR,” said James Uren, Mo-Sys Technical Director. “We are only scratching the surface of what’s possible with NVIDIA RTX, and will continue collaborating to accelerate our machine learning pipeline and AI-based features.”

Mo-Sys will demonstrate MultiViewXR together with Sony FR7s from stand 7.C16 during IBC 2023. The system will also be featured on the LG booth, with the ability to switch between four RED Komodo cameras with four Fujinon lenses, tracked by the new StarTracker Max.

A new version of Mo-Sys’ patented TimeCam technology will also be shown at IBC. TimeCam allows camera control from one point to another anywhere in the world with no perceptible delay, thanks to Mo-Sys’ patented delay compensation. TimeCam is now compatible with SMPTE ST 2110 workflows, with NMOS support for standardised discovery and routing.

Mo-Sys TimeCam

“By working with NVIDIA, our combined expertise allows us to provide the most advanced solutions for film and broadcast,” said Florian Gallier, Strategic Partnerships Manager at Mo-Sys. “We are proud to have pushed the development of TimeCam, and we are compatible with full IP workflows and SMPTE ST 2110 using NVIDIA Rivermax, NVIDIA RTX 6000 Ada and NVIDIA BlueField DPUs.”

Gallier continued, “These new workflows will leverage the latest 8K sensors through our partnership with RED Digital Cinema. TimeCam is compatible with RED Connect, which decompresses the camera’s 8K stream using NVIDIA CUDA cores into full-fidelity images used by TimeCam to compensate for latency in remote camera operations.”

TimeCam integrates with RED Connect, with SMPTE ST 2110 based on NVIDIA Rivermax, and with RED’s V-RAPTOR camera streaming 8K video over IP directly out of the camera through fibre. This allows 4K delay compensation by cropping in real time within the 8K frame while combining it with the robotic head on the other side of the globe. This patented technology allows real-time worldwide camera control with no perceptible delay.

TimeCam will soon be available on NVIDIA’s software-defined platform for building and deploying media applications. It will be demoed at the Mo-Sys booth at IBC in collaboration with NVIDIA and RED Digital Cinema.

About Mo-Sys Engineering

Mo-Sys Engineering is a world leader in image robotics, virtual production and remote production solutions. The company’s products are used by leading broadcasters, filmmakers, and live event producers around the world. Mo-Sys is headquartered in the UK, with offices in the US, Europe, and Asia.