Mo-Sys demonstrates Cinematic XR Focus with PRG

Recently, Mo-Sys joined forces with PRG for an exclusive evening mixer at their newly renovated Virtual Production Studio in Los Angeles.

As a partner of the event, Mo-Sys brought some of its Virtual Production toolset for a VP demonstration geared toward a DP audience. Tom Shelburne, Executive Vice President of Sales & Operations – Americas, demonstrated Mo-Sys’ Cinematic XR Focus and NearTime re-rendering in the LED volume, along with the Mo-Sys U50 heavy-duty remote head.

  • Cinematic XR Focus allows a seamless focus pull from real foreground objects to virtual assets placed deep within the scene, which appear to be positioned behind the LED wall. This is achieved using the same wireless lens control systems already in use today.
  • NearTime is Mo-Sys’ patent-pending, HPA Engineering award-winning solution to the image-quality limitations of real-time VFX in virtual production.

PRG selected partners for the event that integrated well with the existing technology in their studio. Mo-Sys’ Virtual Production solutions are designed specifically for final pixel XR shooting and pair perfectly with PRG’s Enhanced Environments.

See the demonstration below:

‘I love the industry’s appetite for change’

Paul Clennell, Chief Technology Officer at dock 10, discusses recent changes in the media tech industry with TVB Europe for their July/August issue. He also shares his view on the biggest topic of discussion in his area of the industry, which has centred mainly on Virtual Production.

We’re very focused on the next generation of virtual production, using the latest real-time games-engine technology from Epic and Unreal Engine with Zero Density, together with the Mo-Sys StarTracker system, and encouraging entertainment formats to adopt these exciting technologies more widely over the next few years. There is a continued pressure to reduce budgets, but channels still demand new and exciting content, and this is where virtual studio elements can make a creative difference.

Paul Clennell, Chief Technology Officer, dock 10.

Read the full article here (Pg 34) >

Virtual graphics for world-class horse racing coverage

Woodbine Entertainment has recently gone on air with RT Software’s Tactic Live for their live world-class horse racing coverage. They are now able to show a range of virtual graphics which are rendered live over their broadcasts.

The new live virtual graphics include furlong distance markers and sponsors’ logos, added to enhance the action for Woodbine’s large and dedicated audience around the world.

Andrew Barnhardt is Woodbine’s Director of Technology Operations, and was heavily involved in the project. He said, “We’re always looking for innovative new ways to showcase our sport, and we have a history of developing new technologies and introducing to our sport existing technology used elsewhere to enhance our productions. We tried out several different broadcast virtual graphics products over the years, but they could not deliver the quality and consistency that we needed. We had been talking to RT Software for a while, and we were impressed with the performance of their solution. They were focused on demonstrating their value to us and ultimately we selected them as a partner, and we’re happy that we did.”

“We have some unique technical challenges when it comes to adding virtual graphics over our live video; a large outdoor field of play, frequent and large variances in light intensity and shadows, four seasons of weather (including snow and fog) and fast-moving subjects which are often the same colour as the surface they are running on. RT Software impressed us with how well their system coped with all of these challenges. Tactic Live delivered what they said it would and we are very pleased with the results.”

“No system exists in isolation, so it was important that RT Software worked well with our existing camera heads and that’s why we brought in the Mo-Sys camera tracking system, which has worked flawlessly. The relationship and integration between all three parties went very smoothly. Setting the system up went well, and we had great support from RT Software whenever we needed it, both hands-on and remotely.”

Since the broadcasts went live, Andrew says they have had lots of comments: “It’s been great to receive so much immediate, positive feedback from our peers in the industry. They all have similar challenges and so they were keen to know who provided the technology. We were happy to recommend RT Software to them. But perhaps more importantly, it’s the reaction of our audience that matters. Especially for newer fans of the sport, the graphics provide an extra piece of data and insight, which fans are used to seeing across other professional sports. They love it!” The project has gone so well for Woodbine that Andrew says they are now considering other possibilities.

Andrew continues, “Next we are working on other ways that we could use this technology in our productions. We’re also considering further expansion across both of our facilities and adding graphics synchronised across multiple cameras.”

“RT Software has other broadcast graphics products that we are interested in. Their focus on both value and quality certainly opens up more options for us. We are looking forward to a long and fruitful partnership with RT Software as we continue to introduce innovative solutions to the sport.”

Mike Fredriksen, Commercial Director at RT Software, said, “It has been a great privilege to work with such a prestigious company as Woodbine Entertainment. We know our technology has significant advantages compared to some other broadcast graphics vendors, so it is very satisfying to receive Woodbine’s validation of our approach. We look forward to other exciting opportunities to work with Andrew and the Woodbine team.”

Mo-Sys Academy Students Set for Promising Careers

Mo-Sys Academy students set for promising careers as short film wins multiple awards

Three talented Mo-Sys Academy students look set for promising careers as their short film, Balance, wins multiple awards at the University of Greenwich’s BAFTA-styled film and television GRAFTAs.

Mo-Sys Academy invited students from universities to attend the Virtual Production Practical Summer School 2021, an intensive six-week course designed to introduce students to virtual production through hands-on practical learning with the latest technology.

Set on the Mars Desert Research Station in Utah, Balance follows the journey of trainee astronaut Ben, who is working to complete his training and follow in his father’s footsteps. The production used a mix of traditional and virtual production, with the team utilising Mo-Sys Academy studios to film the virtual scenes in just two days.

I massively enjoyed working on Balance since I was able to explore the process of virtual production in detail with a crew I trusted. As a cinematographer there are a lot of new things you may need to know to prepare yourself for the shoot. Virtual production allows you to set up a scene in the virtual environment even before getting to the studio. All that’s left during the production process is trying to imitate that light in front of the green screen. The experience I gained during the Mo-Sys Academy training was crucial when working on Balance. When making the shot list I already knew what things would look like and which scenes would be more complicated to film than others, thereby saving time on set.

Director of Photography, Emils Lisovskis

Mo-Sys Academy is led by Juliette Thymi, a senior VP Technician and experienced Virtual Production Producer who has worked on projects for Netflix, ITV and the BBC. The Academy aims to build students’ confidence and provide valuable experience, priding itself on a friendly, collaborative learning environment for all skill levels.

During their time with Mo-Sys Academy the team worked through a practical exercise set on a desert alien planet, which would become the forerunner to ‘Balance’. Judged by a panel of recognised industry professionals, Balance picked up nominations in every category, winning four: Best Director, Best Producer, Best Production Design and Best Sound Design.

We are incredibly proud to see three former Academy students following a path of Virtual Production and winning awards so quickly. I am confident the team have fantastic careers ahead of them.

Juliette Thymi, Mo-Sys Academy

Since attending the Academy, the team have stayed in touch, not just on ‘Balance’ but to bring more ideas and collaborate with Mo-Sys on projects like the TEDx University of Greenwich VP event.

Mo-Sys Academy Students collect GRAFTA Awards for Balance

When I got accepted at Mo-Sys Academy 2021, I knew there and then that I would use Virtual Production to create my film. During the two-week course I learned all I could so that I would be able to tell a story set in an exotic environment. Furthermore, the short film we made at the end of the Academy served as a great proof of concept for how Balance could turn out.

We had little experience in VP apart from the Mo-Sys Academy, but when I created the story I had in mind that I would like to mix traditional filmmaking with virtual production, so I could benefit from the best of both worlds. I planned on using VP to recreate the desert; for the interiors I planned on using real locations. This created the basis and offered us familiar territory to play with, considering that in the Academy exercise our story took place on a desert alien planet – basically using the exercise and what we learned in our favour, so that this time we could make it bigger and better.

It was an overwhelming feeling finding out that our film was nominated for all categories at the GRAFTA awards. I have to give credit where credit is due, I wouldn’t have been able to achieve the film Balance or produce the TEDx University of Greenwich VP event without the help and knowledge of my crew members Emils Lisovskis and Eduard Fadgyas. Also, none of these projects would have been possible without the teachings from the Mo-Sys Academy, led by Juliette Thymi and Dom Smith. Thanks guys!

Director/VP Supervisor & Producer, Jean Ciuntu

Mo-Sys Academy has announced new course dates in the UK and Los Angeles. Spaces are limited and demand is expected to be high. Visit Mo-Sys Academy for more information and to book your place.

Mo-Sys Hosts Regional Virtual Production Learning Zone at MediaCity

Mo-Sys ran the successful Virtual Production Learning Zone at the KitPlus Show, MediaCity, on Thursday 23 June 2022. The event underscored Mo-Sys’ deep commitment to sharing knowledge, collaborating with universities and training the next generation of industry professionals.

Led by Juliette Thymi, Mo-Sys Academy’s senior VP Technician and an experienced Virtual Production Producer who has worked on projects for Netflix, ITV and the BBC, such as Strictly Come Dancing, the free taster sessions gave visitors insight into the technology and techniques used in Virtual Production, while highlighting a proven development path for those who’d like to learn more.

The full range of innovative Mo-Sys VP solutions were on display, including StarTracker, the industry standard for precision camera tracking; VP Pro XR, a dedicated XR server that has been built specifically for Cinematic XR on-set real-time production; the unique Cinematic XR Focus feature that enables pulling focus from talent to virtual objects deep within an LED volume; and Multi-Cam for seamless multi-camera switching.

Nicholas Glean, Senior Lecturer in Video and New Media at the University of Sunderland, recently completed the Mo-Sys Academy 10-day Foundation Course and commented on the importance of engagement between industry and the education sector:

Nicholas Glean – University of Sunderland – The importance of Virtual Production

Mo-Sys is actively driving engagement with universities such as Sunderland, Greenwich and Ravensbourne to ensure we have the right skills coming through to meet the surging demand for VP, and to offer the next generation access to exciting career opportunities.

Mo-Sys Academy with Virtual Production Training at KitPlus Show MediaCity 2022

During the show, Mo-Sys hosted a panel discussion titled The Future of Virtual Production Training, featuring Kieran Phillips of CJP Broadcast Service Solutions, Nicholas Glean from University of Sunderland, and Adam Soan of Bendac. The session provided insight into overcoming the skills gap and maximizing the opportunities of LED Virtual Production for Broadcast. Watch the seminar in full below:

The Future of Virtual Production Training – KitPlus Show MediaCity 2022

Alistair Davidson and a team from Scan Computers attended the workshop and said it was “a really fantastic insight for anyone who’s interested in Virtual Production.”

Scan Computers talks about the Virtual Production training workshop at KitPlus Show

Rizwan Wadan from Pixeleyed Pictures summarised the KitPlus event, adding: “These sorts of events are amazing. They help you understand what is going on and, whether you’re a student, lecturer or professional, I’d highly recommend that you attend.”

Rizwan Wadan from Pixeleyed Pictures gives some KitPlus Show feedback!

Mo-Sys Academy has announced new course dates in the UK and Los Angeles. Spaces are limited and demand is expected to be high. Visit Mo-Sys Academy for more information and to book your place.

What is Virtual Production?

What is Virtual Production, and why would I use it?

Mo-Sys Engineering’s Philippe Vignal uncovers the exciting opportunities and what brands need to understand about virtual production.

What is Virtual Production and Why would I use it?

For decades we’ve been shooting talent or products in green screen studios and adding (compositing) photo-realistic 3D graphics in post-production to finalise the shot. This is the foundation of a visual effects (VFX) shot. It was, and still is, very much a two-stage process. Virtual production in most cases makes this a one-stage process, by enabling the VFX shot to be captured on set, in camera, in real time. That saves time and money and is much more efficient, but it requires more preparation.

The enablers for virtual production are real-time photo-realistic graphics engines (e.g. Epic Games’ Unreal Engine, or Unity’s 3D engine) and precision camera and lens tracking. Using these technologies means that we can put talent or products into any photo-realistic environment we like, whenever we like, and even change the environment at the press of a button.

Here’s the bottom line. Brands, both big and small, are currently looking at ways to reduce cost and increase the speed at which they produce content, because audiences have become insatiable and need to be fed on a daily basis.

Flying all over the world to different sets across numerous locations and using a plethora of resources as well as a large roster of talent is no longer cutting it. The majority of brands want to move away from this model. And it’s virtual production tools that are paving the way.

Technology advancements mean that we are now able to immerse talent in previously impossible environments, or environments that would be very, very expensive to build as physical sets. And we can achieve this seamlessly, and very cost-effectively, through virtual production.

No longer tied to the laws of physics, you don’t have to chase that golden hour any more for the perfect shot. You can have that golden hour last all day if you need it to. You can have true flexibility.

In traditional VFX production, you create your finished 3D scenes or assets ready for post-production. In virtual production your finished 3D assets are created in pre-production, ready to be used in full quality for production, meaning that post production compositing is either removed completely or significantly reduced and optimised. The net gain is that the overall time used to create the finished commercial is reduced.

Above all, virtual production is a craft that allows brands to tell their stories in new and enticing ways. And as more and more creators adopt the technology, now is the time to get to grips with what it entails and understand exactly how you can harness this tool as a brand.

Bring the Set to Life with LED Volumes

Virtual production is all about real-time VFX, and this can be done using a green or blue screen studio, with no screen (for augmented reality), or by using an LED volume, which is currently all the rage across the feature film and TV sectors.

Shooting in an LED volume involves capturing in-camera visual effects (ICVFX) shots of talent or products against photo-realistic 3D graphics displayed on the LED wall. The virtual graphics on the wall change accurately in real time, providing the correct perspective view to match the tracked camera and its lens settings.
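To make that perspective matching concrete, here is a minimal sketch (not Mo-Sys code, and much simplified compared with a production renderer) of the underlying idea: given the tracked camera position, compute an asymmetric "off-axis" view frustum whose edges pass through the corners of the physical wall, so the wall displays the scene exactly as that camera would see it.

```python
def off_axis_frustum(cam, wall_left, wall_right, wall_bottom, wall_top,
                     wall_z, near):
    """Compute asymmetric frustum extents for a camera looking at a wall.

    `cam` is the tracked camera position (x, y, z). The LED wall is modelled
    as an axis-aligned rectangle at depth `wall_z` in front of the camera.
    Returns (l, r, b, t): the frustum extents at distance `near`, in the
    form an OpenGL-style glFrustum call would take. As the camera moves
    sideways, the frustum becomes asymmetric and the rendered perspective
    shifts accordingly.
    """
    cx, cy, cz = cam
    dist = wall_z - cz          # camera-to-wall distance
    scale = near / dist         # similar triangles: map wall extents to near plane
    l = (wall_left - cx) * scale
    r = (wall_right - cx) * scale
    b = (wall_bottom - cy) * scale
    t = (wall_top - cy) * scale
    return l, r, b, t


# Camera centred on a 4m x 2m wall, 4m away: symmetric frustum.
print(off_axis_frustum((0, 0, 0), -2, 2, -1, 1, 4, 1.0))
# Camera moved 1m to the right: the frustum skews left to compensate.
print(off_axis_frustum((1, 0, 0), -2, 2, -1, 1, 4, 1.0))
```

In a real volume this calculation also folds in lens distortion, wall curvature and genlock timing, but the core geometry is this simple projection.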

Everyone is asking whether they should be shooting in an LED volume instead of on location. But we must be careful – it is not a cure-all.

Sometimes a location shoot is the right solution. Sometimes an LED volume is. Sometimes you’ll need to combine an LED volume shoot with a green screen shoot. So brand marketers need to work with content production facilities that can offer that full set of tools, and the knowledge that comes with using those tools properly for virtual storytelling.

When you’re shooting virtual content on an LED volume, there’s always a compromise between the complexity of the virtual scene and the power of the workstations and graphics cards that have to render it in real time, so that the camera can capture the VFX shot in-camera.

Graphics cards can only calculate so much in real time, so to maintain real-time playback frame rates on set you sometimes need to compromise by reducing the quality of your virtual elements. The background elements you are immersing your talent into therefore aren’t as high quality as you would want them to be – which is exactly what our patented NearTime rendering system is designed to solve.

During a real-time virtual production shoot with NearTime enabled (LED or green screen), as soon as a take starts, the camera tracking data and lens data are automatically sent to the cloud where the same virtual scene exists on multiple servers.

Using the tracking data, and with all the quality dials now turned up, the virtual scene is re-rendered and returned to set, or to wherever shots are stored. There it is combined with the separated talent or product, and the composite shot is made ready for approval. The whole process is completely automated, and re-rendering costs are minimal. Importantly, NearTime enables a real-time virtual production shoot to be delivered, with higher quality images, within the same time window.
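The data flow described above can be sketched in a few lines. This is a hypothetical illustration only (the class and function names are invented, not the NearTime API): per-frame camera and lens tracking is recorded during the take, then replayed through a higher-quality offline render.

```python
from dataclasses import dataclass


@dataclass
class TrackingSample:
    """One frame of recorded camera/lens state (fields are illustrative)."""
    frame: int
    position: tuple     # camera x, y, z in metres
    rotation: tuple     # pan, tilt, roll in degrees
    focus_m: float      # lens focus distance
    zoom_mm: float      # lens focal length


def record_take(samples):
    """Stand-in for the on-set recorder: store samples in frame order."""
    return sorted(samples, key=lambda s: s.frame)


def rerender_take(samples, render_frame):
    """Replay recorded tracking through a high-quality renderer.

    `render_frame` stands in for whatever renderer the cloud side runs,
    with quality settings turned up; here it is just a callable that
    takes one TrackingSample and returns a rendered frame.
    """
    return [render_frame(s) for s in samples]
```

The point of the sketch is that the re-render needs nothing from set except this lightweight tracking stream, which is why it can run automatically in the cloud while the shoot continues.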

Shoot for Global Markets from Just One Location

One of the biggest benefits of virtual production is that you don’t have to be on location to shoot content. Not only that, but you can very easily change your virtual background and use Unreal-based 3D elements to help with versioning work.

Take brands such as P&G or Unilever, which often need to create large numbers of versions of one piece of content. With virtual production, they can shoot content for their European markets against one background, then change the location and product label virtually to shoot for their Asian and American markets – all from one place during the same shoot.

You can also change the actual product – the packshot – and map on different local labelling, perhaps using the same actor to do multiple language versions through artificial intelligence.

Suit All Budgets

With developments in technology, virtual production is now more accessible than ever. Certain phones can now LiDAR-scan real objects for import into a 3D environment. Only a few years ago this required very specialised tools, but now it’s contained inside a handheld device. This accessibility has brought costs down noticeably.

This is also true of software. Tools that used to be reserved for the gaming industry – Unreal and Unity game engines, for example – have become ubiquitous in advertising, feature and broadcast content production.

You can now also generate your own avatar to be immersed into virtual environments and interact with real elements. This would have taken a lot of production work a few years ago, and now it is much more readily accessible to a wider audience.

With the onset of this advanced technology packed into handheld devices, creators like bloggers have even started to adopt virtual production to combine with real-time motion capture. This is made possible through products like Mo-Sys’ StarTracker Studio, the world’s first all-in-one virtual production system for green screen production which comes with all the necessary cameras, trackers, software, and hardware to do virtual production, even where studio space is limited.

So, How Should A Brand Prepare for A Virtual Production Shoot?

At the base of all brand communication is creativity. And at the base of all virtual production is creativity. That will never change. Virtual production is not a cure-all or a way to cut corners. If your content is devoid of creativity, it will be as barren in a virtual production environment as it will be in a live production environment. So you need creative people on your team who are technologically versed in Unreal and who understand production.

We are now seeing an increasing presence of hybrid profiles – creative people who also have technical knowledge. They understand Unreal and how it can be used in a production environment. They understand film or broadcast. They need to be curious enough to have trained in film and storytelling in these sorts of environments.

It is also important to remember that virtual production is a team sport. You cannot achieve it on your own, just as you cannot achieve live production on your own. It is a craft. Brands need people that understand virtual production in the same way as they need people on their teams that understand digital marketing. They both address the increased speed of content production needed.

Work with highly trained, highly skilled, curious individuals that can help you understand what’s entailed and who will help you liaise with either your own production departments in-house, or with your production and creative agencies.

Whatever you do, be well informed. Do not rely blindly on external knowledge. Everyone needs to be well informed of what the technical requirements are with virtual production. Work with consultants such as the ones we have at Mo-Sys who can help you navigate this new world of storytelling and technologies.

We are seeing a rapid adoption of virtual production tools. In London alone, there are over a dozen large LED volumes currently being built. We’re going to see increased use of virtual characters, avatars, AR content, entertainment, and hybrid use of real-life humans interacting with virtual humans.

People are curious by nature, and we will always try to explore new things. Right now we’re fascinated with being able to reproduce ourselves as meta humans in the virtual world. Where that’s going to take us I don’t know, but I do hope we keep our common sense whilst doing this and don’t forget our humanity – the real purpose of all this is to connect with other humans, not to isolate ourselves from the rest of humanity.

Curiosity combined with storytelling will be the focus over the next thousand years or so.

Read the full article published by Little Black Book March 31, 2022

Mo-Sys offers core VP Pro features in new ‘Free for Life’ download

Mo-Sys VP Pro

Mo-Sys VP Pro is the complete solution for Virtual Production with a full suite of creative tools which embed directly in the Unreal Engine for augmented graphics, virtual sets and mixed reality. It is a versatile Virtual Production solution that is utilized by a growing pool of filmmakers, high-end broadcasters, and live event media companies. VP Pro is a real-time compositor, synchronizer, keyer and recorder (for video and tracking data), that uses real-time camera and lens tracking data to create all types of Virtual Production content.

Mo-Sys VP Pro captures live tracking data from all Mo-Sys robotic and tracking hardware such as StarTracker and RoboJib.

It has a clever set of tools that simplify and enhance virtual production workflows, making it simpler to achieve the virtual production illusion you’re after.

Strictly Come Dancing Mo-Sys
Mo-Sys StarTracker and VP Pro “perfect combination” for Strictly

“We found that Mo-Sys’ StarTracker technology and VP Pro plug-in were the perfect combination of hardware and software to bring AR to life on Strictly Come Dancing.”

David Newton, Managing Director of Potion Pictures

Core features of VP Pro are now available via free download. Enhance your virtual production workflow today.