Mo-Sys and GMS International ink partnership deal

The two companies will meet the surging demand for Virtual Production technology in South Korea.

Mo-Sys Engineering today announces that it has signed a partnership with South Korean dealer GMS International to make its cutting-edge augmented reality and Virtual Production solutions available to cinematographers and broadcasters in the country. The agreement will help to meet growing demand for Virtual Production in Korea’s dynamic media market, one of the leading early adopters of Virtual Production. 

GMS International has over 20 years’ experience of supporting broadcasters and media organisations in the South Korean market from its headquarters on the outskirts of Seoul. The GMS team has worked extensively with some of the major names in Korean media and delivers consultancy, installation, and system design services.

Mo-Sys has been a driving force in the use of real-time augmented reality and virtual studios in broadcast, and in the growing use of LED walls and LED volumes for movie production and live events. Its unrivalled StarTracker system embodies Mo-Sys’ expertise in camera robotics, and the company delivers complete end-to-end systems through its VP Pro XR media server.

“South Korea is recognised as one of the markets driving forward Virtual Production adoption and, as a pioneer in this area, Mo-Sys has the tools and technology to meet the rise in demand,” said Michael Geissler, CEO of Mo-Sys. “The combination of GMS’ knowledge of the market and our own expertise in Virtual Production makes for a strong synergy, and together we can match the needs of customers to the right tools that allow them to create amazing content in new and exciting ways.”

The agreement with GMS will see Mo-Sys StarTracker and the VP Pro XR system made available to customers in Korea. Mo-Sys technology integrates seamlessly with major LED volume providers, including ALFALITE, and addresses some of the major challenges that arise with live Virtual Production and LED volume integration.

Hyungjun Kim, CEO of GMS International, commented: “Mo-Sys’ unique technology and solutions meet the Virtual Production needs of the high-end content producers here in South Korea better than any other provider, allowing customers to express their creativity with no constraints. We look forward to a successful and fruitful partnership with them.”

CJP Commences Technology Refresh for University of Sunderland Media School

CJP Broadcast Service Solutions, a leading provider of products and services for the broadcast and wider television industry, announces the start of a major project for the University of Sunderland.


“This latest venture comprises four elements,” comments CJP Broadcast founder and Managing Director Chris Phillips. “Each element will enable the Faculty of Arts & Creative Industries to teach the very latest techniques in virtual production, TV production and outside broadcasting.

“The first project element will be the provision of a curved LED volume incorporating the latest Mo-Sys LED technology and Bendac InfiLED 1.9 mm LED panels. The faculty will use a Mo-Sys VP Pro XR LED content server to drive the LED volume, featuring seamless Multi-Camera switching, Cinematic XR Focus for pulling focus between real and virtual objects, and Set Extensions. Tracking will be provided by a Mo-Sys StarTracker.

“The second project will be the upgrade of an existing TV studio. It was important for the faculty to upgrade its legacy production infrastructure to full broadcast quality. We recommended and will be implementing a full Ross Video workflow, ready for 4K-UHD expansion.

“Third will be the integration of a motion capture and virtual camera system into the 4K-UHD chromakey studio which we installed last year. This presents an opportunity to incorporate new solutions that further enhance the film and television course modules. It will include areas such as virtual cinematography linked with the Mo-Sys VP Pro Unreal Engine plugin.

“The fourth element of this group will be a mobile production kit based on a StreamStar X4 capable of accommodating up to four camera channels plus a wireless camera link, LiveU LU300 field unit and Hollyland wireless intercom. This will enable students to capture live events on the fly, with integration back to studio.”

“CJP made a great job of the 4K-UHD virtual studio which we commissioned in 2021 for the David Puttnam Media Centre on the Sir Tom Cowie Campus, St Peter’s,” adds Craig Moore, Senior TV Technician. “The CJP team were the logical choice for this next phase in the modernisation of our creative and technical resources. Chromakey has long been one of the most powerful tools available to film and television producers. We are also investing in one of the largest EDU LED stages in the UK. This will incorporate the very latest technology and workflows for virtual production, enabling our students to gain the knowledge and practical experience of new concepts that will become industry standard. The system CJP has recommended and is providing will equip DPMC students with a true state-of-the-art solution which will open creative opportunities limited only by their imagination.”


“It is important in the creative industries to ensure that our students get access to the technologies that are current and the technological changes that will influence the sector into the future,” states Professor Arabella Plouviez SFHEA, Academic Dean, Faculty of Arts & Creative Industries. “With this further investment in our virtual production studios, we will be able to ensure that our students have hands-on experience and also get to use their creative skills to challenge and push the technology. This investment provides exciting opportunities to bring together students with different skillsets – from TV, film, sound, photography, animation, performance and design as well as students from sciences such as technology and computing.”

“Through virtual production, the converging worlds of film, TV, games, animation and VFX are changing traditional film and television practices,” says Nicholas Glean, University of Sunderland Senior Lecturer in Video & New Media. “The new technological tools and skills needed for virtual production are also challenging traditional film and media production pedagogy. CJP is collaborating with us to navigate and integrate these new skills and tools into our programmes so that we can instruct a new generation of filmmakers. We are happy and excited to be working with them.”

“In addition to the major investment in virtual production, further investment in outside broadcasting equipment, studio cameras and an extensive refit of the vision gallery is fantastic news for our students as it enables them to use industry-standard equipment and learn a variety of new production processes that will place them at the forefront of a number of cutting-edge technologies which are now being used in high-profile productions such as The Mandalorian and the forthcoming series of Doctor Who,” summarises Sue Perryman SFHEA, Senior Lecturer. “This additional investment in outside broadcast technologies also means that our students can work on live, real-time productions, both inside and outside the TV studio, such as music, sport, dance, and performance. These opportunities will further develop students’ creativity as they gain the vital skills needed to work with new state-of-the-art production processes that are revolutionising TV production around the world. I, for one, cannot wait!”

The project is scheduled for completion in Q3 2022.

Mo-Sys demonstrates LED Virtual Production at MPTS 2022 

Mo-Sys Engineering will highlight how its comprehensive Virtual Production technology stack can benefit broadcasters and news providers at the Media & Production Technology Show (MPTS) 2022, where it is co-exhibiting with its partner, CJP Broadcast Service Solutions. Specialising in Virtual Production system integration, CJP Broadcast provides turnkey solutions backed by industry-leading technical support. 


The Mo-Sys StarTracker camera/lens tracking system, now the technology of choice for leading-edge high-end TV, broadcast, and film Virtual Productions, will be shown working in tandem with Bendac Group’s InfiLED LED displays and LED video processing technology from Brompton Technology.

“The benefits that come from integrating Virtual Production tools and techniques into broadcast environments are huge and we are only just scratching the surface,” said Michael Geissler, CEO of Mo-Sys. “We are delighted to join forces with CJP, Bendac and Brompton Technology to show how our solutions can make otherwise complex processes and workflows simple and straightforward. Our focus is to allow broadcasters and production companies to fully express their creativity without constraints so that they can deliver flawless, cinematic quality images to content-hungry audiences.”

Mo-Sys will demonstrate StarTracker and highlight the specific benefits of VP Pro XR, its award-winning LED content server solution for Virtual Production. The team will be showcasing VP Pro XR’s immersive toolset, such as Cinematic XR Focus, the ability to seamlessly pull focus between talent in the real-world foreground and virtual objects placed behind the physical plane of the LED wall, deep into the scene. Content creators and broadcasters can learn about Mo-Sys’ unique capability to seamlessly switch multiple cameras within Virtual Production at full UHD4K resolution, XR set extensions and Augmented Reality for a virtual world beyond the boundary of the LED wall. 
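The geometric idea behind pulling focus past the wall plane can be sketched roughly as follows. This is an illustrative assumption about the approach, not Mo-Sys’ actual implementation: the wall distance, function name, and hand-off logic are invented for the sketch. The key constraint is that the LED wall is a flat surface at a fixed distance, so the physical lens can only focus as far as the wall; apparent focus on virtual objects beyond it has to be simulated by the graphics engine.

```python
# Hedged sketch (not the Cinematic XR Focus implementation): splitting a
# requested focus distance between the physical lens and the render engine.

WALL_DISTANCE_M = 4.0  # assumed camera-to-LED-wall distance for this sketch

def split_focus(requested_distance_m: float) -> tuple[float, float]:
    """Split a requested focus distance into a physical lens target and a
    virtual (engine-rendered) defocus target.

    Returns (physical_focus_m, virtual_focus_m).
    """
    if requested_distance_m <= WALL_DISTANCE_M:
        # Real-world subject: drive the lens normally and keep the virtual
        # scene rendered sharp at the wall plane.
        return requested_distance_m, WALL_DISTANCE_M
    # Virtual subject behind the wall plane: the lens holds focus on the
    # wall so its pixels stay sharp, while the engine applies depth-of-field
    # blur as if the camera were focused deeper into the scene.
    return WALL_DISTANCE_M, requested_distance_m

# Racking from talent at 2 m to a virtual object 10 m into the scene:
for d in (2.0, 3.0, 4.0, 6.0, 10.0):
    physical, virtual = split_focus(d)
    print(f"request {d:>4} m -> lens {physical} m, engine {virtual} m")
```

The hand-off at the wall plane is what makes the rack feel continuous to the focus puller: one control, two focus systems.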

In addition, the team will be showing Mo-Sys’ ground-breaking integration with Erizos Studio, which enables data-driven graphics and MOS integration in native Unreal Engine. This development means that broadcasters can now use a single platform to deliver all their graphics and utilise high quality LED wall backgrounds instead of a green screen to eliminate green spill issues completely. 
With its synergistic integration with LED volume providers, such as Bendac, Mo-Sys gives broadcasters and production companies an elegant all-in-one LED volume, multi-camera production system.

PLAZAMEDIA selects Mo-Sys as its primary XR solutions provider

Mo-Sys Engineering today announces that it has extended its long-standing relationship with PLAZAMEDIA GmbH, a subsidiary of Sport1 Medien AG, an established content solutions provider for the entire spectrum of media platforms, and one of the leading producers of sports TV for German-speaking audiences. The content solutions provider, which aims to raise the bar for Virtual Production with its new LED Studio, has chosen Mo-Sys as the primary XR solution provider for the implementation of its LED initiatives.

At the heart of PLAZAMEDIA’s decision is the unmatched capability and functionality of Mo-Sys’ technology and features such as multi-camera switching, Cinematic XR Focus and its latest innovation, the NearTime® on-set re-rendering workflow for ICVFX.

Jens Friedrichs, Chairman of the Management Board of PLAZAMEDIA GmbH, commented: “Mo-Sys delivers all the performance we need to create a leading-edge LED volume studio. They understand the importance of delivering cinematic quality from end-to-end even for broadcast applications and productions for corporate clients – especially with regard to our clear focus on sustainable green production. This, combined with their innovative toolset and the collaborative approach of the Mo-Sys team, made them the unbeatable choice of partner for us.”

The flagship Mo-Sys StarTracker precision camera/lens tracking system is now the technology of choice for leading-edge Virtual Productions. The advanced tools in Mo-Sys’ VP Pro XR content server include set extensions, color-grading, and the ability to pull focus seamlessly between real and virtual objects, made possible by the unique Cinematic XR Focus feature.

NearTime® is Mo-Sys’ patent-pending, award-winning solution for improving real-time VFX image quality in Virtual Production. The solution comprises a cloud-based auto-re-rendering system utilizing Smart Green, a method of separating talent from the LED background without introducing copious green spill, and delivers higher quality real-time VFX content. NearTime® also removes moiré patterning completely and enables the use of lower-cost LED panels while delivering image quality far closer to post-production compositing.

“As pioneers in Virtual Production technology, we are driven by our passion to create tools that help our customers create immersive and engaging content without limiting their creativity,” said Philippe Vignal, Mo-Sys Director of Sales and Business Development, EMEA & APAC. “We are extremely proud that an innovator like PLAZAMEDIA has chosen to place Mo-Sys technology at the heart of its new Virtual Production capabilities.”

Mo-Sys and Ideal Systems Group partner for pan-Asian excellence

Mo-Sys Engineering is working with Ideal Systems to drive forward the adoption of virtual and augmented reality production in broadcast across Asia and the Middle East.

Ideal Systems has been a leader in media solutions for more than 30 years. From its head office in Hong Kong, it brings a wealth of experience together with a huge presence in 10 countries and a proven reach across the whole region. It provides consultancy, design, integration, installation and continuing support for many major names in broadcasting, media production and technology.


Mo-Sys has driven forward the emerging technologies of virtual and augmented production for 25 years. A pioneer in camera tracking with its unrivalled StarTracker system, and in camera robotics, the company now delivers turnkey systems through its VP Pro graphics systems. Mo-Sys has been instrumental in the extensive use of real-time augmented reality and virtual studios in broadcast, and in the growing use of LED walls and LED volumes for movie production and for live events.

“The production of outstanding content is a global business,” said Philippe Vignal, director of sales and business development for Mo-Sys. “It is vital for us that we work with the best possible sales partners, to ensure our ground-breaking innovations are seen by all the key players. Ideal Systems is the perfect partner for us, as they have the broad reach across broadcast throughout EMEA and Asia.”

Jim Butler, CEO of Ideal Systems, added: “Mo-Sys is a very impressive company. Its camera tracking and virtual production technologies are recognised by the biggest names out there, and they have very interesting cutting-edge developments in image robotics and remote production. The whole team is excited to be working with Mo-Sys and to be able to offer our customers revolutionary new solutions.”

Broadcasters and content producers can visit Mo-Sys at CABSAT (17 – 19 May, Dubai World Trade Centre, stand E6-12B). Mo-Sys will have a rich demonstration of its latest VP technologies, featuring StarTracker, its high precision optical camera tracking system, and VP Pro XR, its cinematic content server solution for LED production, along with the Mo-Sys VP Pro integrated augmented reality production system on the Ideal Systems stand A5-1.

Mo-Sys joins AOTO at NAB to showcase Broadcast LED VP Innovation

Mo-Sys Engineering will join forces with AOTO Electronics Co. Ltd., a specialist in LED application products, to showcase LED Virtual Production solutions aimed at broadcast news and factual programming at the NAB Show 2022 on the AOTO booth C3331.  

AOTO LED volume in action

Mo-Sys and partners are demonstrating:

  • XR set extensions and AR (Augmented Reality) for a virtual world beyond the boundary of the LED wall
  • Multi-Cam XR for live broadcast with clean, full-resolution camera/frustum switching
  • Data-driven graphics and MOS integration in native Unreal Engine

In a live LED Virtual Production demonstration, Mo-Sys will showcase its LED content server VP Pro XR and its precision camera tracking system StarTracker, working with AOTO’s 2.3 pitch LED tiles. The demonstration will show how set extensions can be used to expand the studio space, and how the newly released Multi-Cam Switching feature can be used to seamlessly switch between live cameras at resolutions up to UHD4K, without the LED wall delay appearing in shot.


“Creating outstanding content is how broadcasters can differentiate themselves in today’s fiercely competitive marketplace,” said Mo-Sys CEO Michael Geissler. “Bringing cinematic quality Virtual Production techniques into the broadcast environment gives customers the ability to innovate in a highly cost-efficient way and our collaboration with AOTO provides a clear demonstration of how our pioneering technology innovation and their unmatched cinematic quality LED tiles can deliver incredible results for broadcasters.”   

As part of the demonstration, and in a ground-breaking move, Mo-Sys will also show its integration with Erizos Studio, enabling Unreal Engine graphics to be used not just for the virtual studios, but for traditional on-screen graphics, such as lower thirds and data-driven graphics, either embedded in the Unreal scene or as AR objects. Erizos Studio provides the complete broadcast workflow, including industry standard newsroom computer system (NRCS) MOS integration. This development means that broadcasters can now use a single graphics platform to deliver all their graphics and utilize high quality LED wall background instead of a green screen to eliminate green spill issues completely. 

Michael Huang, Senior Account Manager, AOTO Electronics Co. Ltd., commented: “Mo-Sys’ Virtual Production innovation is well known, but this technology is ground-breaking, enabling LED Virtual Production to be used across broadcast whilst still complying with tried-and-trusted workflows.”

Mo-Sys and VFX World partner to offer virtual production solutions

Mo-Sys Engineering has formed a powerful new partnership with VFX World, a leader in on-set VFX solutions for cinematic productions.

LED Volume from VFX World

Mo-Sys brings to the partnership a team of pioneering innovators with over 25 years’ experience of developing virtual production technology, and who today are at the forefront of LED volume innovation. VFX World brings extensive on-set VFX experience from major cinematic productions, blending traditional green/blue screen know-how with the latest in LED volume innovation. Together the two companies present a powerful combination of expertise and knowledge.

At the BSC Expo (7 – 9 April, Battersea Evolution London, stand 158) on the VFX World stand, both companies will show the core components of LED virtual production, and will detail new developments specifically aimed at cinematic LED virtual production.

The stand will feature a ROE Diamond 2.6 LED wall, Brompton SX40 processor, with Mo-Sys providing camera tracking using its widely recognised StarTracker system, and real-time virtual graphics using the Mo-Sys VP Pro XR LED content server.

“Combining experienced on-set VFX crews with products from the leading innovator in virtual production technology represents a powerful resource for productions shooting VFX-heavy content,” said Jem Morton, Director of VFX World.

“We are very excited to be partnering with movie specialist VFX World,” said Michael Geissler, CEO of Mo-Sys. “Together we are far ahead of other offerings in this field, with unique features offering Cinematographers greater creative freedom and improved virtual production imagery.”

The first generation of LED virtual production solutions utilized primarily live events technology, because that was what was available at the time. Mo-Sys’ VP Pro XR LED content server and StarTracker camera tracking system, along with higher quality LED tiles and LED processors, represent a technically superior approach to cinematic LED virtual production, providing increased fidelity and smarter workflows.

At BSC Expo, VFX World will be detailing two Mo-Sys patented technologies. The first is Cinematic XR Focus, a method of pulling focus from talent to virtual objects positioned behind the plane of the LED volume. The second is a unique solution for increasing the composite image quality from a real-time VFX production.

As well as the joint presentation on stand 158, Mo-Sys will also have its own stand, E15, where it will focus on its new G30 heavy-duty gyro-stabilized remote head. Engineered for high quality broadcast and movie work, it features an ultra-stiff frame and oversized high torque motors for precision movement and superior image stabilization.

For virtual production applications the G30 also includes built-in positional encoders, making it fast to set up, intuitive to operate and precise in tracking and stabilization.

Register for BSC Expo 2022 (7-9 April 2022) here >

Mo-Sys Innovation Solves Multi-Camera Switching For LED Volumes at NAB Show 2022

Mo-Sys Engineering today announced that its award-winning cinematic content server for LED volumes, VP Pro XR, has been further improved to incorporate seamless multi-camera switching. The upgrade fully orchestrates multi-camera switching to overcome the typical 5-6 frame refresh delay experienced by all LED volumes when switching between cameras.

Mo-Sys VP Pro XR Multi-Cam
New VP Pro XR multi-camera switching feature to overcome the typical 5-6 frame refresh delay in LED volumes.

The company introduced VP Pro XR, the industry’s first cinematic LED content server solution, ahead of NAB 2021 where it picked up the prestigious Product of The Year award. Designed specifically for use with LED volumes, with or without set extensions, VP Pro XR delivers cinematic standards for LED virtual production.

“Switching between multiple cameras pointing at an LED volume presents a challenge,” explained Michael Geissler, CEO of Mo-Sys. “While the camera outputs can be switched in 1 frame, the virtual scene displayed on the LED volume typically takes 5-6 frames to update. This means on every camera switch there will be 5-6 frames of the previous camera’s virtual scene displayed on the LED volume, before it updates with the correct perspective view for the new camera. Effectively you get a background flash on every camera switch, which is unusable in production.”

With a new version of software for VP Pro XR, and the addition of a simple Blackmagic Design video switcher, VP Pro XR will now orchestrate the delay between the operator switching between cameras and the LED volume updating with the correct perspective view of the live camera. Importantly, it will do this at up to the full UHD4K resolution of the LED processor input, whereas previous workarounds would reduce the resolution of the content to HD to achieve the same outcome.
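The orchestration problem described above can be sketched in a few lines. This is an illustrative assumption about the general approach, not the VP Pro XR implementation: the class, method names, and fixed delay constant are invented. The idea is that when the operator requests a cut, the wall is re-rendered for the target camera immediately, but the switcher cut to air is held back until the wall has caught up.

```python
# Hedged sketch of delay orchestration for multi-camera LED volume switching.

WALL_REFRESH_DELAY_FRAMES = 6  # typical 5-6 frame LED volume update delay

class SwitchOrchestrator:
    def __init__(self):
        self.live_camera = "A"
        self.pending = None  # (target_camera, frames_remaining) or None

    def request_switch(self, target: str):
        # Step 1: immediately re-render the wall for the target camera's
        # perspective, but keep the switcher cut to air on the current camera.
        self.pending = (target, WALL_REFRESH_DELAY_FRAMES)

    def tick(self) -> str:
        """Called once per video frame; returns which camera is cut to air."""
        if self.pending:
            target, remaining = self.pending
            if remaining == 0:
                # Step 2: the wall now shows the target camera's perspective,
                # so the 1-frame switcher cut happens without the previous
                # camera's background flashing in shot.
                self.live_camera = target
                self.pending = None
            else:
                self.pending = (target, remaining - 1)
        return self.live_camera

orch = SwitchOrchestrator()
orch.request_switch("B")
frames = [orch.tick() for _ in range(8)]
print(frames)  # camera A held while the wall refreshes, then a clean cut to B
```

Holding the cut rather than downscaling is what preserves the full UHD4K resolution of the LED processor input.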

“Full resolution multi-cam switching is yet another unique feature of VP Pro XR joining Cinematic XR Focus for pulling focus between real and virtual elements, and NearTime for solving real-time VFX graphics quality,” concluded Geissler. “This follows on from the Cinematic XR objectives originally outlined when we launched our LED content server system last year.”

Mo-Sys Multi-Cam switching for VP Pro XR is available now.

For more information contact the Mo-Sys sales team on Tel: +44 208 858 3205 (EMEA & APAC) or +1 424 374 4011 (The Americas) or email

See the Mo-Sys Multi-Cam in action in this video:

Brand Marketers: What is Virtual Production and Why Would I Use It?

Mo-Sys Engineering’s Philippe Vignal uncovers the exciting opportunities and what brands need to understand about virtual production.

Read the full article published by Little Black Book March 31, 2022

For decades we’ve been shooting talent or products in green screen studios and adding (compositing) photo-realistic 3D graphics in post-production to finalise the shot. This is the foundation of a visual effects (VFX) shot. It was and still is very much a two-stage process. Virtual production in most cases makes this a one-stage process, by enabling the VFX shot to be captured on set, in camera, in real time. The virtual production process saves time and money, and is much more efficient. But it requires more preparation.

The enablers for virtual production are real-time photo-realistic graphics engines (e.g. Epic Games’ Unreal Engine, or Unity’s 3D engine) and precision camera and lens tracking. Using these technologies means that we can put talent or products into any photo-realistic environment we like, whenever we like, and even change the environment at the flick of a button.

Here’s the bottom line. Brands, both big and small, are currently looking at ways to reduce cost and increase the speed at which they are producing content, because human eyeballs have become insatiable and require being fed on a daily basis.

Flying all over the world to different sets across numerous locations and using a plethora of resources as well as a large roster of talent is no longer cutting it. The majority of brands want to move away from this model. And it’s virtual production tools that are paving the way.

Technology advancements mean that we are now able to immerse talent in previously impossible environments, or environments that would be very, very expensive to build with physical sets. And we can achieve this in a highly seamless fashion, in a very cost effective way through virtual production.

No longer tied to the laws of physics, you don’t have to chase that golden hour any more for the perfect shot. You can have that golden hour last all day if you need it to. You can have true flexibility.

In traditional VFX production, you create your finished 3D scenes or assets ready for post-production. In virtual production your finished 3D assets are created in pre-production, ready to be used in full quality for production, meaning that post-production compositing is either removed completely or significantly reduced and optimised. The net gain is that the overall time used to create the finished commercial is reduced.

Above all, virtual production is a craft that allows brands to tell their stories in new and enticing ways. And as more and more creators adopt the technology, now is the time to get to grips with what it entails and understand exactly how you can harness this tool as a brand.

Bring the Set to Life with LED Volumes

Virtual production is all about real-time VFX, and this can be done using a green or blue screen studio, with no screen (for augmented reality), or by using an LED volume, which is all the rage across the feature film and TV sectors.

Shooting in an LED volume involves capturing in-camera visual effects (ICVFX) shots of talent or products against photo-realistic 3D graphics displayed on the LED wall. The virtual graphics displayed on the wall change accurately in real time, providing the correct perspective view to match the tracked camera and its lens settings.
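How the wall's perspective is matched to the tracked camera can be sketched with the standard off-axis ("generalized perspective") projection, a textbook technique for rendering to a fixed planar display from a moving eye point. This is the general technique, not Mo-Sys' specific implementation, and the function name and dimensions are illustrative:

```python
# Hedged sketch: asymmetric view frustum for a planar LED wall, computed
# from the tracked camera position, so parallax on the wall stays correct.

import numpy as np

def wall_frustum(eye, pa, pb, pc, near=0.1):
    """Frustum extents (l, r, b, t) at the near plane for a planar wall.

    pa, pb, pc: wall lower-left, lower-right, upper-left corners (metres).
    eye: tracked camera position.
    """
    pa, pb, pc, eye = map(np.asarray, (pa, pb, pc, eye))
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # wall right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # wall up axis
    vn = np.cross(vr, vu)                      # wall normal, towards eye
    va, vb, vc = pa - eye, pb - eye, pc - eye  # eye-to-corner vectors
    d = -np.dot(va, vn)                        # eye-to-wall distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return l, r, b, t

# Camera centred on a 2 m x 2 m wall: a symmetric frustum.
print(wall_frustum((0, 0, 2), (-1, -1, 0), (1, -1, 0), (-1, 1, 0)))
# Camera tracked 0.5 m right: the frustum skews to keep perspective correct.
print(wall_frustum((0.5, 0, 2), (-1, -1, 0), (1, -1, 0), (-1, 1, 0)))
```

Recomputing this frustum every frame from the tracking data is what makes the flat wall read as a window into a deep 3D scene from the camera's point of view.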

Everyone is asking whether they should be shooting in an LED volume instead of on location. But we must be careful – it is not a cure-all.

Sometimes, location shoots are meant to be the solution. Sometimes shooting in an LED volume is going to be the solution. Sometimes you’ll need to combine an LED volume shoot with a green screen shoot. So brand marketers need to work with content production facilities that are able to offer that set of tools and the knowledge that comes with properly using those tools for virtual storytelling.

When you’re on set shooting virtual content on an LED volume, there’s always a compromise between the complexity of the virtual scene and the power of the workstations and graphics cards that have to render it in real time so that the camera can capture the VFX shot in-camera.

Technology has certain limitations with regard to the ability of graphics cards to calculate in real time, so to maintain real-time playback frame rates on set you sometimes need to compromise by reducing the quality of your virtual elements. The background elements you are immersing your talent into aren’t going to be as high quality as you would want them to be, which is exactly what our patented NearTime rendering system is designed to solve.

During a real-time virtual production shoot with NearTime enabled (LED or green screen), as soon as a take starts, the camera tracking data and lens data are automatically sent to the cloud where the same virtual scene exists on multiple servers.

Using the tracking data, and with all the quality dials now turned up, the virtual scene is re-rendered and returned to set, or to wherever shots are stored. Here it is combined with the separated talent or product, and the composite shot is made ready for approval. The whole process is completely automated and re-rendering costs are minimal. Importantly, NearTime enables a real-time virtual production shoot, with higher quality images, to be delivered in the same time window.
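The NearTime workflow described above can be sketched as a simple pipeline. The data structures and function names here are illustrative assumptions, not Mo-Sys' actual API: the point is that the same per-frame camera and lens tracking data captured during the live take is replayed in the cloud against the same virtual scene, with quality settings maximised because there is no longer a real-time rendering budget.

```python
# Hedged sketch of an on-set capture + cloud re-render pipeline (NearTime-style).

from dataclasses import dataclass

@dataclass
class TrackedFrame:
    position: tuple   # camera position (x, y, z) in metres
    rotation: tuple   # camera rotation (pan, tilt, roll) in degrees
    focus_m: float    # lens focus distance
    zoom_mm: float    # lens focal length

def record_take(frames: list[TrackedFrame]) -> dict:
    """On set: capture tracking/lens data alongside the real-time render.
    The scene identifier is an invented placeholder."""
    return {"tracking": frames, "scene": "stage_scene_v12"}

def neartime_rerender(take: dict, quality: str = "max") -> list[str]:
    """In the cloud: replay the same camera path through the same scene,
    with render quality turned all the way up (no real-time constraint)."""
    return [f"frame rendered at {quality} quality "
            f"(focus {f.focus_m} m, {f.zoom_mm} mm)"
            for f in take["tracking"]]

take = record_take([TrackedFrame((0, 1.6, -3), (0, 0, 0), 2.5, 35.0)])
rendered = neartime_rerender(take)
print(rendered[0])
```

Because the camera path is data rather than footage, the re-render reproduces exactly the framing the director approved on set, just at higher fidelity.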

Shoot for Global Markets from Just One Location

One of the biggest benefits of virtual production is that you don’t have to be on location for shooting content. Not only that, but you can very easily change your virtual location background and use Unreal-based 3D elements to help with versioning work.

Take brands such as P&G or Unilever, who often need to create large numbers of versions for one product. With virtual production, they can shoot a piece of content for their European markets in one background, then change the location and product label virtually to shoot for their Asian and American markets – all from one place during the same shoot.

You can also change the actual product – the packshot – and map on different local labelling, perhaps using the same actor to do multiple language versions through artificial intelligence.

Suit All Budgets

With developments in tech, virtual production is now more accessible than ever. We now see certain phones built with the capability to LiDAR-scan real objects for import into a 3D environment. Only a few years ago this required very specialised tools, but now it’s contained inside a handheld device. This accessibility of tech has brought the cost down noticeably.

This is also true of software. We’re now well versed in software that used to be reserved for the gaming industry – things like Unreal and Unity game engines that have become ubiquitous in the sphere of advertising, feature and broadcast content production.

You can now also generate your own avatar to be immersed into virtual environments and interact with real elements. This would have taken a lot of production work a few years ago, and now it is much more readily accessible to a wider audience.

With the onset of this advanced technology packed into handheld devices, creators like bloggers have even started to adopt virtual production to combine with real-time motion capture. This is made possible through products like Mo-Sys’ StarTracker Studio, the world’s first all-in-one virtual production system for green screen production which comes with all the necessary cameras, trackers, software, and hardware to do virtual production, even where studio space is limited.

So, How Should A Brand Prepare for A Virtual Production Shoot?

At the base of all brand communication is creativity. And at the base of all virtual production is creativity. That will never change. Virtual production is not a cure-all or a way to cut corners. If your content is devoid of creativity, it will be as barren in a virtual production environment as it will be in a live production environment. So you need creative people on your team who are technologically versed in Unreal and who understand production.

We are now seeing an increasing presence of hybrid profiles – creative people who also have technical knowledge. They understand Unreal and how it can be used in a production environment. They understand film or broadcast. They need to be curious enough to have trained in film and storytelling in these sorts of environments.

It is also important to remember that virtual production is a team sport. You cannot achieve it on your own, just as you cannot achieve live production on your own. It is a craft. Brands need people that understand virtual production in the same way as they need people on their teams that understand digital marketing. They both address the increased speed of content production needed.

Work with highly trained, highly skilled, curious individuals that can help you understand what’s entailed and who will help you liaise with either your own production departments in-house, or with your production and creative agencies.

Whatever you do, be well informed. Do not rely blindly on external knowledge. Everyone needs to be well informed of what the technical requirements are with virtual production. Work with consultants such as the ones we have at Mo-Sys who can help you navigate this new world of storytelling and technologies.

We are seeing a rapid adoption of virtual production tools. In London alone, there are over a dozen large LED volumes currently being built. We’re going to see increased use of virtual characters, avatars, AR content and entertainment, and hybrid use of real-life humans interacting with virtual humans.

People are curious by nature, and we will always try to explore new things. Right now we’re fascinated with being able to reproduce ourselves as meta humans in the virtual world. Where that’s going to take us, I don’t know, but I do hope we keep our common sense whilst doing this and don’t forget our humanity – the real purpose of this is to connect with other humans, not to isolate ourselves from the rest of humanity.

Curiosity combined with storytelling will be the focus over the next thousand years or so.

Mo-Sys to set up virtual production hub in London

Mo-Sys recently spoke with The Hollywood Reporter about its plans to renovate and build virtual production stages and a virtual production research center. The Plumstead Power Station in Greenwich, which has sat unused for 50 years, will house eight stages, serve as the home of a new virtual production festival, and become Mo-Sys’ UK headquarters.

The Plumstead Power Station in Greenwich, which Mo-Sys will convert into a virtual production hub.

According to Mo-Sys CEO Michael Geissler, the company — which last year won a Hollywood Professional Association Award for Engineering Excellence for its NearTime rendering workflow technology — plans to invest in the region of $7.2 million in the Plumstead Power Station in Greenwich and has secured a grant of around $5.5 million from local authorities as part of efforts to regenerate the area.


Read the full article published by The Hollywood Reporter March 24, 2022