Vū LED Ceiling Panels for Virtual Production

Diamond View Studios and Mo-Sys Technologies Create First in-Panel Tracking Solution for LED Volumes

Watch Diamond View Studios CEO Tim Moore introduce the new Vū LED Ceiling Panels in the video below:

Mo-Sys has been the most incredible technology partner because they’re always optimizing…

To have all these key features in one panel really makes it the most comprehensive LED ceiling panel on the market. This allows filmmakers to not worry about the technology and really just express their creativity.

Tim Moore, CEO of Diamond View Studios

Read more about our partnership with Diamond View Studios here.

The Future of Virtual Production and Filmmaking with Holographic Display

Guest blog by Mo-Sys partner company VividQ

The gaming industry was one of the earliest adopters of new immersive technologies, including virtual reality (VR) and augmented reality (AR). The shared use of CGI, VFX and game engines for content creation meant that game developers could easily pivot to creating groundbreaking new experiences for consumer audiences. With the increased use of CGI and VFX in movies, filmmaking is another creative industry starting to realise the great promise of immersive technologies. In AR display and tracking in particular, recent developments are likely to set the path for virtual production to take over filmmaking as we know it today.

“If everything moves along and there’s no major catastrophe, we’re heading towards holograms.”

Martin Scorsese on the future of filmmaking (2011)

Within the traditional filmmaking process, directors and film crew shape a project’s scope at three stages: before the film is produced, while it is being shot, and during post-production. With the rise of new technological developments, however, directors are increasingly using ‘virtual production’ to combine live-action footage and computer graphics in real time. In an episode of the Unreal Engine ‘Visual Disruptors’ podcast, David Morin, Head of Epic Games LA Lab and chairman of the Virtual Reality & Virtual Production Committees, defined virtual production as “the process of creating a digital world which starts at the inception of a movie and ends with the final visual effects, centred on a real-time interaction on set.”

Using virtual production techniques, directors not only have the creative flexibility to make quick, on-set decisions during production, but also gain an entire process for creating a virtual world from scratch. In the pre-visualisation stage of a film’s production, a director or creative team can view visual renders of a specific scene. By visually mapping out scenes with virtual production techniques, directors and set designers have more control over the decisions that shape the overall style and mood of their content.

Introducing Computer-Generated Holography into the creative process

By engineering light, computer-generated holography (CGH) enables a user to experience digital content that appears at different distances. These holographic images are inherently three-dimensional and can be projected through a head-mounted device (similar to the Microsoft HoloLens) or holographic AR smartglasses.

So how can this next-generation display technology be deployed in the movie industry? Imagine the following scenario. A film director and set designers are working on a new film and are in the early pre-visualisation stages of their project. The crew all wear holographic AR smartglasses to visualise the virtual set and collectively make crucial decisions before the physical set is built. The physical space they are in serves as a blank creative canvas, and the set designers can easily scan the room and map out all the surfaces they can use. The director can place, move and customise highly realistic holographic objects around the space in real time, taking important information such as lighting and the actors’ placement into consideration. Normally, the director would need to commission new props or set designs at this stage, costing valuable time and money. By using holographic AR smartglasses instead, they can make smart, performance-based decisions and accurately tailor their shots on set, eliminating the need for lengthy post-processing or expensive reshoots.

There are a few reasons why CGH meets the requirements for virtual production to be successful:

Accurate depth of field

Depth of field – the apparent defocus between virtual objects at different distances – is hugely important for any long-term consumer or enterprise AR use case. Without it, virtual images in AR do not integrate naturally with the real world and appear disconnected from their surroundings. The viewer needs to be able to focus and defocus their eyes depending on where the digital content is placed, for example between the foreground and the background. Today, most 3D and AR devices rely on stereoscopic displays, which provide binocular depth cues but no focus (accommodation) cues: the virtual objects all lie on a single focal plane, causing a feeling of disconnect and, often, nausea.

In virtual production, depth matters for placing virtual ‘props’ accurately around the scene. AR devices using CGH can match the environment precisely because advanced tracking systems let the digital content appear at any depth plane. Integrating tracking systems into AR wearables gives users freedom of movement and flexibility around a scene while preserving full depth. Tracking technologies, such as the Mo-Sys StarTracker VR, allow limitless, multi-user free-roaming experiences within AR and VR environments. StarTracker uses an inside-out system: the device finds its own position in space by tracking small reflective stickers in real time. This allows virtual objects to be tracked and placed at accurate depths over a large physical distance in real time in a virtual production scenario. Virtual objects can be ‘docked’ in physical and virtual space, meaning they remain frozen in place when the user looks or walks away. Accurate and reliable placement of virtual objects allows a director to see exactly where the eventual film viewer will be focusing their attention.
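
To make the idea of ‘docking’ concrete, here is a minimal sketch (ours, not Mo-Sys or VividQ code) of the underlying maths: once a tracker reports the camera’s pose each frame, a virtual prop anchored at a fixed world position can be re-expressed in camera coordinates, so it stays pinned to the room however the wearer moves.

```python
# Minimal sketch of 'docking' a virtual object, assuming the tracker reports
# the camera pose as a rotation matrix R (world -> camera axes) and a camera
# position t in world coordinates. Names are illustrative only, not part of
# any Mo-Sys or VividQ API.
import numpy as np

def world_to_camera(point_world, R, t):
    """Transform a world-space point into camera space: p_cam = R @ (p_world - t)."""
    return R @ (np.asarray(point_world) - np.asarray(t))

# A prop 'docked' 2 m in front of the room origin at eye height stays fixed in the room.
prop_world = np.array([0.0, 1.6, 2.0])

# Frame 1: camera at the origin, looking straight ahead (identity rotation).
print(world_to_camera(prop_world, np.eye(3), np.zeros(3)))

# Frame 2: the user has walked 1 m to the right; the prop's camera-space
# position changes accordingly, while its world-space anchor does not.
print(world_to_camera(prop_world, np.eye(3), np.array([1.0, 0.0, 0.0])))
```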

Realistic Image Quality from a variety of 3D Data Sources

When we view the world around us, we are actually observing complex patterns of light reflected from objects. Our visual systems interpret these complex light patterns as the objects we see. Using holography, we can engineer the exact complex light pattern that is reflected off a real object and display it to an observer. The engineered light pattern is visually indistinguishable from the real object. Computer-generated holography creates holograms digitally from various 3D data sources such as game engines, depth-sensing cameras and CAD software. Therefore, this method uses software to create a light pattern of an object without using a real object at all. This allows the creation of dynamic and interactive holographic virtual images.
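
As an illustration of the principle (a textbook point-cloud approach, not VividQ’s proprietary pipeline), the sketch below computes a phase-only hologram from a handful of 3D points: each point contributes a spherical wavefront, and the displayed pattern is the phase of their sum.

```python
# Illustrative sketch only: a textbook point-cloud method for computing a
# phase-only hologram, not VividQ's pipeline. Each 3D point contributes a
# spherical wavefront; the hologram is the phase of their superposition.
import numpy as np

wavelength = 520e-9            # green light, metres
pitch = 8e-6                   # pixel pitch of the phase modulator, metres
nx, ny = 1024, 1024            # hologram resolution

x = (np.arange(nx) - nx / 2) * pitch
y = (np.arange(ny) - ny / 2) * pitch
X, Y = np.meshgrid(x, y)

# Virtual scene: a few points at different depths (x, y, z in metres).
points = [(0.0, 0.0, 0.20), (0.001, -0.0005, 0.25), (-0.0015, 0.001, 0.30)]

field = np.zeros((ny, nx), dtype=complex)
for px, py, pz in points:
    # Fresnel (paraxial) approximation of the spherical wave from each point.
    phase = (np.pi / (wavelength * pz)) * ((X - px) ** 2 + (Y - py) ** 2)
    field += np.exp(1j * phase)

hologram_phase = np.angle(field)   # phase pattern to send to the light modulator
```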

Using software tools for 3D creation and design such as Unity and Unreal Engine, holography offers an easy way to create, use and customise high-quality virtual objects in real-time. This allows directors and their creative teams to have more freedom to make set design choices on an open canvas.

CGH also benefits from superior brightness, a wide colour gamut and high image contrast compared to other currently available AR headsets, producing high-quality holographic images that are visible at all lighting levels with ease and comfort.

The future of filmmaking

As holographic displays are integrated into the next generation of consumer devices, we will see the creation of complex virtual objects, scenes and characters that can be directed in the same way as real actors. In the future, CGH will be used not only for virtual production purposes but for storytelling itself. The dream of a holodeck will one day become a reality, inspiring the next generation of directors following Scorsese’s vision.

The future of virtual production and holographic filmmaking is nearer than we think. Thanks to the collaboration between VividQ and Mo-Sys, we are moving closer to a revolution in the film and entertainment industry.

Click here to learn more about our partnership with VividQ.

Click here to read the VividQ whitepaper ‘Holography: The Future of Augmented Reality Wearables’.

This article was originally posted on the VividQ website here.

Mo-Sys forges AR partnership with VividQ

Precision positioning of 3D models in augmented reality devices using computer-generated holography

Mo-Sys has partnered with VividQ, pioneers in computer-generated holography for next-generation augmented reality (AR) displays. The combination allows 3D holographic projections to be placed precisely in real space, enabling users of future AR devices, such as smart glasses, to explore virtual content in context with the natural environment.

Mo-Sys StarTracker is a proven and powerful camera tracking technology, widely used in television production and other creative environments for applications from virtual studios to real-time set extensions. It provides the camera’s precise position in XYZ space, together with its full rotation.

VividQ software for computer-generated holography is used in innovative display applications from AR wearables, to head-up displays. Holography – the holy grail of display technologies – relies on high-performance computation of complex light patterns to project realistic objects and scenes, for example in AR devices. VividQ generates holographic projections which, thanks to the precision location of Mo-Sys, can be displayed to the user at the correct place in the real environment. This is a major advance on today’s AR devices where flat (stereoscopic) objects are mismatched with the real world. By presenting holographic projections with depth, the user’s eyes can focus naturally as they scan the scene. 

“The possibilities and applications of augmented reality in realtime devices are only just being explored,” said Michael Geissler, CEO of Mo-Sys Engineering. “We are at the cutting edge of camera tracking; VividQ is at the cutting edge of computer-generated holography, and we are excited to work together to bring some of these concepts to reality.” 

Darran Milne, CEO of VividQ, added: “Our partnership with Mo-Sys is key to understanding the potential of computer-generated holography in future AR applications, developing experiences where virtual objects can blend seamlessly into the real world.”

A fresh approach to virtual production

Guest blog by Tyler Mayne, CEO of Mo-Sys partner company Evolve Technology

My company’s background is in supporting the live events industry, and we’ve been in the business for over a dozen years now. Our goal is to be our customers’ silent partner, offering them the tools and technologies necessary to help them grow their businesses. We do this by offering equipment for rent, sale and lease, backed up by our amazing technical team and our Academy, which offers a factory-certified training curriculum. In addition, we work closely with key manufacturers to carefully evaluate the best equipment for our customers’ needs.

What we are seeing now is a tendency for techniques from our industry to cross over into movie and television production. Of course, studios are already experienced in using virtual elements and set extensions. But they have always done it in post: shoot the live action against a green screen and composite it all later, in a costly and time-consuming separate phase.

The new interest is in doing it live, the way we have been doing it for years. Studios are kitting themselves out with high-resolution LED screens and precision motion tracking, so they can create rich visuals in camera, with a minimum of post. That is going to make production faster and more cost-effective.

What it depends on, though, is having the right technologies, and having people who understand the real implications of working on a film set. If you have a handful of Hollywood A-listers on set, you cannot afford to keep them hanging around while a bunch of technicians struggle to get it all working and aligned. 

The best technologies are those which will reliably deliver with the minimum of set-up. Take motion tracking as an example. 

This is absolutely central and critical to virtual and extended reality production. If the computers do not know, to a very fine degree of accuracy, where the cameras are and where they are looking, then the virtual images will not stay in the right place and the illusion will be instantly lost. 
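
A rough back-of-the-envelope example, using numbers we have chosen purely for illustration, shows why the accuracy requirement is so unforgiving: even a tenth of a degree of orientation error produces a visible slide of the virtual image.

```python
# Back-of-the-envelope illustration (our assumed numbers, not Mo-Sys specs):
# how a small orientation error in tracking shows up on screen.
image_width_px = 3840            # UHD sensor width
horizontal_fov_deg = 40.0        # assumed lens field of view
orientation_error_deg = 0.1      # assumed tracking error

# For a distant virtual object, angular error maps almost linearly to pixels.
drift_px = orientation_error_deg / horizontal_fov_deg * image_width_px
print(f"{drift_px:.0f} px of drift")   # ~10 px, clearly visible on a cinema screen
```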

At Evolve, we say we are technology agnostic: we are only interested in stuff that works. We do the research, so you don’t have to. And, when it comes to motion tracking, we think Mo-Sys StarTracker works really well! 

In its basic form, it needs no specialist staff or complicated set-up. You stick the stars (they’re reflective dots) on the ceiling, the tracking cameras and software find them, and off you go. Anyone capable of finding the power-on switch can drive it. 

To get the very best out of the system you need to calibrate the lens to the software, so it understands the precise effect of focus and zoom on that particular piece of glass. We can teach a competent technician, who will be on set anyway, to do that in a day. That’s a skill they then have for life. 
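
Conceptually, a lens calibration boils down to a lookup table: measure how the lens behaves at a grid of zoom and focus positions, then interpolate at shoot time. The sketch below is a simplified, hypothetical illustration of that idea, with invented numbers; it is not the actual Mo-Sys calibration format.

```python
# Hypothetical sketch of what a lens calibration table boils down to: mapping
# zoom and focus encoder positions to a field of view the graphics engine can
# use. The numbers below are invented for illustration.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

zoom_steps = np.array([0.0, 0.5, 1.0])            # normalised zoom encoder values
focus_steps = np.array([0.0, 0.5, 1.0])           # normalised focus encoder values
fov_deg = np.array([[62.0, 61.5, 61.0],           # measured horizontal FOV at each
                    [35.0, 34.6, 34.2],           # (zoom, focus) combination
                    [12.0, 11.8, 11.6]])

fov_lookup = RegularGridInterpolator((zoom_steps, focus_steps), fov_deg)

# At shoot time the tracker streams encoder values; the engine asks for the FOV.
print(fov_lookup([[0.25, 0.8]]))   # interpolated FOV for an intermediate setting
```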

Mo-Sys is not the only motion tracking game in town. But what it does offer – apart from all the accuracy you need – is this simple user interface. Some of the others look fine on paper, until you realise you need a technician to come in and calibrate the system for you. That means more cost, more inconvenience (especially at the moment, when international travel is not at all easy) and more sources of delay when your leading actors are ready to go but one small, critical piece of technology needs an expert to set it up – an expert who can’t get to you until a week next Thursday. Much better to have the skills in-house.

In an industry of constant change and innovation, we feel that training is an essential part of being a well-rounded, knowledgeable professional. Through our Academy, Evolve provides best-in-class education series, lectures and product demos to enhance expertise in our ever-changing industry, including the use of Mo-Sys StarTracker. We look forward to rolling out our new suite of classes, designed to help live event technicians transition to the exciting virtual Film/TV market, early this year.

Evolve Technology chooses Mo-Sys StarTracker

Leading US live events specialist chooses precision camera tracking from Mo-Sys to add to its inventory and training schedule

Mo-Sys Engineering, world leader in precision camera tracking solutions for virtual studios and augmented reality, has established a partnership with Evolve Technology, which includes an initial purchase of several Mo-Sys StarTracker camera tracking systems. Evolve is a market leader in live events and virtual productions in the US, providing independent technical advice, sales and rental, and training workshops. 

The Mo-Sys StarTracker system is unique in using “stars” (reflective dots) positioned at random, usually on the ceiling or lighting grid of the studio or venue. Once the star map has been learnt – an automatic process – StarTracker provides ultimate precision in camera position and orientation through all six degrees of freedom. In conjunction with lens data, this means a virtual environment system, such as the Unreal graphics engine, can be matched to the real images from the cameras with perfect stability.
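
As a simplified illustration of that matching step (generic graphics maths, not the actual StarTracker-to-Unreal integration), the tracked six-degree-of-freedom pose and the calibrated lens field of view are essentially all a render engine needs to build its virtual camera:

```python
# Simplified sketch of the matching step: turning a tracked 6-DoF pose plus
# lens data into the view and projection matrices a graphics engine needs.
# The conventions here are generic OpenGL-style, chosen for illustration;
# a real Unreal integration uses the engine's own camera API.
import numpy as np

def view_matrix(R, t):
    """World-to-camera transform from rotation R (3x3) and camera position t."""
    V = np.eye(4)
    V[:3, :3] = R
    V[:3, 3] = -R @ np.asarray(t)
    return V

def projection_matrix(h_fov_deg, aspect, near=0.1, far=1000.0):
    """Perspective projection from the calibrated horizontal field of view."""
    f = 1.0 / np.tan(np.radians(h_fov_deg) / 2.0)
    return np.array([
        [f,   0.0,        0.0,                          0.0],
        [0.0, f * aspect, 0.0,                          0.0],
        [0.0, 0.0,        (far + near) / (near - far),  2 * far * near / (near - far)],
        [0.0, 0.0,        -1.0,                         0.0],
    ])

# Each frame: pose from the tracker, FOV from the lens calibration, then render.
V = view_matrix(np.eye(3), [0.0, 1.7, 3.0])
P = projection_matrix(h_fov_deg=40.0, aspect=16 / 9)
```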

Mo-Sys is the latest of Evolve’s “Trifecta” partnerships (rent, buy and learn). As well as having Mo-Sys StarTracker systems available for rent or purchase, they will also be used as a training aid and will play a key role in virtual production and extended reality workshops. Evolve also plans highly targeted courses on StarTracker lens calibration. These courses are expected to begin in April. 

“Our job is to keep up-to-date with the best technology, doing all the legwork so you don’t have to,” said Tyler Mayne, CEO of Evolve Technology. “We looked at the market, and we saw Mo-Sys StarTracker as the versatile, user-friendly and cost-effective solution. We’re confident in pitching Mo-Sys because it’s a really robust platform.” 

One trend Evolve is already seeing is the increasing use of technologies from the live events industry, such as augmented and extended reality and LED panels, on film shoots, with VFX houses taking a key role in shooting and compositing on set in real time rather than compositing in post-production.

“We can make things easier for the studios, because they are in the business of telling stories, not technology,” explained Mayne. “We can sell, rent or lease them the entire eco-system, based on the best brands, like Mo-Sys, which is why we are looking at tightening our partnership even further.” 

To boost support for this emerging studio business, Evolve is building its first lab in Atlanta, to be shortly followed by a second on the West Coast. These will be fully fitted studios where Evolve will be able to demonstrate the technology, as well as giving VFX houses a place to test out their assets before the shoot.

“We are very excited by this partnership with Evolve,” confirmed Michael Geissler, CEO of Mo-Sys Engineering. “They are a perfect fit for us: not only are they across all the technologies, but they understand the rapid changes in the industry and the way that extended and augmented reality is developing into new areas. We look forward to solving many new challenges with Tyler and his team.”

First In-Panel Tracking Solution for LED Volumes

TAMPA, FL, Jan. 06, 2021 – Today, five-time Emmy award-winning video agency Diamond View debuted “Vū LED Ceiling Panels,” a new patent-pending translucent LED ceiling panel technology that features retro-reflective technology integrated directly into the panel, making it the world’s first in-panel tracking solution for LED Volumes. The camera tracking technology and the design of the retro-reflective markers for Vū panels come from Mo-Sys Engineering, a global leader in virtual production products for the film, broadcast and entertainment markets.

Earlier this month, Diamond View announced the construction of its new 240-foot LED Volume, which features the Vū LED Ceiling Panels, in partnership with Unreal Engine, Sony, Mark Roberts Motion Control, NVIDIA, Aputure Lighting and Mo-Sys Engineering. 

The translucent ceiling panels feature a newly engineered pixel suspension system that allows light and theatrical effects to pass directly through the back of the panels without affecting the image quality of the video. Each Vū LED panel is less than half the weight of competitor LED panels and uses ultra-bright 4500-nit LEDs. The panels have been optimized for ceiling use and reduce the overall heat load and acoustical challenges that are common in closed LED Volumes. The new panel design also comes equipped with fully integrated infrared tracking reflectors on each panel, which triangulate the camera’s positional data with the Mo-Sys StarTracker system without obscuring the LED pixels.

Mo-Sys’ StarTracker system is an absolute optical tracking system that has become popular with the rise in demand for virtual production because of its ability to produce accurate and reliable positional data from a single three-inch optical reader placed on the camera.
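
For readers curious what ‘absolute’ optical tracking involves at its core, the sketch below shows a generic, textbook pose-from-markers step using OpenCV, with invented marker positions and intrinsics. It is not Mo-Sys’s algorithm, but it illustrates how a camera’s position and orientation can be recovered every frame from a few known reference points.

```python
# Generic illustration of recovering a camera pose from known markers (PnP);
# marker coordinates, sensor intrinsics and the simulated pose are invented.
import numpy as np
import cv2

# Known 3D positions of reflective ceiling markers (metres, world frame, z = ceiling height).
marker_world = np.array([[0.0, 0.0, 4.0],
                         [1.0, 0.2, 4.0],
                         [0.3, 1.1, 4.0],
                         [1.4, 1.3, 4.0]], dtype=np.float32)

# Intrinsics of the upward-facing tracking sensor (assumed calibrated beforehand).
K = np.array([[800.0, 0.0, 512.0],
              [0.0, 800.0, 384.0],
              [0.0, 0.0, 1.0]])

# Simulate a frame: a sensor 1.8 m above the floor, looking straight up,
# sees the markers at these pixel positions.
true_rvec = np.zeros(3)
true_tvec = np.array([-0.5, -0.5, -1.8])   # corresponds to a sensor at (0.5, 0.5, 1.8)
marker_image, _ = cv2.projectPoints(marker_world, true_rvec, true_tvec, K, None)

# The tracking step: recover position and orientation from the detections.
ok, rvec, tvec = cv2.solvePnP(marker_world, marker_image, K, None)
R, _ = cv2.Rodrigues(rvec)                 # rotation matrix (world -> sensor)
camera_position = (-R.T @ tvec).ravel()    # sensor position in world coordinates
print(camera_position)                     # ~ [0.5, 0.5, 1.8]
```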

Michael Geissler, CEO of Mo-Sys Engineering, comments: “We’re delighted to have worked with Diamond View on their new Vū panel. This will enable cinematographers to have more creative freedom with the camera angles they choose inside an enclosed LED volume, and with additional creative lighting shining through the LED volume ceiling.”

Diamond View’s CEO Tim Moore says, “With new LED Volumes being built all around the world each month, we believe these newly released Vū Ceiling Panels are positioned to be one of the most comprehensive LED ceiling panels on the market.” 

The Diamond View team spent the last six months in development of its new custom ceiling panels and partnered with the team at Mo-Sys to seamlessly integrate the tracking technology into the design.

Vū LED Ceiling Panels are available for purchase at www.vupanels.com.