A beginner’s guide to optical and mechanical tracking

It can be difficult to determine which type of tracking system is best suited to your production if you do not understand the differences between them. In this article, we compare the most popular types of tracking system deployed across the broadcast industry for augmented reality graphics and virtual studios. In particular, we’ll look at mechanical and optical tracking systems and give you a few beginner guidelines to bear in mind.

Mechanical tracking systems

Mechanical tracking systems are based around measuring sensors, or encoders, attached to a crane, camera head or pedestal. These encoders give angular and positional measurements that allow an operator to accurately track the position of the camera.

Usually used for live sports coverage or outside broadcasts, mechanical systems are robust, reliable and offer fine, high-resolution accuracy. A mechanical tracking system such as our e-Crane or the Jimmy Jib Tracking kit can offer resolutions of more than 1,000,000 ticks per 360º turn, meaning there is little or no tracking noise. This is perfect for AR graphics when broadcasting live in UHD.
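To put that figure in context, here is a minimal sketch (in Python, with an illustrative tick count based on the number quoted above) of how raw encoder ticks convert into a pan angle; real systems expose this through their own encoder specifications and SDKs.

    # Minimal sketch: converting relative encoder ticks to a pan angle.
    # TICKS_PER_REV is illustrative, based on the ~1,000,000 ticks per
    # 360º turn quoted above; the real value comes from the encoder spec.
    TICKS_PER_REV = 1_000_000

    def ticks_to_degrees(tick_count: int) -> float:
        """Convert a raw encoder tick count to an angle in degrees."""
        return (tick_count % TICKS_PER_REV) * 360.0 / TICKS_PER_REV

    # Each tick spans 360 / 1,000,000 = 0.00036º, far finer than anything
    # that would show up as visible graphics jitter, even in UHD.
    print(ticks_to_degrees(250_000))  # -> 90.0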

Red Bull Air Race 2015: V70 head with Mo-Sys bolt-on tracking

These systems do, however, require regular homing and alignment, as they tend to drift over long periods of time; homing ensures the camera’s reported position stays aligned to the virtual or augmented graphics. The drift occurs because incremental (relative) encoders have no awareness of their absolute position when switched off and on again. Each axis must be manually homed from 0º to the correct angle on every use. Naturally, this can be a source of frustration for camera operators and can take up a lot of time on set.
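To illustrate why homing is unavoidable with incremental encoders, here is a small, purely hypothetical Python sketch (the class and names are ours, not any vendor’s API): the encoder only counts ticks since power-on, so an offset must be captured against a known reference before its readings are usable.

    # Illustrative sketch: an incremental (relative) encoder only counts
    # ticks since power-on, so each axis must be homed against a known
    # reference angle before its readings mean anything.
    class RelativeEncoder:
        def __init__(self):
            self.raw_ticks = 0       # resets to zero on every power cycle
            self.home_offset = None  # unknown until the axis is homed

        def home(self, reference_ticks: int):
            """Capture the offset while the axis rests at a known mark."""
            self.home_offset = reference_ticks - self.raw_ticks

        def absolute_ticks(self) -> int:
            if self.home_offset is None:
                raise RuntimeError('axis not homed: position unknown')
            return self.raw_ticks + self.home_offset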

Backlash can also become a problem when using mechanical tracking. It is especially noticeable when changing the direction of the camera, which is why camera operators tend to use slow panning movements with minimal tilt. Moving in this way, operators can avoid the backlash that causes the graphics to drift.
It is possible to use a mechanical tracking system with relative encoders for lens data, but it is very important to mount the encoders securely to avoid backlash and slipping.

Having said this, once a mechanical tracking system has been aligned and homed, and is used in an appropriately cautious way, it can be one of the most robust and reliable ways to track cameras for live broadcast.

Optical tracking systems

Where mechanical tracking systems are restricted in terms of base movement, optical tracking systems such as StarTracker truly excel, offering far greater freedom of movement across an entire studio. Below, we focus on three major varieties of optical tracking system: outside-in tracking, and two inside-out systems, free-d and Mo-Sys StarTracker.

Outside-in

Outside-in tracking uses a ring of camera sensors to create a sweet spot for tracking. This sweet spot is the only area that can be tracked and, because it occurs only in a specific area, wall-to-wall tracking is not possible with this method.

Outside-in tracking is restrictive and requires a tracking specialist on site to assist with the calibration. It is, however, an absolute tracking technology, which means that no drift will occur and, once set up, it will be ready to use for a prolonged period.

As expected, outside-in tracking is an expensive technology, with the price increasing in line with the area that needs to be tracked, as more camera sensors are needed. And while these systems require little calibration while in operation, they require time-intensive recalibration every time they are set up.

Inside-Out: free-d

Developed by the BBC in 1996, the free-d optical camera tracking system was the first absolute optical tracking system for AR and virtual studio applications. Ahead of its time, free-d was ground-breaking and very successful, and until recently it was still available as free-d2 from Shotoku.

Unlike outside-in systems, free-d used a single camera mounted on a broadcast camera to track circular barcode fiducials on the ceiling. These fiducials enabled their centres to be found from any angle, reporting all six degrees of freedom of the camera position and therefore enabling more freedom than before.

However, by modern standards free-d required a fairly complex set-up, with each unique barcoded target needing to be mounted on rigid metal poles and plates suspended from the ceiling. While only five targets needed to be visible for 1mm-level tracking, the relatively large size of each target meant the density of targets had to be relatively high to ensure visibility despite occlusion from ceiling lights. This made repositioning lights more complicated and potentially required a degree of recalibration if targets were knocked out of position. In addition, the limited computing power available at the time meant custom processing hardware had to be built, which in turn made it difficult to develop the free-d system further.

Inside-Out: StarTracker

Instead of the coded fiducials used by the free-d system, the Mo-Sys StarTracker technology is an inside-out system which uses a constellation of small, identical retro-reflective markers (‘stars’) which are not encoded. A small LED sensor, mounted on the studio camera, shines light onto the stars. This defines the star map, which allows the StarTracker processor unit to report the position and orientation of the studio camera in real time to the rendering engine.
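What the tracker hands to the rendering engine each frame is a six-degree-of-freedom pose. As a rough illustration only (the field names and units below are our assumptions, not the actual StarTracker protocol), the data looks something like this:

    # Hypothetical illustration of a per-frame 6-DoF camera pose as an
    # optical tracker might report it; not the real StarTracker format.
    from dataclasses import dataclass

    @dataclass
    class CameraPose:
        x: float          # position in metres, studio coordinates
        y: float
        z: float
        pan: float        # orientation in degrees
        tilt: float
        roll: float
        timestamp: float  # lets the engine delay-match graphics to video

    pose = CameraPose(x=2.1, y=1.6, z=4.0,
                      pan=31.5, tilt=-4.2, roll=0.1, timestamp=0.02)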

The main appeal of StarTracker is its ease of use and freedom of movement. It is an absolute tracking system that can track any space wall-to-wall, and it can be set up and used entirely by the camera operator. It is also far cheaper than any complex outside-in tracking. Another appeal of StarTracker is that the tracking area can be extended by simply putting up more stickers.

To date, StarTracker powers more than 100 broadcast studios across the world. The biggest broadcasters often choose StarTracker as it does not require a tracking operator. Once it is set up, it just tracks and maintenance is minimal.

Which tracking is most suitable for you?

Determining whether a mechanical or an optical tracking system is better for you will depend on your situation. An optical tracking system gives you more freedom and is absolute, meaning it needs little to no homing after the initial set-up and calibration. Mechanical tracking, on the other hand, is usually more complex and requires regular homing, but once in use it is a robust, solid option for tracking.

When tracking outdoors, many studios will opt for mechanical over optical tracking due to the difficulty in placing optical markers outside. Marker-less optical tracking systems which use natural markers outdoors are currently available but are heavily affected by changing lighting.

For ease of use and reasonable pricing, optical tracking systems like StarTracker may be most suited to your in-studio production.

If you need more assistance in determining which kind of tracking system would be best for your situation get in touch with Mo-Sys today.

Behind the scenes of Wimbledon’s brand new virtual interview studio

Graphics specialists MOOV created an impressive photo-realistic virtual set for post-match interviews featuring the likes of Roger Federer, Cori Gauff, Rafael Nadal and this year’s winner, Novak Djokovic. Situated in the media centre, the virtual set composites presenters and tennis players into a virtual interior inspired by the design of the historic club.

With limited space across the grounds, a virtual set offers a practical and flexible solution for the tennis coverage. MOOV chose Mo-Sys StarTracker because they needed a camera tracking system that is reliable and ready to go for their daily coverage. MOOV are also utilising The Future Group’s brand-new virtual production system, Pixotope, for the first time.

Nev Appleton, director and co-founder of MOOV, said: “It’s really important for MOOV to continue investing in the latest and best technologies, ensuring our clients have the right solution for their project. Choosing StarTracker was a no-brainer, as we found it to be an incredibly reliable and accurate tracking solution. Together with Pixotope, which is designed with creativity, speed and flexibility in mind, this is the perfect combination for our constantly changing needs.”

For the tracking, MOOV are utilising the Mo-Sys StarTracker, a robust, patented camera tracking system which uses small retro-reflective stickers attached to the ceiling. The sensor uses the position of the stickers to gather reliable and highly precise tracking data, enabling the studio cameras to be moved when overlaying augmented graphics or using a full virtual set.

Michael Geissler, CEO Mo-Sys said: “We are extremely excited to be interfacing our tracking and new lens calibration method to Pixotope for the first time. We look forward to being part of their upcoming projects and pushing boundaries with it.”

In addition, The Future Group’s subscription software Pixotope offers broadcasters and graphics teams a unique combination of performance and creativity, and is designed to streamline the virtual production process. It is the only system on the market that offers native Unreal rendering with WYSIWYG (‘what you see is what you get’) editing, providing the highest quality photo-realistic graphics that can be modified quickly.

“We’re delighted to be working with MOOV and Mo-Sys on this year’s Wimbledon coverage. Pixotope was designed for exactly these types of live ‘to air’ productions, where the customer demands photo-realistic virtual content, an ‘always live’ editing capability, and an operational interface built for speed. Pixotope is the natural result of a talented engineering team and an award-winning live events creative team coming together to produce a new type of virtual production system” says Halvor Vislie, CEO The Future Group.

StarTracker Serves Ace Augmented Reality at Wimbledon 2019

With broadcasters pushing boundaries with real-time augmented reality and virtual sets, it’s no exaggeration to say that sports coverage has dramatically improved over the past few years. To stay ahead of the game, sport and events graphics company MOOV have adopted the most advanced hardware and software to deliver immersive, highly engaging content that covers every angle for viewers.

For the live coverage of the Wimbledon 2019 tennis championships, the BBC and MOOV have created an impressive set for interviews and daily post-match analysis. Situated within the onsite broadcast centre, the set features a range of augmented reality graphics.

MOOV are using RT Software’s AR solution, based on a three-camera setup (two pedestals and one jib) which utilises the StarTracker camera tracking technology from Mo-Sys. This optical tracking system gathers reliable and highly precise tracking data, enabling studio cameras to be moved while overlaying augmented graphics.

For both Mo-Sys and real-time 3D broadcast graphics developer RT Software, this is the first time their augmented reality solutions have been used at Wimbledon. Extending the studio with AR graphics allows the BBC to enhance the viewer experience with high-impact graphics that would not be possible in a traditional studio. The AR graphics include virtual life-size cutouts of players on glass plinths in front of the presenters, and a virtual “Magic Window” looking out over a 3D render of a tennis court with results and match data laid onto the court surface.

As well as giving the appearance of perspective and parallax to the ‘Magic Window’, camera moves include a passing shot from the glass plinth to reveal the presenters behind. Each camera feed is passed through a single 3d-Live render engine (also from RT Software), which renders both the player cutouts and the Magic Window in real time. The render engines are controlled using WebControl, a browser-based application giving users remote control over the graphics from any position on the network.

Nev Appleton, director and co-founder of MOOV said: “We couldn’t be happier with the results that we have achieved using the 3d-Live render engine from RT Software, as well as the StarTracker camera tracking system from Mo-Sys. We are always looking to use the best software and hardware for Wimbledon and this year is no exception.”

BBC Sport Studio: StarTracker mounted on a jib with optical tracking

Michael Geissler, CEO Mo-Sys said: “With Wimbledon being such a key event in the UK’s sporting calendar, we are extremely excited to be working with MOOV and providing StarTracker for this year’s coverage”.

Mike Fredriksen, Commercial Director of RT Software, added: “RT Software is delighted to be working with the BBC at Wimbledon, providing the underlying graphics technology for the studio AR graphics. RT Software has long been associated with virtual studio and AR graphics at flagship BBC coverage including elections, Eurovision and key sporting events, but this is our first Wimbledon. Working with MOOV and Mo-Sys is always a pleasure and we are pleased the partnership yields such great-looking results.”

What is the difference between AR and VR Broadcasting?

Augmented reality (AR) and virtual reality (VR) are two of the most exciting up-and-coming technologies in broadcasting. They enhance our experience of the digital world by altering our perception of what is presented to us. However, the terms AR and VR are often mistaken for each other, with many unclear on the distinction between the two. In this article, we will simplify things and explain exactly what the difference is between AR and VR broadcasting.

What is AR broadcasting?

AR is when a layer of computer-generated imagery is overlaid onto objects or people present in the real world, rather than entirely replacing the background as VR does. In the world of broadcast, AR adds value to our viewing experience and helps us digest information in a more visual and increasingly interactive way.

One of the major technical benefits of AR is that it is simpler than VR. It does not require a green screen, specific lighting or keying; graphics are simply projected onto the real-world footage, whether you are shooting indoors or outdoors. Augmented reality can be challenging to get right, however. AR objects can clip through the floor or other objects, or float around the space – which is especially noticeable when AR interacts with the real world.

AR is becoming far more realistic thanks to improvements in real-time tracking technologies such as StarTracker. New technological developments are making AR broadcasting even more engaging for audiences, with the quality of news, election and live entertainment coverage improving constantly.

Examples of AR broadcasting

Here at Mo-Sys, we’ve worked with the likes of the BBC, ABC, ESPN and The Weather Channel, implementing our tracking technology to help them achieve high-quality AR broadcasting. For this year’s Oscars, E! used AR graphics to showcase the biggest talking points and highlights of the Academy Awards ceremony. Using our Mo-Sys bolt-on tracking kit, CGLA Studios implemented AR videos and suspended them over the pool at the Hollywood Roosevelt Hotel.

Check out the video below.

What is VR in broadcast?

Strictly speaking, the kind of virtual reality technology used in broadcasting is really ‘virtual studio technology’. Virtual studios actually predate augmented reality technology, but initially struggled because of how expensive and complex the technology can be. Virtual reality aims to replace our reality with a completely fabricated version (rather than simply adding to our reality like AR). Although virtual reality primarily relates to the use of a headset replacing what we see, and often what we hear, the term VR can relate to broadcasting too.

Whilst AR overlays objects or graphics onto the real world, VR broadcasting uses a green screen to combine two images or video streams, allowing the background behind the subject to be removed. The result for viewers at home is a presenter completely immersed in a virtual environment, enabling numerous creative opportunities and countless set changes at the flick of a switch.
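For the technically curious, here is a minimal chroma-key sketch in Python (assuming numpy and RGB frames as float arrays in the range 0 to 1). Real broadcast keyers add spill suppression and edge blending, and run on dedicated hardware, but the core idea is the same.

    # Minimal chroma-key sketch: replace green-dominant pixels of the
    # foreground frame with the corresponding background pixels.
    import numpy as np

    def chroma_key(foreground, background, threshold=0.3):
        r, g, b = foreground[..., 0], foreground[..., 1], foreground[..., 2]
        # Treat a pixel as green screen if green clearly exceeds red and blue.
        is_key = (g - np.maximum(r, b)) > threshold
        out = foreground.copy()
        out[is_key] = background[is_key]
        return out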

However, achieving virtual reality broadcasting requires many technical components to come together for a live production, with no room for error. Fortunately, our industry-proven StarTracker system is an incredibly reliable system that provides accurate tracking data for VR.

Examples of VR in broadcast

The Weather Channel demonstrated just how powerful the combination of AR and VR can be when showing the devastating effects of extreme weather. With our StarTracker technology, they were able to use AR and VR to simulate different weather events and ‘destroy’ their own studio on live TV. The result was an incredibly striking broadcast that engaged viewers worldwide.

As you can see at the beginning of the broadcast, The Weather Channel presenter Jim Cantore interacts with informational AR graphics and augmented objects such as cars, telephone poles and destroyed houses, all while standing in front of a large video wall. Later in the live broadcast, he is composited directly into the aftermath of the tornado using a fully virtual environment. Labelled ‘immersive mixed reality’, the entire broadcast, which utilised StarTracker, pushed the boundaries of real-time AR and VR for broadcast. The Weather Channel continues to use StarTracker for its immersive mixed-reality explainers every month. You can read more about the making of The Weather Channel’s AR/VR broadcasts here.

Here at Mo-Sys, we are looking forward to more broadcasters adopting AR technology to bring their shows to life, both from an informational and entertainment perspective. Find out more about our work on the cutting edge of camera motion systems by reading our blog or getting in touch with us today.

StarTracker elected for 2019 Spanish general election

As a political event guaranteed to go down in the history books as an emphatic night for Spain, we at Mo-Sys were thrilled to have been heavily involved in the coverage of the 2019 Spanish general election. Broadcasters Telemadrid and TVE used our well-established and recently patented StarTracker to track multiple cameras, enabling them to inform viewers using visually engaging augmented and virtual reality graphics.

TVE

Televisión Española (TVE) also laid on a special broadcast following the closure of the polling stations. More than 2.5 million viewers tuned into TVE’s Elections 28-A for immersive post-election voting analysis from Studio 1 at Prado del Rey.

In the special programme, presenters Carlos Franganillo and Ana Blanco, as well as students of journalism and political science, took part in an informative, dynamic show. Featuring an LED video wall of more than 40 metres and effective augmented reality graphics, viewers could see the Congress of Deputies recreated both outside and inside, with the results updated minute by minute.

For the production of the special, Vizrt teamed up with Mo-Sys. TVE had a five-camera set-up equipped with five StarTracker systems from Mo-Sys (supplied by Moncada y Lorenzo) that delivered the data to the Vizrt render engines. The five StarTrackers were mounted on two pedestals, one telescopic crane, one Steadicam and one Camrail, all within a studio measuring 33×24 m with a height of 8.5 m.

Telemadrid

For Telemadrid’s ambitious coverage, we helped them achieve one of the most striking productions of election night. Throughout the night, they alternated between classic sets, screens, augmented elements and a completely virtual environment in which the presenter was superimposed.

The augmented reality and embedding of graphic elements were a great achievement for Telemadrid, realised across a 500-square-metre set – something that had never been done before. Their post-production and graphics team, led by Antonio Tena, collaborated with video game specialists and experienced technicians to reconstruct the Congress Chamber of Deputies and representative 3D election maps. To do so, they needed the integration of state-of-the-art technology from Datos Media Technologies, Avid and Mo-Sys, with our StarTracker system supplied by Moncada y Lorenzo.

For more information about Mo-Sys’ StarTracker technology, don’t hesitate to get in touch with a member of our friendly team at info@mo-sys.com.

Mo-Sys tracking chosen for the Oscars

For this year’s Oscars, E! gave their coverage a cosmetic face-lift, debuting augmented reality for the first time in their 24-year reign as official red-carpet commentators.

E! brought in CGLA Studios, Full Mental Jacket and Mo-Sys to provide a variety of possibilities for augmented reality, enhancing the countdown and the post-ceremony shows. Throughout the coverage, floating AR videos were shown hovering over the pool at the Hollywood Roosevelt hotel, showcasing the biggest talking points and highlights of the Academy Awards ceremony.

To achieve this, CGLA Studios had a two-camera setup at the Hollywood Roosevelt Hotel, situated a block or so from the Dolby Theatre. One of the cameras used a Vinten 70 head with our Mo-Sys bolt-on encoding kit, while a crane position used our external CamMate kit for tracking. CGLA chose our encoding kits because they needed a reliable, travel-friendly option that guaranteed accurate tracking data.

The project was backed by Ross Video’s XPression powering the AR graphics, with Full Mental Jacket designing and operating the visuals, and virtual production support from Paul Lacombe of Unreel Pictures.

Chris Marsall, owner of CGLA Studios, said: “For E!’s coverage of this year’s Oscars I wanted to do something different and sold them on the idea of enhancing their broadcast with AR graphics during the Countdown show as well as the Post show. As this was E!’s first foray into AR graphics we needed a solid and reliable tracking system. We used two systems: the Mo-Sys Vinten V70 kit and the Mo-Sys crane encoding kit for tracking. They are both robust and travel-friendly options that guarantee accurate tracking data. Next year we will hopefully be using a full Unreal set-up with AR and you can be sure we’ll have a full complement of Mo-Sys encoders.”

Mo-Sys offer a variety of film remote systems, broadcast robotics and camera tracking equipment. For more information on the services on offer, please contact us at info@mo-sys.com.

ABC sweeps across USA with StarTracker

ABC News covered the 2018 US midterm election with a custom-made 360-degree stage — complete with an interactive, augmented-reality (AR) experience which utilized our real-time camera tracking system, StarTracker.

As the election results came in, ABC used AR with 3D images of the U.S. Capitol to give live voting updates. Thanks to the precision and robust quality of StarTracker, the presenter could be fully immersed within the House of Representatives and Senate, even with constant changes to the light in the studio.

“The goal was to create a set so that both the viewer and the presenters could visually understand a very important election,” said Hal Aronow-Theil, Creative Director at ABC News.

It took the ABC News team a whole year of planning, brainstorming, designing and programming to bring the augmented reality coverage together. Using our StarTracker and high-quality graphics from Vizrt, the ABC News coverage featured a large number of AR pieces throughout the evening, including real-time data visualization, maps and a 3D model of the U.S. Capitol.

“There’s great potential for creative, interactive storytelling to help the viewer better understand complex information,” said Tamar Gargle, ABC News Director of Graphics Operations.

This is not the first time StarTracker has been chosen for a live election night. The BBC have repeatedly selected our system for their coverage of the UK General Election and the EU Referendum, with Jeremy Vine famously walking through an augmented version of the House of Commons as live voting results were displayed.

ABC News have been experimenting with AR and our real-time camera tracking system for a while now. StarTracker was used to incorporate AR into their live coverage of the British royal wedding earlier this year. It was also used on ABC’s ‘Good Morning America’ programme for its first true AR medical story, about heart disease, this past spring.

“Where we would normally cut away to a 3D animation of clogged arteries, Dr. Jen Ashton was able to interact in the studio with a three dimensional AR heart to help the viewer better understand heart disease,” according to Aronow-Theil and Gargle.

Gravity celebrates its 5th Birthday

The boundary-pushing, award-winning sci-fi epic Gravity was released five years ago this November. To celebrate its birthday, we have put together an in-depth case study with insights from the crew, a breakdown of the VFX filmmaking from Framestore and a behind-the-scenes video of our specially designed Mo-Sys Lambda in action.

Gravity cinematographer Emmanuel ‘Chivo’ Lubezki can appreciate the advantages of our remote heads, reliability and robustness being the two major keys for high-end film productions. After initial tests in San Francisco, Chivo and his team were convinced that the Lambda, with its three axes, high precision and zero backlash, was the perfect match for the Bot&Dolly – a high-precision robotic motion control rig. Following the success of the custom-made Mo-Sys Lambda head for Gravity, we developed our current, industry-standard L40, featuring a lightweight two-axis design with a back-pan option and a roll axis too.

Almost every shot was made with robotic camera heads from Mo-Sys. We used four Mo-Sys heads attached to industrial robots and cranes. For filming in the lightbox we needed a head that reduced shadows on the actors’ faces. Mo-Sys came up with clever ideas and their reliability and precision were essential for this production.

– Emmanuel Lubezki, DoP for Birdman, Gravity 

With its ultralight design, high-standard performance and all the features of the Lambda head, the Gravity head was ideal for the special task and ready to meet Sandra Bullock and George Clooney. Equipped with two customised heads and two regular Lambdas, the Gravity team began filming their technically ambitious project in March 2011. In 2013, Gravity was released to wide critical acclaim, picking up an incredible seven Academy Awards at the 2014 Oscars.

How did the Gravity crew and Mo-Sys work together?

The story of two astronauts marooned in space is told with the camera. “Generally the camera would be moving instead of the person,” says Tim Webber, the VFX supervisor, who also worked on Avatar. The camera floats around the characters and changes from breathtaking wide shots of space to intense close dialogue shots and back again, often within a few seconds.

This kind of camera work required a system that was able to move the camera freely around the actors without putting them in uncomfortable positions. The complex moves were planned out, choreographed and recorded inside Autodesk Maya and later during production played back by the Bot&Dolly robot. “It’s always difficult to translate these virtual moves into the real world,” explains Olli Kellmann, motion control operator. “These things have to be tested, and there are many factors that could mess up a move on set that have no impact on the virtual camera inside the CG environment. Things like the strength of a motor, and how the acceleration and the gravitational force affect the rig.”

To test the waters, Olli and his colleague, Raul Rodriguez, started playing the moves back at 10% of their actual speed and then gradually pushed them up to the full 100%. The slow-motion playback gave them time to tidy cables that were in the way and make adjustments when the head reached its limit, which according to Olli was rarely the case – another testament to the strong performance of the Lambda head.
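Conceptually, that slow-motion testing is just a stretch of the time axis: the keyframed path is untouched and only the schedule changes. A tiny Python sketch of the idea (illustrative only; real motion-control systems handle this in their controllers):

    # Sketch: play a pre-recorded move at a fraction of its real speed by
    # stretching keyframe times; the spatial path is left unchanged.
    def scale_move(keyframes, speed_fraction):
        """keyframes: list of (time_seconds, axis_positions) pairs."""
        assert 0.0 < speed_fraction <= 1.0
        return [(t / speed_fraction, pos) for t, pos in keyframes]

    move = [(0.0, {'pan': 0.0}), (2.0, {'pan': 45.0})]
    print(scale_move(move, 0.1))  # same path, ten times slower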

In addition, the Mo-Sys head gave Chivo the option to load and play back the pre-designed moves. Although these pre-recorded moves were a crucial part, it was also of great value to have enough flexibility during shooting. “You need to be able to respond to what they want out of a shot,” says Webber. Reframing a scene, refining camera moves and reacting to the timing of Sandra Bullock and George Clooney was only possible with the live-operated Gravity head.

StarTrackerVR empowers Location-Based Entertainment

In a recent partnership with HP, and utilising the power of UE4, we’ve further developed our VR tracking technology, empowering location-based entertainment (LBE), gaming, engineering, training and architecture visualisation.

The enterprise-level system, called StarTrackerVR, is an inside-out tracking solution like no other. Thanks to our patented optical tracking method, it is not susceptible to occlusion or drift. Using reflective stickers stuck to the ceiling, an upwards-looking sensor known as StarTracker is mounted onto the headset, calculating its position in real time using the stickers as a reference.

This cost-effective method is totally resilient to changes in light and enables wall-to-wall, multi-level tracking across the largest of environments; even when objects are partially obstructing the StarTracker sensor. 

Mo-Sys founder and owner Michael Geissler commented: “The strength, precision and reliability of our tracking really shines through when you can cycle in VR. StarTrackerVR truly delivers unlimited freedom of movement with the power to interact with moving objects or navigate around static obstacles.”

Most recently, British motorcycle giant Triumph chose StarTrackerVR to show a top-secret prototype bike to their most privileged dealers, since no other system could provide robust tracking and prevent occlusion issues when multiple users stand close together within the same scene. With around 1,200 visitors attending the event, 14 StarTrackerVR systems allowed Triumph to process up to 200 visitors every hour, meaning everyone could see their top-secret concept.

To achieve this, Mo-Sys mounted StarTracker sensors to HP’s Windows Mixed Reality headsets, enabling users to get lost in ultra-sharp visuals with 1440×1440 resolution per eye. In addition, HP Z VR Backpack PCs, with their 32GB of dual-channel DDR4 system memory, were crucial to the project. Their ability to run steadily under heavy VR data loads provided seamless frame-to-frame transitions.

IntelliGO created the virtual scene using UE4, which gave it the proper sense of scale necessary to create close-to-photo-realistic, believable worlds. The power of UE4 Blueprints made it possible to implement the multi-user functionality, processing the StarTracker tracking input quickly and in a user-friendly fashion.

Furthermore, StarTrackerVR has been applied in other areas too: military training, and immersive multiplayer experiences such as Dark Realities’ interactive free-roaming experience, which will soon open in Birmingham, UK.

The Weather Channel chooses StarTracker for Live AR Broadcasts

Congratulations to The Weather Channel and The Future Group for pushing the visual boundaries of broadcasting. Using StarTracker for all four of their videos, they were able to use mixed reality and dramatic informational graphics to elevate their extreme weather reporting. The result was an immersive, three-dimensional demonstration that raised awareness of the most perilous types of weather – hurricanes, tornadoes, floods and wildfires.

For those interested in how The Weather Channel and The Future Group made their brilliant videos, we have put together a behind-the-scenes case study with a presentation from The Future Group’s Senior Software Engineer, Justin LaBroad. Simply leave your email and we will send it to you. In the meantime, check out their series of explainer videos below.


1. A Tornado Hits The Weather Channel – aired on 20th June 2018
Jim Cantore demonstrates a treacherous tornado using mixed-reality graphics.


2. Bringing you closer to lightning than ever before – aired on 1st August 2018
Mike Bettes avoids a lightning strike in this instalment of The Weather Channel’s Immersive Mixed Reality experience.


3. Storm Surge Like You’ve Never Experienced It Before – aired on 21st September 2018
This was the AR forecast for the Carolina coast as Hurricane Florence’s storm surge approached.


4. How Wildfires Spread – aired on 18th October 2018
Take a look at the dangers of wildfires with this explosive mixed-reality explainer.


For more information on these explainers, please see our case study on The Future Group, showing how they used StarTracker and created the astonishingly detailed AR graphics for The Weather Channel.

Click here for ‘The Making of The Weather Channel and Future Group’s AR Broadcasts’