Strictly Augmented Reality

It is one of the greatest global television hits of the last 15 years. First seen as Strictly Come Dancing, produced by the BBC, it is now licensed around the world as Dancing with the Stars or its local-language equivalent, and is massively popular.

Strictly Come Dancing Mo-Sys

In the UK, Strictly is the most popular entertainment programme on television. For the 2020 season, the producers wanted to add even more glitz and glamour, and were considering upping the graphics game even before the full extent of the Covid-19 pandemic meant the production plan had to be completely rethought.

Graphics specialist company Potion Pictures was already a part of the Strictly production team, building excitement through the screens which are part of the set, and through the dramatic floor projections. Without the usual enthusiastic studio audience, it was clear that the graphics were going to play an even bigger part than usual in building the atmosphere.

Because the programme is broadcast live, there is a very short period between each dance in which to put specific furniture and props in place. Social distancing requirements meant a smaller crew on the studio floor, so physical props had to be cut to an absolute minimum. The producers turned to augmented reality (AR) to make the scenes sparkle.

From steam trains to elephants, the 2020 series of Strictly Come Dancing went all out to provide the visual wow factor and make up for the lack of a live audience. In this case study session for MPTS, Potion Pictures explained how they delivered their most exciting and extravagant graphics to date to bring the series to life using Mo-Sys StarTracker and VP Pro.

Mo-Sys had previously worked with Potion Pictures on a POC (proof-of-concept) in another studio at BBC Elstree. This gave the creative and technical team the opportunity to see exactly what could be achieved with live, real-time AR virtual graphics – and convinced them of what was possible.

There are layers of complexity in adding AR to Strictly. First, it is primarily a dance competition, so the AR could not get in the way: the digital objects had to move to match the choreography. More importantly, nothing could obscure the dancers: if the voting audience could not see what the partners were doing, it could have serious consequences for the fairness of the judging.

Second, the show was already full of shine and sparkle, and nothing could affect that. If the video processing to composite real and virtual elements together degraded the live pictures, that too would be unacceptable.

Third, whatever Mo-Sys did had to be in absolute real time, with the minimum of latency. The show depends on music, and the audience would not tolerate pictures out of time with the sound.

The system built for Strictly used the Mo-Sys VP Pro software, designed to integrate directly into Epic Games’ Unreal Engine editor interface. This system provided the real-time compositing and 3D graphics rendering of Potion Pictures’ creative vision.

Camera tracking was provided using Mo-Sys’ StarTracker 6-axis tracking system. The augmented reality images were linked to two of the live cameras: one on a crane, the other on a Steadicam. These are the most challenging camera mounts, which added to the excitement.
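
As a rough illustration of what a tracking system like StarTracker has to deliver for every frame, the sketch below models a six-axis camera pose (position plus pan, tilt and roll) together with per-frame lens data, and builds the view matrix a render engine would apply to its virtual camera. The field names, rotation convention and units are assumptions made for illustration only; they are not the actual StarTracker or VP Pro data format.

    # Minimal sketch of per-frame camera tracking data for AR compositing.
    # Field names and rotation convention are illustrative assumptions, not
    # the real StarTracker/VP Pro protocol.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class TrackedFrame:
        x: float
        y: float
        z: float                # camera position in metres
        pan: float
        tilt: float
        roll: float             # rotation in degrees (the six tracked axes)
        focal_length_mm: float  # from the lens encoders
        timecode: str

    def rotation_matrix(pan, tilt, roll):
        """Compose a rotation from pan (yaw), tilt (pitch) and roll, in degrees."""
        p, t, r = np.radians([pan, tilt, roll])
        Rz = np.array([[np.cos(r), -np.sin(r), 0],
                       [np.sin(r),  np.cos(r), 0],
                       [0,          0,         1]])
        Rx = np.array([[1, 0,          0],
                       [0, np.cos(t), -np.sin(t)],
                       [0, np.sin(t),  np.cos(t)]])
        Ry = np.array([[ np.cos(p), 0, np.sin(p)],
                       [ 0,         1, 0],
                       [-np.sin(p), 0, np.cos(p)]])
        return Ry @ Rx @ Rz

    def view_matrix(frame):
        """World-to-camera matrix a render engine would apply to its virtual camera."""
        R = rotation_matrix(frame.pan, frame.tilt, frame.roll)
        t = np.array([frame.x, frame.y, frame.z])
        V = np.eye(4)
        V[:3, :3] = R.T
        V[:3, 3] = -R.T @ t
        return V

    frame = TrackedFrame(x=2.0, y=1.6, z=-4.0, pan=15.0, tilt=-3.0, roll=0.0,
                         focal_length_mm=35.0, timecode="10:00:00:01")
    print(view_matrix(frame))

On a crane or Steadicam this data has to arrive, noise-free, for every single frame, which is what makes those mounts the most demanding hosts for AR.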

The production decided to use AR to introduce elements at the start and end of each dance. A whole range of virtual graphics was created for the series, from an ice castle and a Barbie house to a train and – bizarrely – an elephant.

For augmented reality to be completely convincing there has to be no visible join between the real and virtual worlds. The graphics have to be rooted firmly in position. Strictly was a very testing environment for the Mo-Sys technology: the combination of a dark studio ceiling (where the camera tracking markers are located), rapidly changing lighting and glitterball reflections meant that this was probably the sternest possible test of tracking accuracy and integrity. It was agreed that there were three key requirements if the virtual graphics illusion was going to work and be maintained:

  • ultra-precise camera tracking on all 6-axes of movement
  • extremely accurate lens calibration, so the virtual graphics are distorted to exactly match the camera lens attributes, particularly during zoom shots (a sketch of this kind of distortion matching follows this list)
  • excellent synchronisation – the camera frequently moves quickly to keep the dancers perfectly framed, and the virtual graphics had to keep up
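
To give a sense of the second requirement, the sketch below applies a simple Brown-Conrady radial distortion to ideal (pinhole) image points – one common way a virtual render can be warped to match a calibrated real lens as it zooms. The coefficient values are invented for illustration and have nothing to do with Strictly's actual calibration data.

    # Sketch: warping ideal image points with Brown-Conrady radial distortion
    # so virtual graphics line up with a calibrated real lens.
    # Coefficients below are illustrative, not real calibration values.
    import numpy as np

    def distort(points, k1, k2):
        """Apply Brown-Conrady radial distortion to normalised image points (N x 2)."""
        p = np.asarray(points, dtype=float)
        r2 = np.sum(p ** 2, axis=1, keepdims=True)
        return p * (1.0 + k1 * r2 + k2 * r2 ** 2)

    # A 3 x 3 grid of ideal points; with the illustrative negative k1 they are
    # pulled towards the centre, and the virtual render must be bent in exactly
    # the same way to stay locked to the real set as the lens zooms.
    grid = np.array([[x, y] for x in (-0.5, 0.0, 0.5) for y in (-0.5, 0.0, 0.5)])
    print(distort(grid, k1=-0.12, k2=0.01))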

Mo-Sys delivered on all three. Indeed, it was the synchronisation above all else that made the ambition achievable – without it, these results could not have been delivered.

Mo-Sys Technical Director James Uren supervised the system over the three studio days for every episode. That gave the company a remarkable insight into how real, creative production people want to use Mo-Sys equipment, pushing it to the limit. It meant continually refining what could be done, particularly in association with Unreal Engine.

The ice castle featured some incredible refractions, which drove the development of ever-better translucency in particle effects. The models also had dynamic shadows, again to make them sit naturally in the live studio. These and other continuing improvements came from a growing understanding of exactly what Strictly was trying to achieve, and from working with Epic Games to make even better use of the very latest Unreal Engine features.

Besides the AR graphics, Potion Pictures also used a second type of tracked real-time graphics enhancement. In previous series of Strictly, viewers would have seen the floor displaying a ‘drop-away’ 3D graphic of the top of a building, on which the dancers would appear to perform their routines. However, the camera never moved, and the 3D visual impression was more symbolic than photo-realistic. This series, however, used an off-axis tracked camera, again utilising the Mo-Sys StarTracker system. This time the camera could move, giving a very realistic impression that the dancers were performing on top of a building.
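
The principle behind a tracked ‘drop-away’ floor is off-axis (generalised) perspective projection: the scene is rendered from the camera's tracked position and projected onto the physical floor rectangle, so the illusion stays locked to the floor as the camera moves. The sketch below computes the asymmetric view frustum for an arbitrary eye position and projection surface using the standard generalised projection construction; the corner coordinates and camera position are invented for illustration and do not describe the actual Strictly setup.

    # Sketch of a generalised (off-axis) perspective projection: given the
    # tracked camera position and the corners of the physical surface used as
    # a "screen" (here, the studio floor), compute the asymmetric frustum that
    # keeps the rendered illusion locked to that surface.
    import numpy as np

    def off_axis_frustum(eye, lower_left, lower_right, upper_left, near=0.1):
        """Return (left, right, bottom, top) frustum extents at the near plane."""
        # Orthonormal basis of the projection surface.
        vr = lower_right - lower_left; vr /= np.linalg.norm(vr)   # surface "right"
        vu = upper_left - lower_left;  vu /= np.linalg.norm(vu)   # surface "up"
        vn = np.cross(vr, vu);         vn /= np.linalg.norm(vn)   # normal, towards the eye

        # Vectors from the eye to the surface corners.
        va = lower_left - eye
        vb = lower_right - eye
        vc = upper_left - eye

        d = -np.dot(va, vn)            # distance from the eye to the surface plane
        left   = np.dot(vr, va) * near / d
        right  = np.dot(vr, vb) * near / d
        bottom = np.dot(vu, va) * near / d
        top    = np.dot(vu, vc) * near / d
        return left, right, bottom, top

    # Illustrative numbers only: a 6 m x 6 m floor, corners labelled so the
    # surface normal points up towards a camera 3 m above and off to one side.
    eye = np.array([1.5, 3.0, 4.0])
    print(off_axis_frustum(eye,
                           lower_left=np.array([-3.0, 0.0, 3.0]),
                           lower_right=np.array([3.0, 0.0, 3.0]),
                           upper_left=np.array([-3.0, 0.0, -3.0])))

The four returned extents are what an engine's off-axis projection expects, recomputed every frame from the tracked camera position so the floor graphic never slips.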

This series of Strictly produced several key technological ‘firsts’ for virtual production use in a live broadcast:

  • The use of Epic Games’ Unreal Engine natively to produce the real-time graphics
  • The use of Mo-Sys’ VP Pro plug-in software to deliver the real-time compositing and synchronisation
  • The use of Unreal Engine’s image distortion features to show the real studio distorted through ice AR graphics

It is fair to say that AR on Strictly was a learning experience for everyone. But there is no doubt that it added yet more glamour and visual interest to this iconic programme, at a surprisingly low cost: for a minimal addition to the programme budget, AR added a huge amount to the visual richness.

Garden Studios doubles its square footage to meet demand

New site now includes a virtual production studio powered by Unreal Engine

Garden Studios has announced that it has expanded by a further 65,000 sq ft, providing a total of 127,000 sq ft of studio space – more than double the original site.

Now known as The Orchid Production Village, the new site will be available from next month and comprises eleven serviced offices, two shooting spaces (totalling 40,000 sq ft) and a 16,000 sq ft workshop. This is in addition to the existing three soundstages, three shooting spaces, four large fully equipped production offices and workshop units.

The new creative hub offers some of the best facilities for shooting a feature film, television production, short-form promo, video game, commercial or music video. Located in Park Royal, west London, just seven miles from Soho and a short cycle ride for the many creative West Londoners, the facility provides a convenient and creative haven for international talent.

Garden Studios has also announced that its virtual production stage is now fully operational and open for bookings. Powered by Unreal Engine, the world-class 4,800 sq ft stage offers a flexible, safe and socially distanced shooting environment, and a sustainable, cost-effective filming option that opens up enormous creative possibilities.

Virtual production allows filmmakers to shoot live-action footage within a pre-visualised virtual world. 3D modelling and camera tracking place actors and props in virtual settings, removing the need to apply CGI in post-production. Real-time editing means environments and locations can be customised at the touch of a button.
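
As a conceptual sketch only, the loop below shows how those pieces relate on a tracked stage: the tracked camera pose drives the render of the virtual environment each frame, and the live foreground is combined with it (optically on an LED wall, or digitally with a key). Every function here is a placeholder standing in for the real tracker, engine and video I/O, not any specific product's API.

    # Conceptual per-frame loop for a tracked virtual production stage.
    # All functions are placeholders, included only to show how the pieces relate.
    import numpy as np

    def read_tracked_pose():
        """Placeholder: per-frame camera position, rotation and lens data from the tracker."""
        return {"position": np.zeros(3), "rotation": np.zeros(3), "focal_mm": 32.0}

    def render_virtual_environment(pose, resolution=(1920, 1080)):
        """Placeholder: the game engine renders the virtual set from the tracked pose."""
        return np.zeros((resolution[1], resolution[0], 3), dtype=np.uint8)

    def capture_live_foreground(resolution=(1920, 1080)):
        """Placeholder: the live camera frame of the performers, with a key (alpha) channel."""
        return np.zeros((resolution[1], resolution[0], 4), dtype=np.uint8)

    def composite(foreground_rgba, background_rgb):
        """Key the live foreground over the rendered background (straight alpha)."""
        alpha = foreground_rgba[..., 3:4] / 255.0
        return (foreground_rgba[..., :3] * alpha + background_rgb * (1 - alpha)).astype(np.uint8)

    # One frame of the loop: tracked pose -> virtual render -> combined output.
    # On an LED-wall stage the combination happens optically in camera instead,
    # but the tracked pose still drives the render every frame.
    pose = read_tracked_pose()
    output = composite(capture_live_foreground(), render_virtual_environment(pose))
    print(output.shape)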

Garden Studios has collaborated with industry experts including:

  • Mo-Sys Engineering: An advanced image robotics and virtual production technology company
  • Brompton Technology: Manufacturers & suppliers of market-leading and award-winning Tessera LED processors for LED panels and video walls
  • ProCam Take 2: Europe’s leading digital cinematography hire facility
  • Quite Brilliant: A multi-discipline agency specialising in virtual production
  • ROE Visual Europe: An award-winning LED display technology manufacturer

Garden Studios is now ready to welcome clients and accommodate all film, television and virtual production bookings. The doors are open.

Studio Manager, Marnie Keeling, said: “We are excited to expand our site, this enables us to accommodate a wider scale of clients, providing options from shooting spaces, workshop and production offices through to soundstages. Garden Studios intend to create impact across the industry through innovation, technology and world class sound stages. In addition to this our virtual production stage allows you to shoot anywhere in the world all from the comfort of our studio. We’re thrilled to support future, traditional and virtual filmmaking.”

Founder and CEO of Garden Studios, Thomas Hoegh, said: “The UK’s increased popularity as a filming destination means that expanding our studio capacity is now more essential than ever. We can accommodate all size productions across our 127,000 sq ft of sound stages, workshops and fully equipped production offices. While the virtual production stage allows you to shoot anywhere in the world all from the comfort of our London studio. We are looking forward to welcoming productions and seeing the innovative creations they produce.”

Mark Pilborough-Skinner, Virtual Production Supervisor, said: “Virtual production is the natural successor to greenscreen and a sure staple of the film industry for years to come. We have worked incredibly hard – with some of the best industry minds – to develop a studio with impeccable workflow and infinite possibilities.”

Racing TV migrates to a new virtual studio

Racecourse Media Group (RMG) has unveiled a new, high-end virtual studio, set up with the help of Timeline TV and MOOV, which is destined to take horseracing TV coverage to the next level.

Supplied by Timeline TV, Mo-Sys StarTracker systems are in use in the new Racing TV virtual studio. Making a small studio appear to be a large studio is one of the many advantages of virtual production. StarTracker’s sensor has a wide field of view, which enables it to cope with challenging camera angles in small spaces and, of course, in larger spaces too.

Racing TV’s Rachel Casey presents from the new Racing TV studio

Built at the home of the Racing TV channel, at Ealing Studios, West London, the studio has been created in collaboration with Timeline TV, which has been RMG’s technical managed services provider since 2012, and MOOV, industry leaders in broadcast graphics and virtual studios.

It uses virtual studio technology from Brainstorm InfinitySet and Unreal Engine, the latter originating from the gaming industry.

Although the physical space is relatively small (6m x 6m), the studio appears expansive in its virtual guise, with four operational areas. The Mo-Sys StarTracker system enables extra flexibility in camera angles and in the types of cameras, even jibs, that can be used in the studio.

The overriding objective of the new studio is to take viewers as close to the racecourse action as possible, without physically being there. The new studio will work as a hub by bringing in all of Racing TV’s race-day feeds, with reporters at every course, creating a ‘watch together’ racing community experience.

The studio has four specific zones:

  • A main presentation desk for a panel of up to three guests per live programme
  • A stand-up position with augmented reality big screen, to integrate data and analysis into panel discussions
  • Timing information data area for analytical shows such as The Verdict
  • A panoramic area for festival coverage and the new home for shows such as Luck on Sunday

Adam Binns, Director of Broadcast and Production at RMG, said: “The studio will enable Racing TV to deliver immersive and interactive horseracing presentation coverage to our audiences.

“The virtual and augmented reality environment helps us to keep the racing fraternity closely linked on an ongoing basis and will be the perfect platform for video-conferencing and other remote tools, which are so vital now.

“We are very grateful for all the support and expertise which Timeline TV and MOOV have contributed to the project.”

Photo credit: Tom Aizenberg

David Harnett, Head of Operations at Timeline commented: “This has been a hugely collaborative project between RMG, MOOV and Timeline TV. Racing TV required a studio that would be flexible and adaptable to allow them to create different sets for a variety of their horseracing shows.

“We already work alongside MOOV at BT Sport’s Broadcast Centre, so we had no hesitation in collaborating with them and relied on their expertise and knowledge to deliver a truly immersive and innovative virtual reality studio.”

Commenting on the new virtual studio, Nev Appleton, co-founder of MOOV, said: “As we enter our 21st year, we are delighted to have worked with the Timeline team to create and deliver such an exciting and innovative virtual studio for Racing TV, which will further enrich the experience for all racing fans.

“Racecourse Media Group are focused on expanding the multi-platform presence of horse racing to help grow the sport in the UK and Ireland and this is something the MOOV and Timeline team are passionate about supporting.”

Racing TV broadcasts coverage of racing from 35 British and 26 Irish racecourses and is available in the UK on Sky, Virgin and Apple TV.

HPA Tech Retreat 2021

Mo-Sys to Introduce a Near-Time Virtual Production Workflow at HPA Tech Retreat 2021

15 – 24 March 2021

Mo-Sys Engineering is proud to be a gold sponsor of the inaugural 2021 virtual HPA Tech Retreat (15 – 24 March) and is looking forward to being part of this year’s HPA sessions, hosting a thought leadership presentation on near-time virtual production workflow.

“The HPA Tech Retreat is widely recognized as a prestigious occasion and a great opportunity for the most innovative professionals in the industry to come together and have open and collaborative conversations,” said Michael Geissler, CEO of Mo-Sys. “We’re very excited to be taking such a prominent role in this event, especially at a time of rapid changes and turmoil when we’re seeing an increase in creative solutions.”

Today virtual production (VP) workflows are clearly split into two camps: real-time (on-set finishing, final pixel, in-camera VFX) and non-real-time (on-set previz followed by post-production compositing). Real-time workflows impose a graphics quality ceiling in order to maintain real-time playback frame rates, which works well for most scenarios but not so well when fine detail is involved. Non-real-time workflows provide an increase in graphics quality, but at the expense of significant increases in time and cost. Mo-Sys will be presenting a new dual-pipeline virtual production workflow, one that combines a real-time on-set VP workflow with a near-time VP workflow running in parallel, and will be explaining how it delivers higher-quality graphics without the delay or cost of a traditional post-production workflow.
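
Mo-Sys has not published the implementation details of the dual pipeline, but the underlying pattern can be sketched: the real-time branch renders at whatever quality holds frame rate, while every frame's tracking and lens metadata is journaled so the same shots can be re-rendered at higher quality in near-time, without a conventional post-production pass. The code below is a hypothetical illustration of that record-and-re-render pattern, not the VP Pro workflow itself.

    # Hypothetical sketch of a dual real-time / near-time pipeline: the live
    # branch renders at a quality that holds frame rate, while per-frame
    # tracking metadata is journaled so a higher-quality render of the same
    # frames can run shortly afterwards.
    import json
    import queue

    tracking_log = queue.Queue()

    def realtime_branch(frames):
        for frame in frames:
            # Live branch: low-latency render/composite at broadcast frame rate.
            preview = {"timecode": frame["timecode"], "quality": "realtime"}
            tracking_log.put(frame)          # journal the metadata as we go
            yield preview

    def neartime_branch(log_path="tracking_metadata.json"):
        # Near-time branch: drain the journal and re-render with higher settings.
        recorded = []
        while not tracking_log.empty():
            recorded.append(tracking_log.get())
        with open(log_path, "w") as f:
            json.dump(recorded, f)           # hand-off to the high-quality renderer
        return [{"timecode": r["timecode"], "quality": "near-time"} for r in recorded]

    frames = [{"timecode": f"10:00:00:{i:02d}", "pan": i * 0.5, "focal_mm": 35.0}
              for i in range(5)]
    previews = list(realtime_branch(frames))
    finals = neartime_branch()
    print(previews[0], finals[0])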

Additionally, Mo-Sys will be leading breakfast and lunch roundtable sessions on automating virtual production to increase graphics quality and on achieving enhanced resolution on LED walls without heavy hardware investment, and will be joining AWS to discuss virtual production in the cloud.

Full information on the event is available on the HPA Tech Retreat website.

Mo-Sys Launches StarTracker Sports Studio

A unique integrated real-time graphics solution for sports presentation and production

Mo-Sys Engineering and HYPER Studios, leaders in cloud broadcast graphics, have created the first virtual production system with data-driven sports graphics.

StarTracker Sports Studio combines a complete virtual production system – based on Epic Games’ Unreal Engine and capable of generating moving-camera virtual studios, augmented reality (AR) and extended reality (xR) – with a state-of-the-art HTML5 sports graphics system.

An ever-increasing number of sports broadcasters use virtual studios and AR alongside data-driven graphics, with the two graphic types generated by separate systems. However, rather than simply keying standard overlay graphics on top of a virtual set, sports broadcasters are now asking to integrate data-driven graphics inside the virtual studio; for example, making them appear inside a virtual monitor, or synchronising them to an AR animation where they need to match the style and branding of the AR virtual graphics.

StarTracker Sports Studio enables in-context design of photo-realistic virtual graphics with data-driven sports graphics, simplifying the creation and operation of this new approach to virtual sports graphic presentation.
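
To make the ‘data-driven’ part concrete, the sketch below shows the general idea of an HTML5 graphics template being filled from live match data, producing a fragment that a virtual studio could then render to a texture on an in-scene monitor. The template, data source and field names are invented for illustration; this is not the HYPER Studios or StarTracker Sports Studio API.

    # Hypothetical illustration of data-driven graphics feeding a virtual set:
    # live match data fills an HTML template, which a virtual studio could then
    # render to a texture on an in-scene "monitor". Template and field names
    # are invented for this sketch.
    from string import Template

    SCOREBOARD = Template("""
    <div class="scoreboard">
      <span class="team">$home</span> <span class="score">$home_goals</span>
      -
      <span class="score">$away_goals</span> <span class="team">$away</span>
    </div>
    """)

    def build_graphic(match_data):
        """Produce the HTML fragment to be rendered onto the virtual monitor."""
        return SCOREBOARD.substitute(match_data)

    live_update = {"home": "Home FC", "away": "Away United",
                   "home_goals": 2, "away_goals": 1}
    print(build_graphic(live_update))

Keeping the graphic as a template driven by the live data feed is what lets it match the style and branding of the surrounding AR graphics while updating in real time.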

“We have engaged with many sports broadcasters in order to capture the full spectrum of graphics functionality and workflows that they need now and going forwards, in order to cover major events,” said Michael Geissler, CEO of Mo-Sys. “Working with HYPER Studios, we have designed a system that allows a sports broadcaster to create integrated and in-context virtual graphics content, and as a result produce more engaging and differentiated content for their viewers.”

Mo-Sys will launch StarTracker Sports Studio at the SVG Europe Sports Graphics Summit on 5 March. It will be available to order from 2 April 2021.