
FAQs

With a large and growing community of Mo-Sys users, we are always supporting several customers and their projects at any one time. Whether you are already using one of our systems or thinking about purchasing one, please see our FAQs below.

Camera Tracking

If you can’t find what you are looking for please get in touch using the contact form at the bottom of the page or via email: info@mo-sys.com.

What is camera tracking?
Precise and reliable camera tracking is an important element when creating realistic virtual studio and augmented reality productions. It is a process that tracks the movement of the camera, enabling virtual graphics to be implemented in a convincing way. Mo-Sys offers a wide variety of camera tracking solutions, from optical to mechanical tracking, meeting any production need or budget.
What is the difference between optical and mechanical tracking?
Whilst both options are very effective, their methods are very different. Optical tracking such as StarTracker relies on an additional camera sensor, a processing unit and reflective stickers to calculate the absolute position of the main camera. This method gives you much more freedom to operate the cameras anywhere in the studio, but the tracking data contains some noise. Mechanical tracking systems, on the other hand, are based around measuring sensors or encoders attached to a crane, camera head or pedestal. These encoders give angular and positional measurements that allow an operator to accurately track the position of the camera. For more information, please read more here.
Should I choose optical or mechanical tracking?
This depends entirely on what you are trying to achieve and the environment that you are recording in. Optical tracking enables more freedom of movement when shooting in a studio, whereas mechanical tracking gives you more precise tracking but requires more homing and is more suitable for outdoor broadcasting. Mo-Sys offers a wide variety of camera tracking solutions, from optical to mechanical tracking, meeting any production need or budget. For more information, please read more here.
What is StarTracker?
Our patented StarTracker system is an optical tracking system that records the absolute position of a camera, allowing you to track its movements in real time. Using this precise tracking data, broadcasters and filmmakers can overlay virtual or augmented graphics whilst using moving cameras.
What size of reflective stickers do I need?
Again, this depends on the environment you are in, as the size of the stickers required relates to the height of the studio. If you are recording in a studio with a low ceiling, such as 3m, we recommend using 20mm or 25mm stickers. If you have a much higher studio ceiling, such as 10m, we recommend using 100mm stickers. Our sticker sizes include 20mm, 25mm, 35mm, 50mm, 75mm and 100mm.
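As a rough illustration, this guidance can be expressed as a simple interpolation. The sketch below is hypothetical, built only from the two data points given here (roughly 3m ceiling → 20mm stickers, 10m ceiling → 100mm stickers); it is not an official Mo-Sys sizing rule, so always confirm with Mo-Sys support.

```python
# Hypothetical sticker-size picker, interpolating between the two examples
# in this FAQ (~3 m -> 20 mm, ~10 m -> 100 mm). Not an official sizing rule.
STICKER_SIZES_MM = (20, 25, 35, 50, 75, 100)

def recommended_sticker_mm(ceiling_height_m: float) -> int:
    """Return the smallest available sticker size for a given ceiling height."""
    if ceiling_height_m <= 3:
        return 20
    if ceiling_height_m >= 10:
        return 100
    # Linear interpolation between the FAQ's two data points (assumed).
    target = 20 + (ceiling_height_m - 3) * (100 - 20) / (10 - 3)
    return min(s for s in STICKER_SIZES_MM if s >= target)

print(recommended_sticker_mm(3))    # -> 20
print(recommended_sticker_mm(6.5))  # -> 75
print(recommended_sticker_mm(10))   # -> 100
```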
Will StarTracker affect my studio lighting?
Despite being optical, StarTracker is unaffected by studio lighting, giving you complete freedom to install and adjust lights as needed. In addition, as the "stars" can be applied randomly, in no particular pattern, above the lighting grid, they do not restrict the studio lights in any way.
Do I need lens calibration?
Yes, when making cutting-edge live augmented reality and virtual studio productions, the alignment between the camera, the lens and the tracking solution is vital. Calculating the distortions, imperfections and characteristics of a camera lens is an important aspect when adding virtual elements into the picture. The lens calibration process prevents virtual graphics from drifting in relation to the real world.
How does Mo-Sys lens calibration work?
Our simple-to-learn lens calibration captures and corrects lens distortion by simply pointing at known objects, and automatically calibrates all linear and angular offsets of the lens. This generates a reliable lens file which can be tweaked for future calibrations of the same lens. This highly precise, semi-automatic procedure not only means a faster set-up for users, it also provides clarity for the graphics engine where lens calibration has previously created ambiguity.
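For readers curious what "correcting lens distortion" involves mathematically, the sketch below uses the widely known Brown-Conrady radial distortion model, a common way to describe how straight lines bow near the edges of a lens. The coefficients `k1` and `k2` here are hypothetical illustrative values; this FAQ does not describe the actual model Mo-Sys calibration uses.

```python
def distort(x: float, y: float, k1: float, k2: float) -> tuple[float, float]:
    """Apply radial (Brown-Conrady) distortion to a normalized image point.

    (x, y) are coordinates relative to the optical centre; k1 and k2 are
    radial distortion coefficients (hypothetical values, for illustration).
    """
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

def undistort(xd: float, yd: float, k1: float, k2: float,
              iterations: int = 10) -> tuple[float, float]:
    """Invert the distortion by fixed-point iteration (no closed form exists)."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / factor, yd / factor
    return x, y

# With positive k1 (pincushion-style), points move outward from the centre;
# undistort() recovers the original point to high precision.
xd, yd = distort(0.5, 0.5, k1=0.1, k2=0.01)
xu, yu = undistort(xd, yd, k1=0.1, k2=0.01)
print(round(xu, 6), round(yu, 6))  # -> 0.5 0.5
```

A calibrated lens file would, in effect, store coefficients like these (plus centre and offset terms) so the graphics engine can warp virtual elements to match the real lens.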
Are there any camera requirements for StarTracker?
To be able to use StarTracker, the camera must support genlock (generator locking) through a BNC socket. This is where the video output of one source is used to synchronize other picture sources together; the aim in video applications is to ensure the coincidence of signals in time at a combining or switching point. The standard genlock we use is analogue. Please check camera features with the manufacturer.
Which graphics engines do Mo-Sys products support?
All of our products support a growing pool of broadcast graphics engines for virtual implementation: AVID, Brainstorm, Disguise, ChyronHego, Pixotope, Ross Video, RT Software, Unreal Engine, Ventuz, Vizrt, WASP 3D and Zero Density.
Do you offer technical support?
Yes, technical support packages are available. Please get in touch at support@mo-sys.com.

Virtual Production



What is Virtual Production?
Often used in big blockbuster movies and heavily visual-effects-orientated sequences, Virtual Production is a way of shooting for film or broadcast when you simply cannot, or choose not to, film something for real. This usually involves a combination of a green screen or LED wall, camera tracking and a games engine for rendering the virtual scene. It refers to the many techniques that allow filmmakers to plan, imagine, or complete some kind of filmic element with the aid of digital tools. Previz, motion capture, VR, AR, virtual cameras, and real-time rendering are all terms now associated with virtual production.
What Virtual Production tools does Mo-Sys offer?
Having observed the quickly adapting film industry and listened to the needs of directors, producers and VFX supervisors, we have put together a full range of Virtual Production tools for the Unreal Engine. These include StarTrackerVP, StarTracker MoCap, StarTrackerVR headsets, StarTrackerViewFinder and Handwheels, and the range extends to all future robotics and tracking hardware.
What does the Mo-Sys Tracking plugin for Unreal Engine do?
The Mo-Sys Tracking plugin enables streaming of live motion data from a growing range of Mo-Sys robotic and tracking hardware into the Unreal Engine via Live Link. This data can be used as real-time camera tracking for VFX, virtual studios, game cinematics and architectural visualization, or as tracking of VR and AR headsets for warehouse-scale virtual reality experiences.

Camera Remote Systems


What camera mounts do Mo-Sys products support?
At Mo-Sys all of our camera remote systems use Mitchell mounts, otherwise known as Moy mounts.
Can my original Lambda be upgraded?
Yes, we can upgrade your Lambda. The upgrade includes new, stronger motors and a more robust gearbox. Please bear in mind that the motion control feature on the original Lambda has been replaced by a new back-pan feature.
Yes, both of these heads provide real-time data tracking the pan and tilt movement of the head. This can be used for real-time previs.

Make an Enquiry

For more information about how we can help your project, please contact us using the form on the right.
