At IBC2018, Mo-Sys will combine Unreal Engine’s interface and Live Link capabilities, and the ray-tracing render strengths of Chaos Group’s V-Ray, with its real-time camera tracking system, StarTrackerVFX, for a complete, automated virtual production workflow. This will demonstrate how productions of all budget sizes can now access Hollywood-quality VFX through one simple workflow.
Mo-Sys’ StarTracker camera tracking technology and camera robotics products will also be exhibited at six other booths around IBC2018: Aximmetry, Brainstorm, Grass Valley, RT Software, White Light and Zero Density.
On its stand, Mo-Sys will showcase the latest photorealistic imagery and animation technology for a virtual scene. To achieve this, it will film an actor wearing a StarTracked Xsens MoCap suit in front of a green screen, capture the camera tracking data and feed it directly into Unreal Engine in real time. The shot will be automatically re-rendered in V-Ray using parallel near-time ray tracing, which delivers superior render quality. This integration enables studios to be more daring with on-set creativity, rather than finding out what is possible during the expensive post-production stage. In addition, Reallusion will provide live full-body motion capture, allowing 3D characters’ faces, hands and bodies to be tracked simultaneously.
StarTrackerVFX is a real-time camera tracking tool with a direct plug-in for Unreal Engine that enables filmmakers to place real people within photorealistic environments. Mo-Sys’ integration with Chaos Group further democratises access to virtual production. Projects with more modest budgets – such as telenovelas, commercials, corporate videos, brand films, architectural visualisations and micro-budget feature films – can achieve superior render quality in real time and near-time, in one simple workflow.
Chaos Group’s V-Ray for Unreal product manager Simeon Balabanov commented: “StarTrackerVFX and V-Ray for the Unreal Engine give users the best of both worlds by streamlining a usually complex process. It allows users to render ray-traced, photorealistic images with V-Ray directly from the Unreal Engine, which cuts out the additional costs normally associated with filming on a green screen.”
Mo-Sys founder and owner Michael Geissler added: “Before StarTrackerVFX, it was difficult and expensive for studios to film on a green screen when the camera wasn’t stationary. But StarTrackerVFX’s integration with the likes of Chaos Group has now given all-sized budgets the ability to precisely match the real camera’s position, orientation and lens distortion with the virtual world.”
In addition to automated re-keying and re-rendering using V-Ray ray tracing, StarTrackerVFX now provides real-time camera tracking and the ability to track multiple objects, including lights, motion-capture suits and a director’s viewfinder. Its Unreal Engine plugin features a pre-viz keyer, garbage mattes, lens distortion correction and FBX data recording for post-production.
Currently in its fourth generation, StarTracker camera tracking technology has been further improved with a new auto-alignment tool and an advanced smart lens calibration technique. This more accurate lens calibration method for a zoom lens typically identifies one million sample points, each no larger than 1mm x 1mm – 2,000 times more than the industry-standard calibration technique, which uses only 500 sample points.
Mo-Sys’ tools are built on its patented and industry-proven StarTracker technology, which powers virtual TV studios around the world and helps users reduce complexity, time and budget constraints. Its solutions connect to the industry’s leading virtual studio engines from Vizrt, Brainstorm, Zero Density, Avid and RT Software.
StarTracker is used by more than 100 broadcasters worldwide including BBC, Sky, ESPN, FOX, CNN, NHK and ZDF. StarTrackerVFX and its V-Ray integration will be demonstrated from 14-18 September on Mo-Sys stand 8.F10 in The Future Zone at IBC2018 in Amsterdam.