The party is not over until the video is online

Paul Scurrell

Published 19th October 2017

When we were introduced to the team behind Sensation dance events in Amsterdam they had one big question for Timecode Systems: could our SyncBac VR wireless sync and control solution theoretically make it possible for a high quality, professional 360-degree video of a live dance music event to be edited in the time it takes the DJ to fly to his or her next gig? Intrigued by what our solution could help them achieve, in July they invited us to film at one of their iconic dance music events. Billed as The Final it was a sell-out farewell party to mark the last Sensation event in Amsterdam.

However, there was a unique twist in their brief. In addition to a 360-degree virtual reality (VR) video, they also wanted to produce a variety of standard platform content. The aim was to create a suite of videos for sharing online and across social media platforms. In the music industry today, a spectacular live performance isn't enough; a DJ's online and social media newsfeed can be just as important as their playlists. The party doesn't end when the DJs pack away their decks; there's an expectation that, through exclusive videos and immersive content, anyone can relive the performance online as if they were there. And for this content to have maximum impact, it has to be available to upload and share quickly.

Sensation were looking for a solution that would allow them to produce a range of unique 360-degree and standard platform video content, to professional standards, with speed and efficiency, and requiring minimal manpower. This is exactly what the Timecode Systems solution is built for.

On Saturday 8 July, our two-person crew headed to the Amsterdam Arena, dressed head-to-toe in white (a strict requirement for everyone attending Sensation events) and on a mission to create a unique video experience for the dance brand's significant fan following. With a team of just two filming, the kit had to be manageable yet extensive enough to capture the event effectively. All video was captured using GoPro HERO4 cameras. For 360-degree content, two 360RIZE SyncBac VR spherical arrays were used: a PRO10 rig holding ten GoPro HERO4 cameras with SyncBac PRO timecode units attached, and a PRO7 rig with seven cameras and SyncBac PROs receiving timecode. These VR rigs were positioned on the DJ podium and with the orchestra. Multiple separate GoPro HERO4 cameras with SyncBac PRO receiver units were positioned at key locations around the venue. All cameras were synced to a Timecode Systems :pulse, which acted as a mini base station generating the master wireless timecode for all video and sound sources.

A significant benefit of this set-up was that, at the end of filming, there was the option to use the synchronised footage from each of the individual cameras in the 360-degree rigs to create standard platform videos too, providing the requested flexibility to efficiently create a whole range of engaging content.

Using the :pulse base station also allowed the team to monitor and control all cameras centrally and simultaneously as a group from the BLINK Hub device control app. With WiFi being unpredictable and overcrowded in this environment, the :pulse was connected to a MacBook via Ethernet, providing a reliable connection to the BLINK Hub. The ability to easily check battery levels and amend camera settings from a single screen greatly reduced the guesswork that can often lead to camera errors being overlooked when using multicamera VR rigs.

And then there was sound. For 360-degree videos to be truly immersive, they need realistic audio that matches perfectly with the visuals, otherwise the brain doesn't buy into the illusion. So, to complete the system, a Sound Devices mixer/recorder using an UltraSync ONE timecode unit was set up to capture sound.

This configuration allowed all cameras and sound to stay synchronised throughout filming over long-range RF. After recording had finished, the timecode-stamped GoPro MP4 files from each camera were handed to our editor, ready to be imported directly and easily into Kolor's Autopano Video Pro for stitching and editing. No other manual work to align GoPro content was required during shooting or ahead of stitching. And with audio and video timecode synchronised, video content could be automatically and accurately aligned with separate stereo, DJ-mixed audio and ambisonic crowd atmos, making it easy to replicate the natural listening experience in the finished videos.
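To illustrate why a shared master timecode removes the manual sync work described above: once every file carries a start timecode stamped from the same clock, lining up any two recordings reduces to simple frame arithmetic. The sketch below is a minimal, hypothetical example (not Timecode Systems or Autopano code), assuming standard HH:MM:SS:FF non-drop-frame timecode at an assumed 25 fps.

```python
# Hypothetical sketch of timecode-based alignment, assuming both the
# camera file and the audio recording carry a start timecode stamped
# from the same master clock (e.g. the :pulse base station).

FPS = 25  # assumed frame rate; the actual shoot's rate may differ

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def audio_offset_seconds(video_start_tc: str, audio_start_tc: str,
                         fps: int = FPS) -> float:
    """Seconds to trim from the start of the audio (positive) or pad
    before it (negative) so it lines up with the video's first frame."""
    delta = timecode_to_frames(video_start_tc, fps) \
          - timecode_to_frames(audio_start_tc, fps)
    return delta / fps

# Example: the mixer started rolling before the camera did.
offset = audio_offset_seconds("10:00:02:12", "10:00:00:00")  # → 2.48 seconds
```

Editing software performs essentially this calculation per clip, which is why clips from seventeen-plus cameras and a separate recorder can snap into alignment without anyone hunting for a clap or a visual cue.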

The time saved by eliminating manual sync processes proved that immersive aftershow video content for music events and festivals can be produced efficiently, cost-effectively, to tight timescales, and with minimal manpower. Unfortunately, we can't promise the same pain-free after-party experience for the audience.

© KitPlus (tv-bay limited). All trademarks recognised. Reproduction of this content is strictly prohibited without written consent.