
ANALYSIS

As you will read, Fairlight has revealed that it is collaborating with DTS and the University of Salford on a way to help sound engineers with live sports production. I really like this story, as it highlights actual innovation and could bring genuine improvements to the way we work.

The approach they are working on is called Spatial Automated Live Sports Audio (SALSA). It is a real-time automated mixing process that identifies the location of specific sound events from a grid of pitch microphones. The algorithm, developed by the University of Salford, identifies the type of sound event, its 3D location and its duration, and automatically drives console fader movements to open the relevant mic.

Why is this interesting? Leaving pitch microphones at a fixed level can result in off-pitch crowd noise masking on-pitch sounds in the broadcast mix. With the introduction of even more mics and immersive object-based audio, it will become even more challenging to create the best possible mix manually. SALSA helps address this problem by allowing different game sounds, such as ball kicks and referee whistles, to be processed automatically by the mixing console. SALSA can also be adapted to search for different sounds, allowing the automated mixing to be applied to different sports.

It's no pipe dream either. Fairlight has already integrated SALSA into its next-generation live production systems, supporting both conventional and object-based broadcasts.

"By combining cutting-edge technology from our three organisations, the SALSA project automatically translates pitch mics into 3D audio objects," says Tino Fibaek, chief technical officer at Fairlight. "This allows broadcast mix engineers to focus on the overall mix, whilst the system does the hard labour of extracting the best possible sound from the pitch for sports aficionados."

So, it's an innovation that means everyone is a winner. What's not to like?
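To make the idea concrete, here is a minimal sketch of the kind of automation described above: given a detected sound event and its 3D position, open the fader of the nearest pitch mic and hold the others at a low bed level. The mic layout, levels and function names are illustrative assumptions, not Fairlight's or the University of Salford's actual implementation.

```python
import math

# Hypothetical pitch-mic grid: name -> (x, y, z) position in metres.
# Positions are invented for illustration only.
PITCH_MICS = {
    "M1": (10.0, 5.0, 1.0),
    "M2": (30.0, 5.0, 1.0),
    "M3": (10.0, 30.0, 1.0),
    "M4": (30.0, 30.0, 1.0),
}

OPEN_LEVEL_DB = 0.0    # fader level for the mic nearest the event
BED_LEVEL_DB = -40.0   # residual level for all other mics

def fader_levels(event_pos):
    """Return a dB fader level per mic: fully open the mic closest
    to the detected sound event, keep the rest at a quiet bed level."""
    distances = {
        name: math.dist(pos, event_pos) for name, pos in PITCH_MICS.items()
    }
    nearest = min(distances, key=distances.get)
    return {
        name: OPEN_LEVEL_DB if name == nearest else BED_LEVEL_DB
        for name in PITCH_MICS
    }

# A ball kick detected near mic M3 opens that fader alone:
levels = fader_levels((12.0, 28.0, 0.0))
print(levels)
```

A real system would of course smooth the fader moves over the event's duration and classify the event type first; the sketch only shows the core "locate, then open the relevant mic" step.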
VR EDITING

At NAB, Mettle debuted Skybox 360/VR Transitions for Adobe Premiere Pro CC, a plug-in package that delivers transitions for editing cinematic 360 content. The available options include Image-Based 360 Gradient Wipe, 360 Random Blocks, 360 Iris Wipe and 360 Mobius. Many of the transitions have X,Y start points to help storytellers direct the viewers' gaze.

"Creating seamless transitions for 360 content can be very challenging for the editor," said Chris Bobotis, co-founder, Mettle. "Finding transition solutions is very time-consuming, especially for artists who are new to the medium. Our plug-in provides a set of ready-made transitions for Adobe Premiere Pro CC editors, to get them past those first hurdles. Equally important, these transitions provide ideas and direction for storytelling techniques that work for VR."

ON-SET GRADING

Codex has combined its Codex Live colour management and look-creation system with OffHollywood's OMOD to offer a new set-to-post workflow for projects shot on Red Weapon cameras. Codex Live is used for grading and previewing images taken directly from live HD-SDI camera feeds, while OMODs are modules for the Red Weapon platform that route HD-SDI outputs. The tie-up also gives users the ability to do live High Dynamic Range (HDR) grading on set.

"We developed Codex Live to meet a need for secure colour pipelines that are integrated into the production-to-post workflow, so that the look created on-set is exactly what appears in the VFX, editorial deliverables and in the DI grading suite," said Brian Gaffney, VP business development at Codex.

Codex has also launched version 4.5 of the Codex Production Suite.

14 | KITPLUS - THE TV-BAY MAGAZINE: ISSUE 113 MAY 2016