Object Audio - what's all the noise about?


Bruce Devlin, TV-Bay Magazine

We all know that getting the audio right makes the pictures better. Anyone who has seen a movie created for an object audio system like Dolby Atmos will know there is something special about it. To figure out what that is, let's rewind a little and see why we might need it.

Remember the good old days of mono audio on TV? Mono audio gives a single channel of sound played back through one or more speakers. That's fine for many genres, but to increase the feeling of being in the scene, stereo audio was introduced: left and right channels are transmitted, so our ears can hear sound moving on either side of us, just as we do in real life, while we watch the images.

With the advent of cheaper electronics, we moved to a surround sound system where more speakers give the impression that sounds can come from in front of and behind us, as well as from either side. Typically, today's surround sound systems use 5.1 channels. This means we have five full-bandwidth channels:

  • Left Front
  • Center
  • Right Front
  • Left Rear
  • Right Rear

And one low-bandwidth channel (this is the ".1"):

  • Low Frequency Effects (LFE)

This system essentially takes a sound mix from the program creator, mastered on 5.1 speakers, and maps it onto the 5.1 speakers in your home or in the theater.

Audio processing technology is now very cheap: your cellphone can do more audio processing than dedicated hardware from the 1990s. We have much bigger screens today, and with the rollout of UHD, it is likely that we will be sitting even closer to them. To increase the sense of "being there," adding a vertical element to the sound can make a dramatic difference.

It is impractical to move from a situation where we have six fixed speakers to one where we have hundreds of speakers that position the sound exactly, especially when most of the time there will be little or no sound coming from an individual speaker. Imagine instead a "bed" of audio (the traditional stereo or 5.1 mix) to which you add effects, or objects, each consisting of an audio stream and some control metadata.
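To make the "bed plus objects" idea concrete, here is a minimal sketch in Python. The class and field names are invented for illustration only; real systems (ADM, Dolby Atmos, MPEG-H) each define their own metadata models.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class AudioObject:
    samples: List[float]          # the object's own audio stream (mono, for simplicity)
    gain: float = 1.0             # control metadata: playback level
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # x (left/right), y (front/back), z (height)

@dataclass
class ObjectAudioProgramme:
    bed: Dict[str, List[float]] = field(default_factory=dict)  # traditional channel bed, e.g. {"L": [...], "R": [...]}
    objects: List[AudioObject] = field(default_factory=list)   # objects layered on top of the bed
```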

In its simplest form, the UK's audio description service does just that. Start with a stereo "bed" that is the normal program mix, then add a description track for the visually impaired and a control track that adjusts its volume and position (left-right pan) so that a smart decoder can mix the sound together.
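As a rough sketch of that decoder-side mix (the function name, the simple linear panning law and the equal-length sample lists are all assumptions made for illustration, not how any particular broadcaster's decoder works):

```python
def mix_description(bed_left, bed_right, description, gain=1.0, pan=0.0):
    """Mix a mono description track into a stereo bed.

    pan runs from -1.0 (fully left) to +1.0 (fully right); gain scales the
    description relative to the bed. All inputs are equal-length sample lists.
    """
    left_weight = (1.0 - pan) / 2.0
    right_weight = (1.0 + pan) / 2.0
    out_left = [b + gain * left_weight * d for b, d in zip(bed_left, description)]
    out_right = [b + gain * right_weight * d for b, d in zip(bed_right, description)]
    return out_left, out_right
```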

A full cinematic system is very similar, but with more objects and more metadata. Each cinema may have a different number and position of speakers, depending on budget. The object sound system provides the "bed" of audio that is mapped onto the speakers in the theater. Each sound object is then mapped to one or more physical speakers at the right time and the right volume to provide very specific spatial effects for the audience.
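One way such a renderer could map an object onto whatever speakers a particular theater has is to weight each speaker by its proximity to the object's position. The inverse-distance weighting below is a deliberately crude stand-in for the proprietary panning laws real renderers use:

```python
import math

def speaker_gains(object_pos, speaker_positions):
    """Weight each speaker by inverse distance to the object's position and
    normalise so the gains sum to 1.0. Purely illustrative: real renderers
    use far more sophisticated panning laws."""
    weights = [1.0 / (math.dist(object_pos, spk) + 1e-6) for spk in speaker_positions]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical four-speaker room (FL, FR, RL, RR at ear height) and an object
# placed overhead and slightly front-left.
room = [(-1.0, 1.0, 0.0), (1.0, 1.0, 0.0), (-1.0, -1.0, 0.0), (1.0, -1.0, 0.0)]
print(speaker_gains((-0.5, 0.5, 1.0), room))
```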

Knowing these basics, you can see how this system might map to a consumer setup with fewer speakers. By calibrating the room, a sound processor can do a good job of mixing the sound between spatial speakers and upward-firing speakers, giving a reasonable approximation of the 3D sound experience in the cinema.

Currently, we send our content to the listening / viewing environment in a fairly linear way: the mix created at the content provider is the mix heard and seen by the viewer. Technologies like IMF are enabling content creators to produce and distribute versions more cheaply. Technologies like object sound, combined with consumer audio processing units, allow different objects, such as languages, high dynamic range effects (for quiet environments) and low dynamic range effects (for noisy environments), to be selected and / or mixed at the receiving point. We're increasingly moving from a linear "here's my content" world to a component "which bits of the content will give you the best experience" world. It's a fun time to be in technology.
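A receiver-side selection of that kind might look something like the sketch below; the metadata keys ("kind", "language", "dynamic_range") are invented here purely to illustrate choosing components at the point of playback:

```python
def select_objects(objects, language="en", dynamic_range="high"):
    """Keep the objects that match the viewer's language and listening
    environment; everything else in the programme is left out of the mix."""
    chosen = []
    for obj in objects:
        if obj.get("kind") == "dialogue" and obj.get("language") != language:
            continue
        if obj.get("kind") == "effects" and obj.get("dynamic_range") != dynamic_range:
            continue
        chosen.append(obj)
    return chosen

# Example: a noisy living room picks English dialogue and the low dynamic range effects.
programme = [
    {"kind": "dialogue", "language": "en"},
    {"kind": "dialogue", "language": "cy"},
    {"kind": "effects", "dynamic_range": "high"},
    {"kind": "effects", "dynamic_range": "low"},
]
print(select_objects(programme, language="en", dynamic_range="low"))
```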


