VR and 3D Audio - Ask The Experts


by Pieter Schillebeeckx
Issue 113 - May 2016

What's the difference between 2D & 3D audio?
There are two parts to this question when it comes to audio for VR. The key difference is that 2D audio is a single horizontal slice, so 5.1 or traditional surround sound in a cinema would be considered 2D, whereas 3D adds height information both above and below you. The second part of the question relates to static versus dynamic audio, and this applies to both 2D and 3D audio. Until now we've been used to consuming audio in a static manner to match a static image. With VR the image is no longer static because it tracks head movement, making for a dynamic experience. For virtual reality it's the dynamic nature of the audio that's extremely important and that completes the immersive experience.


Is 3D audio the same as object-based audio?
3D audio can be part of object-based audio, but they're not one and the same. 3D audio, often referred to as immersive audio, aims to transport a listener to an environment, immersing them in the sound, whether at a concert or a basketball game.
Object-based audio is a radical departure from traditional audio formats such as stereo or 5.1 in two important ways: it supports many audio playback formats natively from one single audio deliverable, and it offers personalisation. In order to achieve this, an object-based audio stream is not a pre-baked stereo or 5.1 mix but rather a selection of audio stems that are used by the consumer's device, such as a set-top box or a mobile device, to create the desired playback format for the listener's set-up, whether that's headphones or a home theatre.


If we're looking at all the different playback formats, including virtual reality, it's clear that we can't keep on creating more and more mixes, so object-based is definitely the future for audio, whether delivered to a VR headset or the ultimate home theatre.
I think a very important part of object-based audio is the personalisation element. For example, by sending multiple commentator stems for a football game you could say: "Well, I don't want that neutral commentator, I'm a Liverpool fan so I want the Liverpool-biased commentary." To take it one step further, you can also set the balance between the background ambience, the feeling of being there, and the commentator of your choice.
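
To make that concrete, here is a minimal, hypothetical sketch in Python of what a consumer device might do with such a deliverable: it receives an ambience bed plus several commentary stems and mixes them according to the listener's choices. The function and stem names are illustrative, not any broadcaster's actual API.

    import numpy as np

    def render_personal_mix(ambience, commentaries, choice, balance):
        """Mix an ambience bed with one chosen commentary stem.

        ambience     : (n_samples, n_channels) array, the 'being there' bed
        commentaries : dict mapping stem name -> array of the same shape
        choice       : the commentary the listener selected, e.g. "liverpool"
        balance      : 0.0 = ambience only, 1.0 = commentary only
        """
        commentary = commentaries[choice]
        return (1.0 - balance) * ambience + balance * commentary

    # A Liverpool fan who wants the crowd pulled slightly behind the commentary.
    n = 48000
    stems = {"neutral": np.zeros((n, 2)), "liverpool": np.zeros((n, 2))}
    mix = render_personal_mix(np.zeros((n, 2)), stems, choice="liverpool", balance=0.6)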


So is 3D audio the same as object-based audio?

No, it's not. 3D audio can be part of an object-based deliverable, and 3D audio as an ambience bed works extremely well in an object-based environment because you can augment it with mono or stereo stems such as sound effects or narration.

I understand 3D audio is not a new technology. When was it developed, and for what reason originally?


3D audio has been around for quite a long time. If you look at surround sound as a whole, Disney's Fantasia introduced surround sound in the 1940s.
SoundField developed the very first ambisonic B-format microphone in the late 70s, with the first commercial product coming out in 1978. Those microphones were fully 3D-audio capable even back then, so it's nothing new. The challenge then was what to do with the 3D audio and how to play it back outside of a laboratory. The early use of 3D audio was not about being immersive but about the flexibility it gave you when steering around this microphone. You may only want a mono or stereo output from this 3D audio capture, but it's about being able to steer around and reposition this microphone in post-production.
This is a very important point, because it's where SoundField and virtual reality really start to gel together. If you think about it, the way virtual reality is captured from a 3D video point of view means we can smoothly move around this space. SoundField B-format captures audio in exactly the same way, allowing us to use exactly the same head-tracking or positional data used to position the video to move the audio perfectly in sync.
All we have to do for this is use the four SoundField B-format audio channels together with the video. The head-tracking information, which is used to move the video around, can then be used to steer the audio.
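
As a rough illustration of that idea, the sketch below counter-rotates a first-order B-format frame by the yaw angle reported by the head tracker. It assumes the common W/X/Y/Z convention (W omni, X front, Y left, Z up) and handles yaw only; a real renderer would also apply pitch and roll, and channel conventions differ between toolchains such as FuMa and AmbiX.

    import numpy as np

    def rotate_bformat_yaw(w, x, y, z, head_yaw_rad):
        """Counter-rotate a B-format frame so the scene stays world-locked
        as the listener's head turns (yaw only; W and Z are unaffected)."""
        phi = -head_yaw_rad                 # rotate the field opposite to the head turn
        x_rot = x * np.cos(phi) - y * np.sin(phi)
        y_rot = x * np.sin(phi) + y * np.cos(phi)
        return w, x_rot, y_rot, z

    # Feed it the same yaw angle the video renderer uses, once per audio block.
    w, x, y, z = (np.zeros(512) for _ in range(4))
    w2, x2, y2, z2 = rotate_bformat_yaw(w, x, y, z, head_yaw_rad=np.radians(30.0))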

If I'm wearing VR goggles I need to hear sounds behind me, and when I turn to see what those sounds are they should then really be in front of me. So what are the challenges in doing that and maintaining this audio interaction?


There are a lot of different ways you could do audio for VR, and as you progress through them the experience becomes more realistic for the consumer, which is the end goal.
First of all, we could just have fixed stereo which doesn't move with the video, so you lay down a stereo track just as you would have always done for a standard video shoot. A lot of the VR content out there is exactly that: you move your head and the audio stays completely static. Clearly this is not satisfactory and we are really missing out here; in the end it is the audio that will make you really believe you're in a virtual reality.


The second thing you could do is use head tracking to play back stereo audio that is in line with what you're seeing.
As you move your head, the audio will pan around in sync with the video. At this point you won't really hear a discrete source behind you over headphones, because it's just a stereo image which is facing forward; or if you do hear the sound, you will not localise it behind you. Again, this is an improvement, and there is more and more virtual reality material available that is done this way, but clearly it's not the holy grail.
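
As a crude sketch of this second approach (with made-up names and a simple panning convention), a player might treat the two stereo channels as sources at plus and minus 30 degrees in front of the listener and re-pan them between the headphone channels as the head turns; note that nothing ever lands behind you.

    import numpy as np

    def repan_stereo(left, right, head_yaw_rad):
        """Re-pan a fixed stereo pair as the head turns. The channels sit at
        +/-30 degrees in front (positive azimuth = listener's left); there is
        no real rear image, just a rotated front-facing stereo stage."""
        out_l = np.zeros_like(left)
        out_r = np.zeros_like(right)
        for sig, azimuth in ((left, np.radians(30.0)), (right, np.radians(-30.0))):
            rel = azimuth - head_yaw_rad                   # azimuth relative to the turned head
            pan = np.clip(rel / (np.pi / 2.0), -1.0, 1.0)  # +1 = hard left, -1 = hard right
            theta = (1.0 - pan) * np.pi / 4.0              # constant-power pan law
            out_l += sig * np.cos(theta)
            out_r += sig * np.sin(theta)
        return out_l, out_r

    # With the head turned 45 degrees to the left, both channels shift towards the right ear.
    out_l, out_r = repan_stereo(np.ones(512), np.ones(512), head_yaw_rad=np.radians(45.0))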
The true holy grail is really about being able to recreate complete 3D audio over headphones. Binauralisation aims to do exactly this, mimicking the spatial cues generated by your head and your ears to trick your brain into hearing real 3D audio. This technology has also been around for a long time but has been fraught with challenges, in that every person's head and ears are different. When you measure a given person's head the results are extremely convincing; however, coming up with a set of measurements that work for a wide range of people has been challenging, but a lot of progress has been made in recent years.
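
The core of binauralisation can be sketched very simply, assuming a pair of head-related impulse responses (HRIRs) for the source direction; the HRIRs below are crude placeholders, whereas a real renderer interpolates a measured set (for example from a SOFA file) and personalises it where possible.

    import numpy as np

    def binauralise(mono, hrir_left, hrir_right):
        """Convolve a mono source with a left/right HRIR pair so the headphone
        signal carries the direction-dependent cues of the measured head and ears."""
        out_l = np.convolve(mono, hrir_left)
        out_r = np.convolve(mono, hrir_right)
        return np.stack([out_l, out_r], axis=-1)

    # Placeholder HRIRs: a later, quieter right ear hints at a source to the left.
    hrir_l = np.zeros(64); hrir_l[0] = 1.0
    hrir_r = np.zeros(64); hrir_r[8] = 0.6
    stereo = binauralise(np.random.randn(48000), hrir_l, hrir_r)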


So this is where it starts to get very exciting. We can capture 3D audio using a SoundField B-format microphone, we can use object-based audio to augment it with other mono or stereo sources, and we can play back 3D audio over headphones, using the video head-tracking data to move them all in sync. Now we are really starting to be immersed in a virtual reality, both from a video and an audio perspective!
So from where I'm standing it looks like the technology is available to go out and create truly immersive virtual reality experiences. All we need now is lots of creativity to make amazing content.


