
The rise of 3D Audio

Phil Ward 19 April 2017

The entertainment industry and the media, if we can separate those two addictions of modern life, are about to be hit by changes in audio production and presentation that could dwarf the evolution from mono to 7.1. On several fronts, three-dimensional sound reproduction is making rapid headway and it seems nowhere will remain untouched by some kind of adoption: from the sound reinforcement used in customer experiences and attractions to regular theatre; from conferencing to fine art; and from radio to online TV.

The gaming industry’s dramatic effect on social habits is beginning to bite into mainstream forms of communication, and it’s the headphone generation that leads the way. Smartphones have made media and entertainment an instantaneous and constant companion, so ubiquitous that formal consumption of leisure and information has to raise the bar – and 360° audio reproduction will play a big part in that at every level.

SOHO EAST

In London at least, some studios and agencies have already been established in the pursuit of these new markets – including 1.618 Digital and Visualise. Others have evolved from traditional post-production, maybe even adjusting the company name to highlight the changes: witness Bamsound VR, which used to be just ‘Bamsound’. “We’ve been around for 10 years doing conventional post-production,” reveals Scott Marshall, founder and CEO (pictured), “but in the last year and a half we’ve re-positioned ourselves into this market. We did it partly to create the buzz, of course, but also because this is the new audio. Our existing clients are very excited about it. Slowly but surely, over the next few years, more and more people will adopt these techniques.”

As well as new agencies coming into being, traditional post-houses can adapt to immersive sound quite easily, Marshall believes. “I get a lot of enquiries from other facilities who are interested in VR and other applications, as we are ahead of the curve. But we’re right on the cusp, now, of a new industry. And we need that, to grow the market. As the right clients – the right brands – embrace it, it will gather pace.”

Just as DiGiCo needed more competition to expand the digital live console industry in the early years – which duly arrived – facilities like Bamsound VR will welcome more players to the field if it means greater awareness of the possibilities. “At BVE in London last month I met many contemporaries in mixing and dubbing who wanted to know more about how it works,” adds Marshall.

How does it work? There are several current platforms that, each in its own way, apply algorithms in the signal chain to create more convincing three-dimensional audio. Sennheiser’s AMBEO, Out Board’s TiMax and L-Group’s L-ISA project are just three examples, but their applications have so far concentrated on advanced forms of sound reinforcement. In cinema and home theatre, Dolby Atmos and DTS:X have taken the market beyond 5.1 and 7.1 and their increasingly dated channel-based techniques, and are now entering a new phase of production and even pre-production relationships.

But it’s Ambisonics that seems to be getting a foothold in VR production, which is odd considering that it emerged from British academic research in the 1970s. A dedicated community of enthusiasts has nurtured those principles over the years and has gradually updated them for applications in the digital age: notably Blue Ripple Sound’s O3A suite of Ambisonic plug-ins for VST platforms, building on the ‘Higher Order Ambisonics’ techniques already popular in gaming and VR. So far the company has achieved compatibility with Reaper, Pyramix, Max/MSP and the Canadian-developed Plogue Bidule, but not yet Pro Tools, Cubase or Nuendo.
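For those who want a feel for what the maths involves, here is a minimal, purely illustrative sketch in Python (not Blue Ripple’s code) of a first-order Ambisonic encode: a mono source is weighted into the four B-format channels W, X, Y and Z according to its direction. Higher Order Ambisonics adds further spherical-harmonic channels for sharper spatial resolution.

```python
import numpy as np

def encode_first_order(mono: np.ndarray, azimuth_deg: float, elevation_deg: float) -> np.ndarray:
    """Encode a mono signal into first-order Ambisonics (B-format: W, X, Y, Z).

    Classic Furse-Malham-style weighting: W carries the omni component
    (scaled by 1/sqrt(2)); X, Y and Z carry the figure-of-eight components.
    Illustrative only - production tools work at higher orders inside a DAW.
    """
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono * (1.0 / np.sqrt(2.0))
    x = mono * np.cos(az) * np.cos(el)
    y = mono * np.sin(az) * np.cos(el)
    z = mono * np.sin(el)
    return np.stack([w, x, y, z])

# Example: a 1 kHz tone placed 90 degrees to the left, at ear height.
sr = 48000
t = np.arange(sr) / sr
b_format = encode_first_order(np.sin(2 * np.pi * 1000 * t), azimuth_deg=90, elevation_deg=0)
```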

BACK TO WORKFLOW

This doesn’t stop the new generation of 3D sound designers, who are happy to switch between DAWs according to each stage of production and post-production, playing to each one’s strengths. Henrik Opperman is head of sound at Visualise, an up-and-coming VR studio and production company based – like 1.618 Digital – in East London. “I work in Ambisonics, which is finally finding an applications-driven market,” he says. “Its workflow was integrated into DAW-world, and there are the Ripple plug-ins, but otherwise it’s much the same. The Sennheiser AMBEO VR mic that I use doesn’t have the old matrix output that the original Ambisonics mic had, so the signal goes straight into a DAW and into any format – complete flexibility.

“I do use Reaper a lot, because my workflow is in 3rd-Order Ambisonics, but whenever I need to work to picture I can just as easily convert to Pro Tools or Logic. But we need to move on from the tape-emulation concept, and adopt techniques closer to Reaper’s object-and-multichannel model. One track can have MIDI, audio, Broadcast WAV or MP3, and it’ll just play it.”
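The mic Opperman mentions outputs the signals of its four tetrahedral capsules (‘A-format’) for conversion to B-format in software. At its simplest the relationship is the sum-and-difference matrix sketched below – real converter plug-ins also apply frequency-dependent correction filters, which are omitted here, and the capsule names are generic rather than Sennheiser’s own.

```python
import numpy as np

def a_to_b_format(flu, frd, bld, bru):
    """Convert tetrahedral capsule signals (A-format) to first-order B-format.

    flu = front-left-up, frd = front-right-down,
    bld = back-left-down, bru = back-right-up (numpy arrays of equal length).
    Real converters also apply capsule-matching correction filters,
    omitted in this sketch.
    """
    w = flu + frd + bld + bru   # omnidirectional pressure
    x = flu + frd - bld - bru   # front-back figure-of-eight
    y = flu - frd + bld - bru   # left-right figure-of-eight
    z = flu - frd - bld + bru   # up-down figure-of-eight
    return np.stack([w, x, y, z])

# Toy example with two samples per capsule.
capsules = [np.array([0.1, 0.2]) for _ in range(4)]
b_format = a_to_b_format(*capsules)
```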

“The market leaders are releasing the tools free of charge,” adds Scott Marshall (pictured), “similar to the way visual effects and cross-platform game engine developers do, like Unreal and Unity. Facebook has released its panning tool, compatible with Pro Tools, Reaper and Nuendo, which opens it up for everyone to get involved. We do need post-production experience, though: I run a post course at the School of Sound Recording in Camden and the students all come from a musical background. Working with film and TV, I always tell them, is a different language – and it will be the same in VR and other 360° genres. It’s not a music mindset. A lot of the ground rules for creating a great audio landscape should be carried forward into 360.”

LOCATION, LOCATION

Facilities will also have to navigate the usual market forces as one generation lapses into another, as Marshall points out. “Pro Tools is still the professional tool that provides the audio quality and consistency we require, even though DAWs like Reaper are really useful. Our jobs are sent all over the world, and compatibility with Pro Tools is essential. It may be a little behind the curve with its channel output, although there’s more on the way, and the newest version of Nuendo can implement a lot of game engine stuff – but people still trust the fidelity of Pro Tools above anything else. It will get there.

“Meanwhile I’m beta-testing the Dolby Atmos authoring tools and helping them build new products for VR, and that’s only reached Pro Tools so far with the option of 128 objects. It’ll be a while before it gets implemented on other DAW platforms – which I think says a lot about market confidence in Pro Tools. Other systems will catch up with Higher Order Ambisonics in time, but at the moment Reaper is the main choice if you’re already heading down that path.”

How does this affect monitoring? Are clients expected to come into a facility, strap on headsets and float off into another green world for a while? “It depends on the client,” says Marshall. “I’ve done a few reviews where we’ve done a Dolby Atmos mix, but obviously it’s not true VR or 360°, because you can’t experience the spatial audio individually. We can mix dynamically if everyone’s on headphones, to excite the programme – or we can send them the files, depending on the format.

“That’s where the final market is anyway – on headphones. It’s not for large-format screening, although it’s interesting that Dolby’s VR tools are very similar to the Atmos tools. I think there’s a roadmap to being able to convert large-format cinema mixes into VR…”

Ambisonics is also inspiring a new wave of binaural recording, with engineers setting out on location to find good content ready for a new generation of audio post-production. Henrik Opperman (pictured) travels the world just like David Attenborough’s sound guy, but this time it’s Ambisonic. “We do a lot of 360 videos as well as VR,” he reports, “but whether the target is channel-based, for video, or object-based, for VR, it’s the same source capture: the AMBEO mic (pictured top) does it all. Next destination is Bermuda…”

“There’s a re-connection with production audio,” agrees Marshall, “which was broken down over the years in traditional post. Again, there were plenty of sound recordists at BVE and I’m now getting recordists involved in the whole process right through to post. They’re a new generation, very technology-savvy, who can take those free tools and SDKs and contribute hugely to new designs and products. There are a lot of different things happening at the same time, including, right now, the building of new AMBEO and other spatial sound libraries.”

USEFUL OBJECTS

Object-based audio replaces discrete channels with moveable – or at least adjustable – spot sources, anything from an individual actor’s voice to a footstep. If these are decoded in suitable playback systems they can generate a soundscape that builds an illusion of three-dimensional space, although 3D is not the only type of application.
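A minimal sketch of the idea in Python, with illustrative names only: an ‘object’ is just a mono signal plus metadata, and the renderer decides how to translate that metadata into gains for whatever playback system is present. Commercial renderers such as Dolby Atmos or MPEG-H use far more sophisticated panning and binaural processing, but the principle is the same.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class AudioObject:
    """A single 'spot source': an audio signal plus positional metadata."""
    signal: np.ndarray      # mono audio samples
    azimuth_deg: float      # 0 = straight ahead, +90 = hard left
    gain: float = 1.0

def render_to_stereo(obj: AudioObject) -> np.ndarray:
    """Render one object to stereo with constant-power amplitude panning.

    Purely illustrative: a real object renderer maps the same metadata to
    whatever speaker layout or headphone feed exists at playback time.
    """
    # Map azimuth to a 0..1 pan position (hard right .. hard left).
    pan = np.clip((obj.azimuth_deg + 90.0) / 180.0, 0.0, 1.0)
    left = np.sin(pan * np.pi / 2)
    right = np.cos(pan * np.pi / 2)
    return np.stack([obj.signal * obj.gain * left,
                     obj.signal * obj.gain * right])

# A footstep panned halfway to the left.
step = AudioObject(signal=np.random.randn(4800) * 0.1, azimuth_deg=45.0)
stereo = render_to_stereo(step)
```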

Chris Baume (pictured) is a senior research engineer at the BBC’s R&D Department in London. “We’re looking at new experiences that we can provide for the audience, and we’ve grouped what we can deliver using object-based audio into four categories: Immersive is one of them, which covers 3D sound, but there’s also Interactive, Personalised and Accessible.”

‘Interactive’ includes a project called Venue Explorer: a web app allowing users to ‘direct’ their own images and sounds from a live feed. Under ‘Personalised’, for example, dynamic range compression in your headphones is adjusted automatically as the mic in the smart device detects a passing truck, or similar – making your mobile media a little more resilient to real-world acoustic conditions.
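As a rough illustration of the personalised-compression idea – the function and numbers below are hypothetical, not the BBC’s implementation – the device’s measured ambient noise level could simply be mapped to how much compression the decoder applies:

```python
def drc_amount(ambient_noise_db_spl: float,
               quiet_db: float = 40.0,
               loud_db: float = 80.0) -> float:
    """Map measured ambient noise to a dynamic-range-compression amount (0..1).

    Illustrative only: the device's mic estimates how noisy the environment
    is (a passing truck, say) and the decoder applies more compression so
    quiet programme detail stays audible.
    """
    amount = (ambient_noise_db_spl - quiet_db) / (loud_db - quiet_db)
    return min(max(amount, 0.0), 1.0)

# Quiet room -> almost no extra compression; busy street -> heavy compression.
print(drc_amount(35.0))   # 0.0
print(drc_amount(72.0))   # 0.8
```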

The ‘Accessible’ experience gives listeners the means to customise basic mixes, such as the balance between sports commentary and crowd noise or between dialogue and M&E. “It’s quite a simple change, but is the starting point for many potential applications,” adds Baume.

Binaural output is at the heart of the BBC’s Immersive experiments, used to spatialise audio content in many different ways. Something called the Orpheus project (pictured) has been set up in-house to develop the tools and protocols to do this more effectively, Baume explains. “This is where we’re building the infrastructure we need to produce and deliver object-based audio to the home, and although it doesn’t have to mean a complete overhaul of how a broadcast studio works there will be improvements. New pre-production tools will include a lot more metadata, for instance, so production can track and use the objects meaningfully.”
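What that extra metadata might look like, in deliberately simplified form – the field names here are illustrative, not the Orpheus project’s actual schema – is a per-object record carried alongside the audio essence:

```python
import json

# Hypothetical per-object metadata of the kind a production tool might track
# through the chain. Field names are invented for illustration only.
dialogue_object = {
    "label": "commentary",
    "language": "en",
    "start_time_s": 12.5,
    "duration_s": 45.0,
    "position": {"azimuth_deg": 0.0, "elevation_deg": 0.0, "distance_m": 1.0},
    "gain_db": 0.0,
    "importance": "high",        # e.g. must survive a simplified mobile render
    "user_adjustable": True,     # enables the 'Accessible' balance controls
}

print(json.dumps(dialogue_object, indent=2))
```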

DIMENSION, THE WAR

All of this chimes with the new priorities of interactivity, audience power and an increasingly ‘unmediated’ media. But it’s not all Young Turks with Wi-Fi: the legacy boffins are onto it. The Fraunhofer Institute in Germany is now working on MPEG-H, Part 3 of which is concerned with securing interoperability standards for 3D audio; while the Audio Engineering Society’s Technical Council (AESTC) has just convened a new board of experts to address what it calls New Realities: the development of audio in ‘Virtual, Augmented and Mixed Reality environments’.

Even if VR is asking for a big lifestyle change – Nintendo Wii meets sensory deprivation – its smaller sibling Augmented Reality will crop up everywhere: your regular senses are still in touch with ‘consensus’ reality but helped along by prompts, cues and enhancements that understand where you are. Think SatNav on steroids.

So give the smartphone kids a break. They are driving our future.

www.visualise.com

www.bamsound.com

www.1618digital.com

www.bbc.co.uk/rd

sennheiser.com
