The BBC’s Sound: Now and Next event was, according to Frank Melchior, lead audio technologist with BBC R&D and the driving force behind it, designed to bring programme makers, researchers and technologists together to look at what is being done today and how that could move on in the near future (see Now and Next for the BBC).
Now and Next was presented by LJ Rich from BBC News Channel’s Click technology programme, who said the event would show “the nexus between sound and technology”. This was confirmed by the showcasing of what Melchior called “almost four years” of the BBC Audio Research Partnership with the Universities of Surrey, Salford, Southampton, York and Queen Mary University of London, plus displays by DTS, Fairlight, Fraunhofer, Blue Ripple Sound and Dolby.
Spatial sound in its various forms – Ambisonics, object-based and binaural – was a pervading theme, as broadcasters, engineers and producers contemplate the best way to recreate how the world sounds for ultra-HD TV, games and virtual reality. Wildlife sound recordist Chris Watson addressed this in A Journey South, with his recordings and audio diary from trips to the Antarctic making natural history programmes.
He detailed his work with adapted hydrophones to record inside glaciers and SoundField microphones for surround. While saying that immersive technologies put the viewer/listener “in a sense of space”, Watson bemoaned that his efforts to do this were often obscured by having “orchestral music smeared over it”.
Current broadcast transmission systems constrain new immersive audio production, as do the requirements of event coverage. In a session on live broadcast sound, freelance OB sound supervisor Bill Whiston (pictured right) and Olympic Broadcasting Services audio manager Nuno Duarte both commented that, although tests were being made, it could be some time before spatial technologies were a regular component.
Andy Rogers, senior producer for live music with BBC Radio 1 and 1Xtra, commented that there was “a problem in doing immersive”, even though programmes had been made in both 5.1 and binaural in the past. “For a long time we did 5.1 in our trucks but there isn’t a platform for it apart from HDTV,” he said. He added that R1 presenter Rob da Bank had fronted a two-hour binaural special in 2014 and while that was well received “the difficult thing is explaining to listeners that they have to put their headphones on and keep them on”.
In a specific session on immersive sound, Isabel Platthaus, commissioning editor and dramaturg with German public broadcaster WDR, outlined the production processes behind 39, which is both a radio play and an interactive game for mobiles. Working with sound engineer and game designer Achim Fell, who co-presented the talk, Platthaus said the production was built up in layers so it could be both listened to and played with. Fell added that the sounds were mixed together, with the interactivity allowing the level of immersion to be increased.
The potential of immersive sound for virtual and augmented reality was discussed by Varun Nair, co-founder of interactive audio specialist (and Pro Sound Awards finalist) Two Big Ears. Nair said that the three main aspects for 3D sound in VR/AR are panning, elevation (playing with height) and externalisation, for the sense of reality. “But technology is a means to an end,” he concluded. “3D audio gives more tools to tell stories.”
Earlier in the session Martyn Harries, re-recording mixer and senior lecturer in audio and music technology at the University of the West of England, observed that “object-based surround is the way forward”. This was discussed on the second day under the heading of Responsive and Interactive Content. Matthew Brooks, senior engineer with BBC R&D, described how object-based technology was used to create a version of radio documentary The Cornish Gardener that allowed listeners to specify its duration (see The real request show: BBC R&D trials Responsive Radio). (Pictured above right is BBC senior research technologist Jon Tutcher.)
While that had been a re-versioning of an existing linear production, BR-Klassik in Germany has created a responsive radio feature about the beginnings of World War I from scratch. Managing editor Werner Bleisteiner explained that, working with the IRT and BBC R&D under an EBU pilot project, he and his colleagues had assembled a series of clips and music tracks as objects to create what he described as “radio beyond radio”.
Different production tools that can make the creation of responsive and immersive audio material easier and more efficient were the subjects of the final session of Now and Next. Musician Tim Exile shook up the venerable Radio Theatre with his Flow Machine, a looping system that combines MIDI, sampling and delay technologies. Mark Boas, co-founder of Hyperaudio, described the technology behind his company’s name, which is designed to integrate audio into the web in the same way hypertext does for words.
Perhaps most intriguing was the editing and search software conceived by Professor Jörn Loviscach of Bielefeld University of Applied Sciences. Originally designed so that he could easily search videos of his lectures, the program has potential for general audio production, offering visualisation of selected words and automatic highlighting of every “um”.
The possible future of audio in all its forms was addressed by composer and sound artist Nick Ryan, who said the key elements were immersion, interactivity and sonification. He explained the third component as “transforming information into sound”, be it data or from another sense, as in the case of colour.
As newly appointed controller of BBC R&D, Andy Conroy remarked at the start of the event that the intention was to identify techniques and technologies that would “feed the imagination” of audiences. “Someday all broadcast audio will be made this way,” he said hopefully.