It’s no surprise that such a fundamental sonic breakthrough as TiMax, invented by UK innovator Out Board, should appear in so many disparate applications.
A brief glance at the portfolio takes you on a dizzying journey from opera at The Royal Albert Hall to enveloping electronica at superclubs Fabric and Matter; from the shores of a Swiss lake, where they like to hear the music of Jesus Christ Superstar walking on the water, to the South Bank of the River Thames where spatialised audio mingles with the fog; and from theatre in-the-round at The Young Vic to the Edinburgh Military Tattoo. Out Board is out there.
To localise audio convincingly is to overcome the basic limitations of any sound reinforcement system. There are more ears than speakers at most gigs, and not enough audio to share equally – a problem noticed by Out Board founder Robin Whittaker, who made solving the inequities of sight and sound his company’s mission. It coincided with the birth of digital pro audio in the UK, which provided the means and the manpower to create TiMax, the world’s first time delay matrix for sound.
Whittaker had, naturally enough, entered an analogue audio industry, and first became aware, in 1980, that audio could be converted from an analogue phenomenon into a digital medium.
“I went to an AES convention at Kensington Olympia in London and saw the Neve DSP-1 recording console,” he remembers. “It was basically a television screen and a rack of computers, and the company was demonstrating a one-input, one-output mixer that promised to alter the future radically. That same year, I think, the Trident Di-An was launched, probably the first entirely digitally controlled analogue desk. This was prior to Sony DASH and the other digital tape machines, which also made an impact.”
Whittaker had just left university and had spent some time working at Burns Guitars, then with Francis Williams and Technicord making compact analogue mixers. The digital thing percolated gradually… “It was the slowest overnight revolution that ever hit the world!” he says. “It took decades for digital audio to become a workable reality beyond those early innovations. I spent several years designing mixing consoles for Dynamix Audio and later freelancing in recording studio maintenance, contract design work and so on. With one colleague I hatched a plan to make a fader automation system, which meant a leap into digitally controlled analogue circuitry with servomotors to record and replay fader movements. That was the original impetus to form Out Board Electronics.
“I reckon we were first on the block with a bespoke automation system for live sound, which we retrofitted to various consoles used on some big tours – Phil Collins and Rod Stewart included, using Midas or Soundcraft. We also made some small, standalone fader automation boxes with digitally controlled analogue routing matrices, and these brought us into contact with Autograph Sound: the first production of Miss Saigon at Drury Lane in 1989. Andrew Bruce wanted to be able to dynamically reinforce the sound of a helicopter flying around the space, and our matrix solved that.”
Digital audio became the main focus almost a decade later, with the first TiMax.
“I became aware that if you want to move sound in space, it will always work against you if you don’t control sound in the time domain. You have to master the power of the Precedence Effect. When I first read the Helmut Haas paper on precedence in the AES Journal, during some downtime in this studio in London where I was freelancing, I thought I’d re-create the Haas experiment: I took a digital delay line and patched it into one side of the studio monitoring… and was absolutely gobsmacked at the result. With just a millisecond of time delay in one half of the mix, I couldn’t hear the other half.
“It was the first part of a long ‘eureka!’ moment. Some years later we’d made a sound effect matrix for Autograph using the moving faders, on a show called City Of Angels, and there was a moment when an a cappella vocal group appeared from behind some curtains. Watching this, my eyes told my ears to expect the vocals where the singers stood, but as the curtain went up what I heard came from completely the other side of the stage. The dislocation was overwhelming. It was at that moment that I thought if I could combine this Haas Effect imaging phenomenon with vocal reinforcement in a multi-speaker system, I could fix that localisation problem. That was the rest of the eureka moment.”
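The experiment Whittaker describes is easy to sketch in code. What follows is a purely illustrative reconstruction – the sample rate, delay value and test signal are assumptions, not details from his studio session – showing how delaying one channel of an otherwise identical stereo pair by about a millisecond gives the undelayed side the timing lead that makes it dominate the perceived image.

```python
# Hypothetical sketch of the Haas-style experiment: patch a short digital
# delay into one side of a stereo pair and let the Precedence Effect do
# the rest. Parameters are illustrative assumptions.

SAMPLE_RATE = 48_000   # assumed sample rate, Hz
DELAY_MS = 1.0         # the roughly 1 ms delay Whittaker describes

def delay_channel(samples, delay_ms, sample_rate=SAMPLE_RATE):
    """Return the channel delayed by delay_ms, zero-padded at the start."""
    n = round(sample_rate * delay_ms / 1000.0)  # delay in whole samples
    return [0.0] * n + list(samples)

# A short click: one full-scale sample followed by silence.
click = [1.0] + [0.0] * 9

left = click                             # direct channel: arrives first
right = delay_channel(click, DELAY_MS)   # same signal, 48 samples later

# Both channels carry identical level, but the ear localises toward the
# earlier (left) channel: the image collapses to the undelayed side.
print(len(right) - len(left))  # -> 48 samples of lead at 48 kHz
```

Note that nothing here changes level – the localisation shift comes entirely from the time offset, which is exactly why a delay matrix, rather than a level-only matrix, was needed.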
One round of Department of Trade & Industry funding later, a new R&D team was gathered in a small Cambridgeshire HQ.
“I knew a couple of people in the digital audio game, especially among the Neve developers who’d worked on the Capricorn digital console or the AudioFile DAW,” continues Whittaker. “They were based near Cambridge, where I was, so when AMS took over Neve and moved it to Lancashire there was a fair amount of talent in my neighbourhood. Plus, after the Yamaha 01V was launched it became obvious that all the major console manufacturers were going to make their own fader automation systems, and the retrofit business was tough anyway – the first thing you do, of course, is invalidate the warranty! Out Board had to go in this direction.”
The team worked with the first iteration of Analog Devices’ SHARC processor, and after a few teething troubles, a chance conversation with a fellow alumnus provided a breakthrough. “I was at college with John Stadius,” Whittaker explains, “and I knew he was using SHARCs on the Soundtracs DPC-1 console. He told me about Analog Devices’ ‘Exceptions List’: a well-hidden document outlining the processor’s limitations and how to get round them! Once we had that information things began to fall into place, and we set about writing some front-end control software.”
That wasn’t easy, either. Whittaker poses the question: “If you’ve got a 64×64 digital audio matrix with time delay at every cross-point, how on Earth do you use it? That was where the out-of-the-box, blue-sky thinking came in. It was obvious there would have to be a GUI, rather than a hardware control surface. It was too ethereal for any traditional pro audio interface. A brilliant young software designer called Chris Royle helped us knock Version 1 into shape.”
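The core idea behind that matrix can be sketched briefly. This is an illustrative toy, not Out Board’s implementation – the function names, data layout and figures are all assumptions – but it captures the principle: every input-to-output cross-point carries its own gain *and* its own delay, so each speaker can receive the same source at a different time and level.

```python
# Toy delay matrix: each (input, output) cross-point has an independent
# gain and delay in samples. Purely illustrative; not the TiMax design.

def render_matrix(inputs, crosspoints, num_outputs):
    """inputs: list of channels (lists of samples).
    crosspoints[(i, o)] = (gain, delay_samples) for input i -> output o.
    Returns num_outputs mixed output channels."""
    length = max(
        delay + len(inputs[i])
        for (i, o), (gain, delay) in crosspoints.items()
    )
    outputs = [[0.0] * length for _ in range(num_outputs)]
    for (i, o), (gain, delay) in crosspoints.items():
        for n, sample in enumerate(inputs[i]):
            outputs[o][delay + n] += gain * sample
    return outputs

# One vocal input feeding two speakers: the near speaker gets the signal
# first at reduced level; the far speaker gets it 1 ms later (48 samples
# at an assumed 48 kHz) at full level. Level balances coverage while the
# time offset anchors the image at the performer.
vocal = [1.0, 0.5, 0.25]
xp = {
    (0, 0): (0.7, 0),    # output 0: undelayed, quieter
    (0, 1): (1.0, 48),   # output 1: delayed, full level
}
near, far = render_matrix([vocal], xp, 2)
```

Programming thousands of such cross-points per cue is exactly the usability problem the quote describes, and why a GUI was the only workable front end.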
There was no such thing as a delay matrix in those days, and therefore no consensus on how you might drive or programme such a product. Clearly, the success and popularity of TiMax are proof of Out Board’s good judgment – but the story is far from over. “I often think,” reflects Whittaker, “that this type of never-done-before project needs a lot of bold experimentation, rather than a detailed specification beforehand. It takes years to perfect, and it carries on to this day.
“A lot of the ideas we had in TiMax 1 are still being honed, and form an important framework for TiMax 2. Over the past 12 months a couple of major players in sound reinforcement have entered this market – or at least what is often called the surround or spatialisation market – and they place all the control inside the algorithm. We prefer to allow human flexibility over the last dB and the last millisecond, because that’s where the difference is made in localisation and immersion.”
Read the digital edition of Genius!4 here.