ORPHEUS

Object-based Radio Studio

As part of the EU-funded ORPHEUS collaborative project, we designed and built a new type of radio studio from the ground up, one of the first of its kind in the world. The studio was created to demonstrate the concept of object-based audio, where individual audio sources are kept separate and only combined at the last possible moment, rather than being mixed in the studio. This allows the audio to be personalised to individual listeners, accounting for the device they’re listening on, the place they’re listening in, or their auditory needs. It also unlocks the ability to create interactive listening experiences.
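
As a rough illustration of the idea (not the actual ORPHEUS renderer, which uses full spatial rendering driven by standardised metadata), the sketch below mixes separately delivered source signals into a single output at playback time, applying per-object gains that a listener might set, for example boosting dialogue for intelligibility. The object names and gain values are made up for the example.

```python
# Minimal sketch of object-based mixing at the receiver (illustrative only;
# the real chain renders objects spatially under metadata control).
import numpy as np

def render(objects: dict[str, np.ndarray], gains: dict[str, float]) -> np.ndarray:
    """Mix separately delivered audio objects with per-listener gains."""
    length = max(len(sig) for sig in objects.values())
    mix = np.zeros(length)
    for name, signal in objects.items():
        padded = np.pad(signal, (0, length - len(signal)))
        mix += gains.get(name, 1.0) * padded
    return mix

# Example: a listener boosts dialogue relative to background ambience.
sr = 48_000
objects = {
    "dialogue": np.random.randn(sr),   # stand-ins for the real source signals
    "ambience": np.random.randn(sr),
}
listener_gains = {"dialogue": 1.0, "ambience": 0.4}
output = render(objects, listener_gains)
```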


Credit: BBC R&D

With object-based audio, the sources must go through a rendering process to be mixed, and that process is controlled by metadata. For our studio, we built a web application to generate that metadata, which essentially acted as a software-based mixing console installed on a large touchscreen. The audio itself was distributed over IP networks, allowing it to be routed between any connected BBC sites.
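
To give a flavour of what such metadata might describe, here is a simplified, hypothetical sketch of per-object scene data of the kind a mixing interface could emit. The real studio used the Audio Definition Model, which is an XML format, so the field names and JSON form below are purely illustrative.

```python
# Hypothetical, simplified object metadata emitted by a mixing interface.
# The real studio used the Audio Definition Model (XML); names here are illustrative.
import json

scene = {
    "objects": [
        {"id": "narrator", "gain": 1.0,
         "position": {"azimuth": 0.0, "elevation": 0.0, "distance": 1.0}},
        {"id": "ambience", "gain": 0.5,
         "position": {"azimuth": 90.0, "elevation": 10.0, "distance": 2.0}},
    ]
}

# A renderer downstream would read this metadata alongside the audio streams
# and decide how to mix each object for the listener's device and preferences.
print(json.dumps(scene, indent=2))
```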

To test our prototype studio, we commissioned BBC Radio Drama to produce an original interactive drama called The Mermaid’s Tears. At any point in the drama, the audience could switch between any of the three characters to hear the story from their perspective. To test the real-time aspects of the studio, we staged a special live performance of the drama that was broadcast to web browsers, iPhone apps, and AV receivers. This early work contributed to the BBC’s research on object-based audio, including the Audio Definition Model and the EAR Production Suite.

Related links