
Poster


Modern Collectibles: Implementing Procedural and Spatial Audio in ARKit with CoreML

This study investigates how immersive audio design for AR can bring the objects and the stories behind ‘Modern Collectibles’ to life in users’ hands, invoking specific emotional responses and greater engagement with the show’s subject matter and online offerings.


The Discovery Channel partnered with StoryFutures on the ‘Modern Collectibles’ project to produce a TV pilot with an accompanying augmented reality (AR) Apple iOS application. The experience enables users to explore retro objects from the 1980s, using AR to bring collectible items, such as a toy AT-AT, to life in their own homes. Development was carried out in collaboration with Story Things, using technology to augment the traditional space of linear television programming and execute a ‘view and do’ strategy.

An innovative pipeline opened research and development opportunities in audio design and implementation. The Unity game engine and ARKit were used to build an iOS application that leverages the latest behavioural and detection capabilities in an AR scene.

Computer vision (CoreML) was incorporated to obtain gestural data from users’ interactions with an AR object. Touch, hold, and velocity controls of a user’s hand and finger relative to the AR object were developed to drive sound design in real time.
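The mapping from gesture data to sound parameters could look something like the following sketch. This is an illustrative assumption, not the project’s actual code: the function name, the value ranges, and the choice of controls (amplitude, filter cutoff, modulation depth) are all hypothetical stand-ins for the real-time controls described above.

```python
# Hypothetical sketch: mapping CoreML-derived gesture data (touch state,
# hold duration, fingertip velocity) to normalised sound-design controls.
# All names and ranges here are illustrative assumptions.

def clamp(x, lo, hi):
    """Constrain x to the range [lo, hi]."""
    return max(lo, min(hi, x))

def gesture_to_audio_params(touching, hold_seconds, velocity):
    """Map raw gesture values to synthesis controls in the range 0..1."""
    # Touch gates the sound on and off.
    amplitude = 1.0 if touching else 0.0
    # Longer holds open a (hypothetical) low-pass filter; saturates at ~2 s.
    cutoff = clamp(hold_seconds / 2.0, 0.0, 1.0)
    # Faster hand movement deepens pitch modulation (velocity in px/s).
    mod_depth = clamp(velocity / 500.0, 0.0, 1.0)
    return {"amp": amplitude, "cutoff": cutoff, "mod": mod_depth}
```

In a per-frame update loop, these normalised values would then be forwarded to the synthesis layer each time new gesture data arrives.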

Pure Data (Pd) ‘patches’ were integrated using the embeddable library ‘libpd’. Unity C# scripts combined and communicated CoreML values, and the data were parsed through the ‘libpd’ layer to Pd patches containing receiving objects. Interaction, animation, and scale of the AR object parametrically drove synthesis in real time.
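The signal path described above (AR object properties routed through the ‘libpd’ layer to named Pd receiving objects) can be sketched as follows. The receiver names and the `send_to_pd` helper are hypothetical: the helper stands in for a real libpd float-send binding, and the dictionary stands in for the state of a running Pd patch.

```python
# Hypothetical sketch of the Unity -> libpd -> Pd signal path.
# Parameter values are sent as floats to named Pd [receive] objects.
# send_to_pd stands in for a real libpd binding; the dict stands in
# for the patch state so the routing can be shown self-contained.

pd_receivers = {}  # receiver name -> last value delivered to the patch

def send_to_pd(receiver, value):
    """Deliver a float to a named [receive] object in the Pd patch."""
    pd_receivers[receiver] = float(value)

def drive_synthesis(scale, animation_phase, velocity):
    """Route AR object properties to their own (assumed) Pd receivers."""
    send_to_pd("obj-scale", scale)        # object scale drives e.g. pitch
    send_to_pd("anim-phase", animation_phase)  # animation drives timbre
    send_to_pd("hand-velocity", velocity)      # gesture drives intensity
```

Calling `drive_synthesis` once per frame mirrors the parametric, real-time control of synthesis described above, with each AR property feeding a dedicated receiving object inside the patch.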

Early testing indicated that immersive audio design coupled with gestural control has a positive impact on engagement in the context of AR, and contributes to bringing ‘modern collectibles’ to life by invoking nostalgia.

Richard Hemming

BIO

Rich is a Doctoral Researcher in implementing procedural and spatial audio in virtual environments within the Electronic Engineering department. His PhD focuses on assistive technology for the blind and visually impaired, creating a ‘visual-auditory sensory substitution device’ to aid spatial awareness and navigation. He was Head of a Game Audio BA/MA and is involved in audio design for immersive experiences, pursuing research and development opportunities in commercial and academic projects.


BEYOND 2021 partners
