AR: AVIARy

Audio Visual Interactive Augmented Reality (AVIARy)

Sonification using Augmented Reality techniques.

An overview of the Ozviz 2004 presentation and system specification can be found here: AVIARy: Morse, Barrass, Barrass, Adcock, Jacob (2004):

AVIARy_Final_Ozviz2004_web

Video:

What you see is a stage with four “consumers” sitting around a dinner table (bottom left). The table has a camera suspended above it, whose video feed is projected onto a large screen, so the audience simultaneously has a top-down view. The performers are served “dishes” of AR fiducial markers, upon which images of the audience and various body parts are composited, driven by the AVIARy system (based upon ARToolkit). ARToolkit uses pattern-recognition algorithms to recognise and track fiducial markers (e.g. cards bearing unique patterns) via a camera attached to a computer, assigning 3D cartesian coordinates to each marker and then compositing computer-generated content upon it in real time. This enables new types of interaction with virtual content, beyond conventional mouse/keyboard/joystick-style interfaces – and in the case of AVIARy it was extended to fully spatialised audio driving an OSC audio system.
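To make that pipeline concrete, here is a minimal sketch of the marker-to-sound loop. This is not AVIARy’s actual code: AVIARy was built on ARToolkit/JARToolkit, whereas the sketch uses OpenCV’s ArUco module and the python-osc library as readily available stand-ins, sends only a 2D marker centroid rather than a full 3D pose (which would require camera calibration), and the OSC address and port are illustrative assumptions.

```python
# Minimal sketch of the marker-tracking -> OSC pipeline described above.
# AVIARy itself was built on ARToolkit/JARToolkit; OpenCV's ArUco module and
# python-osc stand in for it here. The OSC address (/aviary/marker) and
# port (57120) are illustrative assumptions, not AVIARy's actual protocol.
import cv2
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 57120)  # an OSC-aware audio engine
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # the camera suspended above the table
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        for marker_id, quad in zip(ids.flatten(), corners):
            # Centroid of the marker's four corners, normalised to 0..1,
            # so the sound engine is independent of camera resolution.
            cx = float(quad[0][:, 0].mean()) / frame.shape[1]
            cy = float(quad[0][:, 1].mean()) / frame.shape[0]
            client.send_message("/aviary/marker", [int(marker_id), cx, cy])
```

An OSC-aware synthesis engine (e.g. SuperCollider or Pure Data) listening on that port can then map marker identity and position to sound parameters.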

Edible Audience by The Consumers (2005)

This performance arose as a consequence of the development of AVIARy – Audio Visual Interactive Augmented Reality – a system developed in collaboration between the University of Melbourne (Peter Morse, Tim Barrass) and the CSIRO Advanced Audio Interfaces Group (Stephen Barrass, Matt Adcock). It utilises JARToolkit, a Java binding to the HITLab’s ARToolkit, and was an experiment in audio augmented reality.

In concert with sound-artist Alistair Riddell and performers Anita Fitton (ANU) and Caitlin Wall, we staged this event for the Liquid Architecture 6 Sound Festival at the National Gallery of Australia, Canberra.

The blurb went:
“To tantalise all senses, Liquid Architecture also presents ‘Edible Audience’ by The Consumers. Four guests dine on auditory aperitifs, feasting on the faces of the audience to produce a gastronomic sound-scape. This interactive sound art interface was programmed by Tim Barrass, and is performed by Stephen Barrass, Anita Fitton and ‘Onaclov’ [apparently ‘Volcano,’ backwards]  from the University of Canberra, and Alistair Riddell from the Australian National University. The Augmented Reality System (AVIARY) used to track the food was developed in collaboration with Peter Morse from the University of Melbourne.”

As the video demonstrates, the system was overly sensitive to illumination and shadow and triggered many uncontrolled/undesired sound-events – we had very little time to finesse the performance context, but the basic principle is there (a simple fix for the flickering is sketched below). Other interesting ideas we didn’t have time to explore properly included using vegemite-on-toast as our fiducial markers – this would have been fun and rendered them actually edible (by Australians at least).
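For the record, the usual fix for this kind of flicker is hysteresis: require a marker to be present (or absent) for several consecutive frames before triggering (or releasing) its sound event. A minimal sketch – the frame thresholds are purely illustrative, and AVIARy had no such filter at the time:

```python
# Debounce flickering marker detections: a marker must be seen (or lost)
# for several consecutive frames before its sound event changes state.
class MarkerDebouncer:
    def __init__(self, on_frames=5, off_frames=10):
        self.on_frames = on_frames    # frames of presence before "note on"
        self.off_frames = off_frames  # frames of absence before "note off"
        self.seen = {}                # marker id -> consecutive presence count
        self.missing = {}             # marker id -> consecutive absence count
        self.active = set()           # markers currently sounding

    def update(self, visible_ids):
        """Feed the ids detected this frame; returns (note_ons, note_offs)."""
        visible = set(visible_ids)
        ons, offs = [], []
        for mid in visible:
            self.seen[mid] = self.seen.get(mid, 0) + 1
            self.missing[mid] = 0
            if mid not in self.active and self.seen[mid] >= self.on_frames:
                self.active.add(mid)
                ons.append(mid)
        for mid in set(self.seen) - visible:
            self.seen[mid] = 0
            self.missing[mid] = self.missing.get(mid, 0) + 1
            if mid in self.active and self.missing[mid] >= self.off_frames:
                self.active.discard(mid)
                offs.append(mid)
        return ons, offs
```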

Needless to say, a bit more work on the system and environment would facilitate a far more controllable performance event.

AVIARy Development:

Example video of AVIARy in development, 2004 (skip through for commentary):

This video from 2004 is an early demonstration of the AVIARy system: Augmented Reality sonification using JARToolkit, developed by Tim Barrass, Stephen Barrass, Tom Jacob, Matt Adcock and Peter Morse. A project from the University of Melbourne and the CSIRO Advanced Audio Interfaces Group, 2004. Based upon the Beehive system developed by CSIRO.

It’s pretty rough and is included here mainly as a demonstrator for work that could be further developed – the technology is now far more capable than shown in this early demo.

In this compilation video we see AR interaction with sound networks, FM synthesis and modulation, artefact swarming and generative synthesis – spatialised using AR fiducial markers and human interaction.
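As a rough illustration of the FM part, here is how a marker’s normalised position might be mapped to a simple two-operator FM voice. The mapping and parameter ranges are my assumptions for illustration; AVIARy’s actual synthesis ran in an external OSC-driven audio engine.

```python
# Sketch of an FM mapping of the kind shown in the video: a marker's
# position drives carrier/modulator parameters. The ranges below are
# illustrative assumptions, not AVIARy's actual synthesis graph.
import numpy as np

def fm_tone(cx, cy, duration=1.0, sr=44100):
    """Map a normalised marker position (cx, cy in 0..1) to an FM tone."""
    t = np.linspace(0, duration, int(sr * duration), endpoint=False)
    carrier_hz = 110.0 * 2 ** (cx * 3)   # x: pitch over three octaves
    ratio = 1.0 + round(cy * 6)          # y: harmonicity (modulator ratio)
    index = 2.0 + cy * 8                 # y also deepens the modulation
    # Classic two-operator FM: sin(2*pi*fc*t + I*sin(2*pi*fc*ratio*t))
    modulator = index * np.sin(2 * np.pi * carrier_hz * ratio * t)
    return np.sin(2 * np.pi * carrier_hz * t + modulator).astype(np.float32)
```

Spatialisation (e.g. panning by marker position) is omitted here for brevity.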
