Syn[a]: Visualizing Biometric Data from a Musical Performance
In December 2011 the Syn[a] Group, in concert with AARNet and the Tasmanian Symphony Orchestra (TSO), ran a five-day workshop at the University of Tasmania Conservatorium of Music, where we visualised the biometric data of musical performers and transmitted it over AARNet's high-bandwidth network in stereoscopic 3D, creating an immersive augmented telepresence environment.
AARNet News featured an overview of the project:
“In a trial conducted between the Tasmanian Symphony Orchestra, the University of Tasmania’s Conservatorium of Music in Hobart and AARNet in Sydney, the foundations for recreating an immersive musical performance experience were demonstrated.
During January 2012, musicians from the Tasmanian Symphony Orchestra played a variety of pieces that were captured in high-definition 2D and stereoscopic 3D in Hobart and broadcast live across the AARNet network to Sydney.
Several high-quality video streams were broadcast simultaneously to a variety of devices, including an off-the-shelf 3D TV, and a multi-channel audio field was captured and recreated to reinforce the spatial arrangement between the musicians. Additional biometric data was captured in real time using wrist-worn accelerometers and commercial EMG headsets to build visualisations that were composited into the augmented broadcast vision.
Overall, 4 x Full HD resolution (equivalent to 3840 x 2160, or 8 megapixels) was transmitted, and the data rate reached approximately 800 Mbps; the resulting total rate from the University of Tasmania extended well over 1 Gbps, made possible only by the recently built 10 Gbps Basslink circuit.
“Our research demonstrated the viability of next-generation high-bandwidth networks to deliver augmented orchestral performance for immersive environments over AARNet and NBN-equivalent infrastructure” said Project Director, Dr Peter Morse. “This will lead to innovations in music pedagogy and performance for the networked age – bringing orchestral performance to new audiences in exciting and engaging ways that we are only now beginning to explore.”
The trial has resulted in submissions for further funding to a variety of bodies to continue research into remote and augmented performance, including the impacts of using significant bandwidth to reduce latency for simultaneous performance.”
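The bandwidth figures quoted above can be sanity-checked with a quick back-of-envelope calculation. The frame rate and bit depth below are assumptions (the article does not state them); only the 4 x Full HD stream count and the 800 Mbps delivered rate come from the source.

```python
# Back-of-envelope check of the stream bandwidth quoted in the article.
# Frame rate (fps) and bit depth are assumed values, not from the source.

FULL_HD = (1920, 1080)
STREAMS = 4

# Four Full HD streams carry the same pixel count as one 3840 x 2160 frame.
pixels_per_frame = FULL_HD[0] * FULL_HD[1] * STREAMS
print(f"Combined pixels per frame: {pixels_per_frame:,}")  # 8,294,400 (~8 MP)

fps = 30             # assumed frame rate
bits_per_pixel = 24  # assumed 8-bit RGB

raw_gbps = pixels_per_frame * bits_per_pixel * fps / 1e9
print(f"Uncompressed rate: {raw_gbps:.2f} Gbps")

quoted_mbps = 800  # delivered rate quoted in the article
ratio = raw_gbps * 1000 / quoted_mbps
print(f"Implied compression ratio: ~{ratio:.1f}:1")
```

Under these assumptions, the raw video would run to roughly 6 Gbps, so an 800 Mbps delivered rate implies only mild (~7.5:1) compression, consistent with very high broadcast quality over the 10 Gbps Basslink circuit.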
A public Google+ album of photos can be seen here:
The results of this research formed the basis of a performance at the MONA/TSO Synaesthesia Event in 2012:
Review in AussieTheatre.com.au:
Apparently synaesthetic composers Scriabin and Rimsky-Korsakov fought constantly over their “definition” of the F sharp chord: one experienced it as violet and the other as orange. This story was one of many told by synaesthetic musician Andrew Legg who, with a group of artist-technologists, performed three keyboard improvisations while his body and brain were wired up to computers. Projections of real-time digital imaging of Legg’s vital signs ranged from interesting but prosaic colourful graphs to a beautiful, trippy, multicoloured lava-lamp-like animation. Legg’s work, Syn[a]: Clavier a Lumiere, was the closest we came all weekend to experiencing the synaesthete’s inner eye.
Review in The Australian
Article in the Hobart Mercury