Syn[a]: Project Outline

Syn[a] is a group of creative people (artists, inventors, engineers, scientists) developing and experimenting with systems to visualise and sonify the biometric data of an orchestral performance over ultra-high-bandwidth networks. We bring extensive experience and plenty of ideas to the project, and we intend to make new things with them.

Needless to say, this page is in progress.

See the project's page on Google Plus.

The core team is:

– Dr Peter Morse, Director, Visualisation & Project Management (Hobart, TAS). Peter Morse has an extensive background in ultra-high-definition and stereoscopic immersive visualisation and augmented reality systems. He is currently Digital Consultant with the Tasmanian Symphony Orchestra (TSO). See: http://www.petermorse.com.au

– Dr Stephen Barrass, Audio Interfaces, Software Development (Canberra, ACT). Stephen Barrass is an Associate Professor of Digital Media at the Faculty of Arts and Design at the University of Canberra. He previously worked as a research scientist in the Advanced Audio Interfaces group at the CSIRO ICT Centre, Canberra.

– Michela Ledwidge, Artist (Sydney, NSW). She is a director of creative studio MOD Productions and media cloud service Rack&Pin. She teaches Remixable Media at the University of Sydney and has directed numerous interactive projects, including work for the Australian Chamber Orchestra, Nintendo, and the BBC. See: http://michelaledwidge.com

– Tim Barrass, Augmented Reality Interfaces, Software Development (Melbourne, VIC). Tim Barrass holds a Master's degree in Digital Media, with a focus on software development for emergent and complex systems. He has extensive experience in AR programming and performance, and is an active musical performer and composer.

– Rodney Berry, Augmented Reality Interfaces, Software Development (Hobart, TAS). He is currently Geek in Residence at the Salamanca Arts Centre, Hobart. He was previously a research fellow in the Communication and New Media Programme of the Faculty of Arts and Social Sciences at the National University of Singapore, and from 1999 to 2006 a research scientist at ATR Media Information Science Laboratories, Kyoto.

– Paul Bourke, Visualisation Interfaces, Software Development (Perth, WA). Paul Bourke is a Research Associate Professor and Director of the Western Australian Supercomputer Program, a facility of iVEC at the University of Western Australia. He is an internationally recognised authority on visualisation techniques and has worked extensively on the visualisation of brain dynamics.

– Brett Rosolen, Technical Manager, AARNet.


What

  • We are developing a system to visualise and sonify the biometric (e.g. EEG, ECG, temperature), spatial and audience-interaction data collected during an orchestral performance. The system will be re-deployable for future concerts and will be context- and content-agnostic.

How

  • The system will be developed by a team of artists and researchers who have long worked in visualisation, advanced audio interfaces and augmented reality, and it builds on their research in these fields. It will use a variety of data sensors (e.g. Enobio wireless EEG/ECG, Emotiv EEG, thermal imaging cameras, spatial image processing, location-aware AR) to generate data streams that are processed by custom software, visualised with sophisticated graphics algorithms, digitally projected, and distributed to mobile platforms (e.g. iPhone, Android, iPod Touch) for audience interaction. We are also interested in high-speed wireless networking and the potential to develop artistic works for the NBN. A minimal sketch of the sensor-to-output data flow appears below.
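By way of illustration, here is a minimal Python sketch of one strand of that pipeline: a window of EEG samples is reduced to a relative alpha-band power value and forwarded over OSC to separate visualisation and sonification processes. Everything specific in it is an assumption rather than project code: the sensor feed is simulated with random samples (a real feed would come from the Enobio or Emotiv SDKs), and the hosts, ports and /syna/... OSC addresses are hypothetical placeholders.

```python
# Minimal sketch: one EEG channel -> alpha-band power -> OSC messages.
# Assumptions (not project code): the sensor read is simulated; a real
# feed would come from the Enobio or Emotiv SDKs. Hosts, ports and the
# /syna/... OSC addresses are hypothetical placeholders.

import time
import numpy as np
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

SAMPLE_RATE = 250           # Hz, typical of wireless EEG headsets
WINDOW = 2 * SAMPLE_RATE    # analyse two-second windows
ALPHA_BAND = (8.0, 13.0)    # alpha rhythm, in Hz

def band_power(samples, band, rate):
    """Fraction of total spectral power falling inside `band` (Hz)."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    in_band = (freqs >= band[0]) & (freqs < band[1])
    return float(spectrum[in_band].sum() / spectrum.sum())

def read_window():
    """Stand-in for a real sensor read: returns WINDOW raw samples."""
    return np.random.randn(WINDOW)

visuals = SimpleUDPClient("127.0.0.1", 9000)  # visualisation process
sound = SimpleUDPClient("127.0.0.1", 9001)    # sonification process

while True:
    alpha = band_power(read_window(), ALPHA_BAND, SAMPLE_RATE)
    visuals.send_message("/syna/performer/1/alpha", alpha)
    sound.send_message("/syna/performer/1/alpha", alpha)
    time.sleep(WINDOW / SAMPLE_RATE)
```

OSC is a natural transport for this kind of fan-out, since most of the environments likely to sit at the receiving end (Processing, Max/MSP, SuperCollider, custom AR clients) can consume OSC messages directly.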


Documentation

Work in Progress:

Funded by the Australia Council for the Arts via the 2011 Digital Culture Fund.
