SCARI – Kinect – Unity3D – OSX – Skeleton Tracking
A couple of experiments as I figure out how to get SCARI working.
Basically, this involved getting the brilliant Kinect Unity3D OpenNI wrapper demo by Amir working on my Mac Pro.
It was very straightforward, following his instructions (thanks Amir!) – you just need OpenNI+NITE (simple-openni for OSX is here), along with a few extras like MacPorts and a bit of command-line faffing around etc.
So, I know that works.
Next, an example of what I’d like to do with this: using the Kinect to control a first-person controller in Unity3D to walk through an architectural model of the Salamanca Arts Centre (in fact, any 3D model).
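The core of that idea is simple: take a tracked joint position from the skeleton data and turn it into a movement vector for the controller. Here's a minimal sketch of that mapping in Python – in practice this would be a C# script in Unity feeding a CharacterController, and the joint names, dead-zone size and lean-to-walk scheme are all my assumptions, not anything from Amir's demo:

```python
# Sketch: map a tracked torso position (from the Kinect skeleton) to a
# walk vector for a first-person controller. Hypothetical scheme: leaning
# away from a calibrated neutral stance makes the avatar walk that way.

DEAD_ZONE = 0.15  # metres of lean to ignore, so standing still means stopping


def walk_vector(torso, neutral, speed=1.0):
    """Turn lean relative to a calibrated neutral stance into movement.

    torso, neutral: (x, z) positions in metres (Kinect depth-camera space).
    Returns an (x, z) velocity to feed to the controller each frame.
    """
    dx = torso[0] - neutral[0]
    dz = torso[1] - neutral[1]
    # Only move once the lean exceeds the dead zone, per axis.
    vx = speed * dx if abs(dx) > DEAD_ZONE else 0.0
    vz = speed * dz if abs(dz) > DEAD_ZONE else 0.0
    return (vx, vz)
```

The dead zone matters more than it looks: without it, sensor noise in the skeleton data makes the camera drift constantly even when the user is standing still.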
Unity3D is a remarkable free game engine (with paid-for pro versions) that compiles across Windows, Mac OSX, iOS and Android platforms. So, the opportunities are obvious. I think of it not so much as a ‘game engine’ but a ‘visualization engine.’
So, time for some more tinkering – but the basic message is that this is going to work! (famous last words…but really, it’s a matter of time, effort and knowledge – and collaboration.)
Also of interest is the idea of using Paul Bourke’s dome-render technique for Unity, so this could become a way of interactively exploring material projected in a dome environment (like iDome or Fulldome) – that would be very cool.
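For anyone curious what the dome-render technique involves: Bourke's approach (as I understand it) renders the scene to a cubemap and then warps it into an angular fisheye image, where a pixel's distance from the image centre maps linearly to its angle from the view axis. A small Python sketch of that per-pixel mapping, purely illustrative – in Unity the warp is done on the GPU with a mesh or shader, not per pixel like this:

```python
import math


def fisheye_ray(u, v, aperture=math.pi):
    """Direction vector sampled for a pixel in an angular fisheye image.

    u, v: normalized image coordinates in [-1, 1]; aperture is the field
    of view in radians (pi = a 180-degree hemispherical dome). Returns
    None for pixels outside the fisheye circle.
    """
    r = math.hypot(u, v)
    if r > 1.0:
        return None  # outside the image circle: no scene content
    theta = r * aperture / 2.0  # angle from the view axis, linear in r
    phi = math.atan2(v, u)      # angle around the view axis
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```

The centre pixel looks straight down the view axis, and the rim of the circle looks 90° off-axis for a 180° dome – which is why peripheral vision fills so much of the frame in a fulldome projection.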
It would probably be useful – certainly interesting – to be able to render this stereoscopically, so that a user establishes a more coherent sense of the spatial relationship between the proxemic/kinetic space of their own body and immediate environment – and its representation in the virtual environment. Of course, practice would build up this depth correlation (just as it does with the Kinect, an Xbox and a TV), but I wonder whether stereoscopic depth-cues would speed this process up? In the dome there are other types of peripheral-vision depth-cues that could presumably also be effective, obviating the need for stereopsis.
An interesting experiment to run for this will be to boot into Windows 7 under Boot Camp, recompile all these Unity demos & libraries, and then see if I can drive my various stereoscopic monitors using the Nvidia 3D toolkit – I've had this working on my Mac Pro for a while now, with my Alienware 120Hz stereo desktop monitor and Nvidia stereo3D glasses (active stereo). BUT, is it then possible to drive the signal over an HDMI 1.4a cable to my Panasonic TH-P50VT20A 50″ stereoscopic 3D plasma monitor? (cnet review) Via the Apple adaptors? Probably not at this stage, but there's no harm in trying.
Hmm. Much more fun to come.
Just wondering how you’ve made out with this since March. I’m also working with Paul Bourke’s fisheye lens in Unity, and with Kinect (via Zigfu’s installer and demos). Still putting the pieces together. Watching your posts on G+. Cheers.
It’s currently on the back-burner, due to my failure to secure any funding for the work – but it’s something I hope to revisit during 2012.