Real-Time Cosmological Simulation Visualization for VR

I’m very taken by the possibilities of Voluminous – a real-time volumetric visualization system developed by Drew Whitehouse of the ANU NCI Vizlab in 2011 – and I’ve used it myself for a few what-is-possible demos, as it is such an exemplary system. It’s a kind of web front end for Drishti. It occurred to me that it would be interesting to explore its use in visualizing large-scale cosmological simulations – in the manner we did in Dark.

(Update 2021 – Voluminous is no longer available online, so I’ve removed any live links)

Voluminous renders large volumetric datasets in the web browser, using WebGL. But how can this be done, given that it’s a big challenge to send large datasets over narrow data pipes (e.g. home systems, mobile devices, wifi), and many GPUs are not capable of loading, let alone rendering, such big datasets? The answer is low-res sampled proxies – and, thanks to some clever architecting, replacing a selected view with a streamed-in high-res render.

Drew sent me a system diagram roughly indicating how this works:

Voluminous System diagram. ©2013 Drew Whitehouse NCI Vizlab

Basically the idea would be to use the Voluminous back end to explore some very large astrophysical data sets (HDF5 files) – like the ones we used in Dark, but with some additional features built in for stereoscopic rendering and VR display.
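To make this concrete, here is a minimal Python sketch of the first part of that idea – the file name, the “density” dataset and the stride are placeholders I’ve made up – reading a big HDF5 grid with h5py, pulling out its metadata, and building a small strided proxy that’s cheap to ship to a viewer:

```python
# Minimal sketch (hypothetical names): open a large HDF5 density grid,
# inspect its metadata, and build a coarse low-res proxy by strided
# sampling so the full volume never has to travel over the wire.
import h5py
import numpy as np

STRIDE = 8  # e.g. a 2048^3 grid becomes a 256^3 proxy

with h5py.File("halo_density.h5", "r") as f:
    dset = f["density"]                      # full-resolution volume, stays on disk
    print(dict(dset.attrs), dset.shape)      # metadata the query GUI can draw on
    proxy = dset[::STRIDE, ::STRIDE, ::STRIDE].astype(np.float32)

# Normalise to 8-bit so the proxy is cheap to store and stream
lo, hi = proxy.min(), proxy.max()
proxy8 = np.clip((proxy - lo) / (hi - lo + 1e-12) * 255, 0, 255).astype(np.uint8)
np.save("density_proxy.npy", proxy8)
```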

The guts of the idea is doing Dark (or something similar) in near-real-time immersive VR, on both VR headsets & 4K dome screens simultaneously. Easier said than done – and, omitting all the movie stuff, it’s the data viz I’m interested in at this point.

What I’m imagining is similar to the current use of Voluminous, but using Unity3D and Oculus Rift (or another VR prototype) as the graphics engine/viewer (i.e. full OpenGL rather than WebGL, multi-platform support, creating “apps” rather than relying on web browsers, immersive stereo, etc.).

SO – for instance –

1] you would have an HDF5 astrophysical dataset hosted somewhere in a database, about which you know quite a few things – via metadata

2] A query-able low-res proxy is generated from that (necessary if it is hundreds of megabytes or gigabytes in size) – consult with astrophysicists on how best to sample it, what’s important to preserve, etc.

3] via some GUI, e.g. in Unity, one defines a region of interest – “let’s look at this bit” – which becomes the query (i.e. a voxel-selection GUI)

4] this is viewed via a VR headset in stereo 3D – or on a screen, it doesn’t matter (but I like the immersion aspect of VR plus, e.g., the LEAP controller – which hasn’t yet been used to great effect with science datasets, but could be)

5] the query is sent to the GPU farm & back-end system

6] the data is subsampled (e.g. via THREDDS/OPeNDAP or some other query-parsing mechanism) and the selected geometry is sent to Unity for a low-res glowy preview (maybe this step is unnecessary – but it’s a LOD issue, and I don’t know whether OPeNDAP or THREDDS is up to it, though it might be) – there’s a rough sketch of this query-and-subsample step after the list

7] at a user-defined point, a stereoscopic SPHERICAL panoramic image is rendered at high res and the L/R views are sent to L/R textures in Unity (presumably on a sphere, with appropriate scripting to reveal only the left view to the left camera and the right view to the right), delivering a fixed-point, high-res stereo spherical view to the Unity interface – also sketched after the list

8] inclusion of the social interaction/expert aspects of Voluminous – so that it can be concurrently inhabited, shared and commented upon – e.g. saving interesting views for later computational analysis and processing (a toy example of such a saved view is sketched below)
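As a rough sketch of steps 3] to 6] (the OPeNDAP endpoint, the “density” variable and its x/y/z dimension names are all invented – they would depend on the actual dataset and server), the region of interest picked in the GUI becomes a strided slice request, and only that subvolume comes back for the low-res glowy preview:

```python
# Sketch of the query/subsample path (assumed names throughout):
# a region of interest from the Unity GUI -> an OPeNDAP slice request ->
# a small numpy subvolume for the preview renderer.
from dataclasses import dataclass
import xarray as xr

@dataclass
class RegionQuery:
    """Axis-aligned voxel box plus a stride (the level-of-detail knob)."""
    x0: int
    x1: int
    y0: int
    y1: int
    z0: int
    z1: int
    stride: int = 4

def fetch_subvolume(opendap_url: str, q: RegionQuery):
    # xarray opens OPeNDAP endpoints lazily, so only the sliced voxels
    # are actually transferred when .values is evaluated.
    ds = xr.open_dataset(opendap_url)
    sub = ds["density"].isel(
        x=slice(q.x0, q.x1, q.stride),
        y=slice(q.y0, q.y1, q.stride),
        z=slice(q.z0, q.z1, q.stride),
    )
    return sub.values  # numpy array, ready to hand to the Unity preview

# "let's look at this bit" from the voxel-selection GUI, e.g.:
# roi = RegionQuery(512, 1024, 512, 1024, 0, 256, stride=4)
# vol = fetch_subvolume("http://example.org/thredds/dodsC/dark/density.nc", roi)
```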
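For step 7], a crude sketch of the hand-off (assuming the render farm returns two equirectangular panoramas; the file names are placeholders): stack the left- and right-eye renders into a single over/under texture, a common layout for driving a stereo skybox or a scripted sphere in Unity:

```python
# Pack left/right equirectangular panoramas into an over/under stereo
# texture (left eye on top, right eye underneath). Placeholder file names.
import numpy as np
from PIL import Image

left = np.asarray(Image.open("left.png"))    # H x W x 3 left-eye panorama
right = np.asarray(Image.open("right.png"))  # same dimensions, right eye
assert left.shape == right.shape

over_under = np.vstack([left, right])        # 2H x W x 3
Image.fromarray(over_under).save("panorama_over_under.png")
```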
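And for step 8], a saved “interesting view” could be as simple as the dataset URL, the region query and the camera pose serialised to JSON, so a collaborator (or a later batch job) can reproduce exactly the same view – every field name here is just illustrative:

```python
# Toy example of a shareable bookmark for a view worth revisiting.
import json
import time

bookmark = {
    "dataset": "http://example.org/thredds/dodsC/dark/density.nc",
    "query": {"x0": 512, "x1": 1024, "y0": 512, "y1": 1024,
              "z0": 0, "z1": 256, "stride": 4},
    "camera": {"position": [0.0, 1.6, -2.0], "rotation_deg": [0.0, 45.0, 0.0]},
    "comment": "filament junction worth a high-res render",
    "saved_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
}

with open("bookmark_0001.json", "w") as f:
    json.dump(bookmark, f, indent=2)
```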

I was inspired not only by Voluminous & Dark, but also by a recent experiment I did at home the other night – imagine if, instead of Antarctica, I could fly through Dark Matter simulations in real time?
