VMI Blog V: A Metahuman Experiment

Part of the brief we developed for our Virtual Macquarie Island project involves the deployment of in-world avatars of the scientists we interviewed. We’ve been very fortunate to interview nine specialists, covering a wide range of disciplines, with many decades of on-island experience between them. What is striking about these individuals is not only the depth of their discipline-specific knowledge of the island, but also how broadly and deeply aware they are of cross-disciplinary studies and of interactions across the sciences and humanities in understanding the island’s natural history. They have some great stories to tell and real enthusiasm for the science of this remarkable place. It’s really inspirational stuff.

We are very grateful for their generosity with their time and knowledge, and for the terrific preparatory work undertaken with our ace interviewer, Zoe Keen. This enabled us to cover a lot of ground in the interviews, in a relaxed and straightforward documentary style.

But how to translate this into an interactive, VR-style presentation? The last thing you want is green-screened talking heads, composited live video in a CG environment, or Voice-of-God-only narration.

I decided to conduct an experiment using Epic’s Metahuman technology. The (current) Metahuman interface provides a palette of pre-defined Metahumans that broadly map out a kind of physiognomic terrain. From this palette one can select a primary facial model and iteratively hybridize it with other models, exploring and constraining the parametric space of facial modelling and modification. This means one can mix and match facial and bodily characteristics to generate unique characters, and even attempt to replicate real individuals. But creating believable replicas of specific individuals can still be very labour-intensive, and that is not the point of the exercise. My aim has been to create broadly believable avatars that capture and express some of the characteristics of our interviewees, and to approach Metahumans in a fairly simple way: one that doesn’t require extensive modification, lingers somewhere outside the uncanny valley, and isn’t too inadvertently amusing or cringeworthy.
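
For the technically curious, the sketch below captures the gist of that hybridization: a weighted blend of ‘parent’ faces over a shared set of facial parameters. To be clear, this is not Epic’s actual interface or API (Metahuman Creator doesn’t expose its blend operation publicly); the parameter names and the simple linear blend are my own illustrative assumptions.

```python
from dataclasses import dataclass

# A minimal sketch of the idea behind parent-based face blending: reduce a
# face to a handful of normalised parameters and take a weighted average.
# This is NOT Epic's Metahuman API; the parameters and the linear blend
# are illustrative assumptions only.

@dataclass
class Face:
    brow_tilt: float     # hypothetical parameters, normalised 0.0 - 1.0
    jaw_width: float
    nose_length: float

def blend(parents: list[tuple[Face, float]]) -> Face:
    """Weighted average of 'parent' faces; weights are normalised here."""
    total = sum(weight for _, weight in parents)
    return Face(
        brow_tilt=sum(f.brow_tilt * w for f, w in parents) / total,
        jaw_width=sum(f.jaw_width * w for f, w in parents) / total,
        nose_length=sum(f.nose_length * w for f, w in parents) / total,
    )

# Start from a primary model, then iteratively pull it towards other parents.
primary = Face(brow_tilt=0.6, jaw_width=0.5, nose_length=0.4)
parent_b = Face(brow_tilt=0.3, jaw_width=0.7, nose_length=0.5)
hybrid = blend([(primary, 0.7), (parent_b, 0.3)])
print(hybrid)
```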

So I turned it into a bit of a game: I showed our scientists the Metahuman interface, broadly described how it worked, and then asked them to choose three ‘parents’ from a numbered set of screenshots of the standard faces. All they had to do was send me a list of three numbers, along with any additional preferences and observations they felt like making. Scientists are busy people, so you don’t want to waste their time. This gave me the liberty to include an extra ‘parent’ that I thought captured something about them (the tilt of an eyebrow, the shape of the jaw, etc.), and sufficient artistic licence to create believable, not-too-cosmetically-enhanced characters within the constraints of the current palette. Of course, I have an audience in mind, and thankfully scientists are generally not a vain lot; they’ve embraced this process with goodwill and good humour. I’m pretty happy with the results so far. Let’s see what they think! I want everyone to be happy with their alter egos on their Metaverse island.
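
If you’re curious what each avatar’s ‘brief’ boils down to, it’s little more than the record sketched below. The field names and the numbers are hypothetical placeholders of my own, not anyone’s actual picks.

```python
from dataclasses import dataclass

# A hypothetical record of one interviewee's picks from the numbered
# screenshot sheet, plus my extra 'parent'. All names and numbers below
# are placeholders, not real interview data.

@dataclass
class AvatarBrief:
    scientist: str
    chosen_parents: tuple[int, int, int]  # the three numbers they send back
    extra_parent: int                     # my artistic-licence addition
    notes: str = ""                       # any preferences they volunteer

brief = AvatarBrief(
    scientist="Interviewee A",
    chosen_parents=(4, 11, 23),
    extra_parent=17,                      # e.g. for the tilt of an eyebrow
    notes="keep the beard; no glasses",
)
print(brief)
```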

Of course, there will be a bit of tweaking to do: getting clothing right, and working out whether we can handle some complex hair grooms or should hide them under a nice warm beanie or hat, as befits the sub-Antarctic environment. The latter might make life a lot easier; as always, I am conscious that in animation, less is often more.

The next step for our Metahuman scientists is to breathe life into them. This will make a dramatic difference; they might even look happy! More on this in the next post.
