Amiqus' Simon Pittam talks to studios about the benefits of – and misconceptions about – the use of mo-cap in today's games

The Big Question: Does motion capture add depth to characters?

Mo-cap is now a super accessible technology choice for developers, but how much depth can it really bring to games characters?

When mo-cap technology first arrived, the benefits were immediate and the efficiency gains were felt quickly. Realistic movements that once took a raft of hard-coded parameters could now be captured in one fell swoop.

Initially it was deployed with productivity in mind; in recent years, however, mo-cap has evolved significantly toward a deeper sense of characterisation.

"It’s all about creativity," says Phil Elderfield, product manager at Vicon Motion Systems. "Players want to immerse themselves in a story, and that requires a sense of reality."

At a base level, the function of mo-cap is to recreate movement by replicating natural biomechanics, so in this respect it could be seen to cover a scientific data-capture role within animation. However, biomechanics alone are not compelling without a level of performance. Mo-cap technology has advanced to a point where full-body performance is not only very accurate but also delivered extremely quickly, so many triple-A games use it exclusively, hiring motion editors over traditional animators for a number of roles.

Will Eades, lead animator at Realtime UK, describes the evolution of mo-cap’s contribution to character through performance: “The accessibility of mo-cap to studios nowadays allows us to produce content quicker, which allows for more iterations, creating a more refined and in-depth performance of the characters.”

So there has been a shift toward performance capture, rather than just movement capture, as a driving force for character.

“Motion capture technology has come a long way since its conception,” explains Richard Wearmouth, mo-cap supervisor, “and so have the actors, who are now engaging and specialising in ‘performance capture’, giving us an even more believable and realistic game character experience.”

Freelance animator Damon Tasker agrees: “I feel that it’s a widely held misconception that mo-cap is some kind of magic process that adds depth to games characters. In order to add real depth to game characters, multiple facets of development have to be executed effectively – solid on-set preparation and direction, believable world building, compelling narrative and a considered approach to motion editing and implementation, to name but a few. Capturing motion from a real human doesn’t make it compelling or interesting without purpose or vision.”

Let’s go back a step to why mo-cap has become such a compelling part of the process of character generation. Humans are hard-wired to identify life, specifically living beings as opposed to objects or plants. Seeing a familiar movement pattern gives us an immediate reference of recognition and believability. Even if it’s fictional, our innate connection to other beings is awakened when confronted with a reality that we already know.

Furthermore, if you generate enough belief, you can start to stretch that sense of encountering another being by manipulating features to create deeper emphasis on selected elements of characterisation. Characters with exaggerated, Gollum-like physicality aren’t real, but the performance-capture foundation enables them to be real enough that we can believe in them.

While all mo-cap contributes to characterisation, not all characterisation is drawn from mo-cap. It’s worth acknowledging that a great many other ingredients, such as texture, lighting, audio and dialogue, contribute key layers toward the depth of character. Mo-cap provides a foundation upon which further details of characterisation are built.

One clear focal point here is facial performance, and this too is evolving.

“Facial mo-cap is advancing in its accessibility and technology,” says Realtime UK’s Will Eades. “Combined with hi-res scanning of actors to create extremely realistic 3D models and rigs, the characters in games are continually becoming more detailed and believable.”

However, facial animation presents some unique challenges, as Michael Berger, co-founder of Speech Graphics, describes: “The movement of the face consists mainly of soft tissue deformation caused by surface muscles, and there’s a lot of room for variation. It’s tricky to map facial motion captured from an actor onto a 3D character. This is a distinct process called ‘retargeting’, and it’s as much an art as a science.”
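In practice, retargeting often comes down to mapping the weights a capture system produces onto the character rig’s own controls, with per-shape adjustments where the actor’s face and the character’s face differ. The sketch below is a minimal, hypothetical illustration of that idea, assuming the capture outputs normalised blendshape weights and the character rig exposes its own shapes; all names are invented.

```python
# A minimal sketch of facial retargeting, assuming the capture system outputs
# normalised blendshape weights per frame and the character rig exposes its own
# (differently named) shapes. All names here are hypothetical.

# Mapping from capture-side shape names to (character shape name, gain).
# Gains let an animator exaggerate or soften expressions per shape --
# the "art" side of retargeting the article mentions.
RETARGET_MAP = {
    "browRaise_L": ("char_brow_up_left", 1.2),
    "browRaise_R": ("char_brow_up_right", 1.2),
    "jawOpen":     ("char_jaw_open", 0.9),
    "smile_L":     ("char_mouth_smile_left", 1.0),
    "smile_R":     ("char_mouth_smile_right", 1.0),
}

def retarget_frame(capture_weights: dict) -> dict:
    """Map one frame of captured blendshape weights onto the character rig."""
    character_weights = {}
    for source_shape, weight in capture_weights.items():
        if source_shape not in RETARGET_MAP:
            continue  # shapes with no counterpart on the character rig are dropped
        target_shape, gain = RETARGET_MAP[source_shape]
        # Clamp so exaggerated gains never push the rig past its valid range.
        character_weights[target_shape] = max(0.0, min(1.0, weight * gain))
    return character_weights

# Example: one captured frame of a half-smile with raised brows.
frame = {"browRaise_L": 0.6, "browRaise_R": 0.55, "smile_L": 0.4, "smile_R": 0.42, "jawOpen": 0.1}
print(retarget_frame(frame))
```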

Berger’s answer to the retargeting problem has been to use the audio signal to drive the face via an internal, procedural model of facial movement. Clever stuff.
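For a flavour of how an audio-driven approach can work in general terms – and this is only a toy illustration, not Speech Graphics’ proprietary model – timed phonemes extracted from the dialogue audio can be mapped to viseme shapes that pose the mouth frame by frame. Everything named below is hypothetical.

```python
# A toy sketch of audio-driven facial animation: timed phonemes (e.g. from a
# forced-alignment pass over the dialogue audio) are mapped to viseme blendshapes
# that pose the mouth per frame. Illustration only; real systems are far richer
# and account for coarticulation between neighbouring sounds.

PHONEME_TO_VISEME = {
    "AA": "jaw_open", "IY": "mouth_wide", "UW": "lips_round",
    "M": "lips_closed", "B": "lips_closed", "F": "lip_bite", "V": "lip_bite",
}

def visemes_from_alignment(aligned_phonemes, fps=30):
    """Turn (phoneme, start_sec, end_sec) triples into per-frame viseme weights."""
    end_time = max(end for _, _, end in aligned_phonemes)
    frames = [dict() for _ in range(int(end_time * fps) + 1)]
    for phoneme, start, end in aligned_phonemes:
        viseme = PHONEME_TO_VISEME.get(phoneme)
        if viseme is None:
            continue  # unmapped phonemes leave the face at rest
        for f in range(int(start * fps), int(end * fps) + 1):
            frames[f][viseme] = 1.0  # full weight; real models shape smooth curves
    return frames

# Example: the word "bee" -- a lip closure followed by a wide mouth shape.
alignment = [("B", 0.00, 0.08), ("IY", 0.08, 0.30)]
for i, frame in enumerate(visemes_from_alignment(alignment, fps=10)):
    print(i, frame)
```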

As well as close and detailed nuance of the face, mo-cap also has a role in wider-shot realism, as Eades continues: “Some mo-cap studios can now capture up to 18 actors at once, which would massively help with crowd animation that needs to interact with each other – a no-brainer for a sports game developer, for example.”

Believability can’t be limited to key characters; if a group scene provides a flimsy environment, the gameplay experience will be significantly compromised no matter how solid the protagonist’s characterisation.

There are examples where mo-cap makes less sense and keyframing still provides the kind of performance you need, such as James Benson’s outstanding, slightly stylised animation on the critically acclaimed Firewatch.

Eades agrees mo-cap is a tool in the box rather than a silver bullet: “As an animator, I know that although the technology is becoming more and more accessible, there will always be the need for keyframe clean-up and adjustments to enhance the performance.”

Berger agrees: “There is more involved in motion capture than simply ‘capturing motion’.”

Having come so far in its contribution to character, what’s next for mo-cap?

“The increased accessibility of mo-cap couldn’t have come at a better time, coinciding with the second coming of VR,” says Andy Nye, managing director of New Moon Games. “We’ve found that when you’re placed in a VR environment, the increased realism of your interaction with the virtual world is significant.”

We can’t wait to see this next level of mo-cap’s powerful role in providing character depth.
