We catch up with the studio's head of middleware to discuss the tech's biggest ever update

Tech Spotlight: NaturalMotion’s character animation tool Morpheme 7

NaturalMotion released the seventh version of its character animation tool Morpheme last week, bringing with it a number of key new features.

Additions include full facial animation, Morpheme Dynamics (which enables node-based dynamic simulation of joint chains), and optimisations for Xbox One and PS4 development.

We spoke with the company’s head of middleware, Steve Thompson, about the new features and what they mean for game developers.

Why have you introduced full facial animation into Morpheme 7?
Morpheme has been used for years to generate compelling motion for characters. However, until now our clients needed to use other technologies for driving facial animation, leading to multiple authoring experiences and disjointed in-game movement.

The ongoing drive for higher fidelity characters has pushed people towards more unified performances for their game characters. Having an integrated solution for the entire character yields substantial benefits, both in the production process and in the final result.

How does Morpheme Dynamics work and how will it benefit developers?
Morpheme Dynamics is a solution for adding secondary motion to elements of the character that should be influenced by physics. This includes everything from simple attachments such as water bottles or ammo clips to more complicated chains of joints for things like ponytails, belts or straps.

Traditionally this would either be baked into the animation or require a global physics solver to achieve the secondary motion. If you’re not already using physics within your game, it can be onerous to incorporate a complete physics engine just for some secondary motion. If you are already using physics, complications can arise from interactions between what is essentially an aesthetic animation effect and your gameplay physics.

Morpheme Dynamics is a single node that you can use like IK or any other procedural animation technique. The node itself calculates the physics local to that character, allowing you to tune the aesthetics in isolation without needing to consider how it will interact with other physics objects.

Of course, this solution interfaces easily with our existing integrations with more traditional physics engines: a ponytail can be added to a ragdoll, for example, without the global ragdoll solver having to handle the much longer chain of joints.
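To make the idea of local, per-character secondary motion concrete, here is a minimal sketch of how such a solver can be structured: each joint in a chain is pulled toward the pose the animation wants by a damped spring, with no global physics world involved. The types and the update function below are our own hypothetical example, not Morpheme's actual API.

```cpp
// Illustrative sketch only: a per-character secondary-motion solver in the
// spirit of a "dynamics node" driven by the animated pose. Not Morpheme code.
#include <vector>

struct Vec3 {
    float x = 0, y = 0, z = 0;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

// One joint of a secondary chain (e.g. a ponytail or strap segment).
struct ChainJoint {
    Vec3 target;    // position the underlying animation wants
    Vec3 position;  // simulated position
    Vec3 velocity;  // simulated velocity
};

// A local damped-spring update: each joint is pulled toward its animated
// target entirely in the character's own space, so stiffness and damping can
// be tuned for look without ever touching gameplay physics.
void updateSecondaryChain(std::vector<ChainJoint>& chain,
                          float stiffness, float damping, float dt)
{
    for (ChainJoint& j : chain) {
        Vec3 toTarget = j.target - j.position;
        Vec3 accel = toTarget * stiffness - j.velocity * damping;
        j.velocity = j.velocity + accel * dt;
        j.position = j.position + j.velocity * dt;
    }
}
```

Because the simulation only reads the animated targets and writes back offset joint positions, it can run after the rest of the animation network has evaluated, which is the sense in which it behaves like IK or any other procedural node.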

What does the inclusion of highly specialised State Machine and blend tree nodes mean for developers wanting to include and animate crowds of on-screen characters?
Morpheme is designed from the ground up to be very flexible, allowing users to build animation networks in any way they like. The disadvantage is that you pay an overhead for that flexibility, as the runtime can make no assumptions about how people have built their networks.

By introducing specialised State Machines, we have made it possible to embed new lightweight elements within the familiar authoring environment of Morpheme Connect. These elements can impose restrictions on the underlying structure of the network, allowing us to make optimisations in the game code that reduce both the CPU cost and the memory overhead for that region of the network.
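As a rough illustration of why such restrictions pay off, a state machine whose states and transitions are fixed up front can be evaluated with a flat table lookup instead of a general graph traversal, and needs no per-frame allocation. The sketch below is our own hypothetical example of that principle, not the structure Morpheme uses internally.

```cpp
// Illustrative sketch only (not Morpheme code): a deliberately restricted
// state machine whose layout is fixed at build time, so the runtime can use
// a flat transition table with constant-time updates and no heap usage.
#include <array>
#include <cstddef>
#include <cstdint>

constexpr std::size_t kStateCount = 4;  // e.g. Idle, Walk, Run, Turn
constexpr std::size_t kEventCount = 3;  // e.g. Move, Stop, Pivot

struct LightweightStateMachine {
    // nextState = transitions[currentState][event]
    std::array<std::array<uint8_t, kEventCount>, kStateCount> transitions{};
    uint8_t currentState = 0;

    void handleEvent(uint8_t event) {
        // Constant-time lookup, no graph traversal: the kind of saving that
        // matters when hundreds of crowd characters tick every frame.
        currentState = transitions[currentState][event];
    }
};
```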

What other new features have you added and what do these mean for developers?
Something we found when developing the facial animation workflows was the need for better tools when working close up on parts of the character, rather than just looking at the overall motion. We needed to invest significant resources in our camera, viewport and debugging workflows, extending the tools at hand for drilling down into what is happening within the animation network.

We’ve also added the ability to perform Static Analysis on the network itself. This customisable offline process examines the construction of the network, reports any problems it detects and suggests ways in which the network can be improved, either for robustness or for runtime cost.
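For flavour, one simple check a pass of this kind could perform is flagging nodes that never contribute to the final output. The sketch below is a hypothetical example of that idea, with an invented graph representation; it is not NaturalMotion's implementation.

```cpp
// Illustrative sketch only: an offline check over an animation-network graph
// that reports nodes unreachable from the output node.
#include <cstdio>
#include <vector>

struct NetworkNode {
    const char* name;
    std::vector<int> inputs;  // indices of nodes feeding this one
};

void reportUnreachableNodes(const std::vector<NetworkNode>& nodes, int outputNode)
{
    std::vector<bool> reachable(nodes.size(), false);
    std::vector<int> stack{outputNode};
    // Walk backwards from the output, marking every node that feeds it.
    while (!stack.empty()) {
        int n = stack.back();
        stack.pop_back();
        if (reachable[n]) continue;
        reachable[n] = true;
        for (int input : nodes[n].inputs)
            stack.push_back(input);
    }
    // Anything unmarked is authored but never used.
    for (std::size_t i = 0; i < nodes.size(); ++i)
        if (!reachable[i])
            std::printf("warning: node '%s' never contributes to the output\n",
                        nodes[i].name);
}
```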

Morpheme v7 is the first major version where we have concentrated on what is now current generation hardware. This focus has allowed us to strip back parts of the runtime architecture used by legacy platforms, yielding not only simpler code but also improved performance and memory usage. This is an area where we’ll continue to innovate, but these first steps are significant.

How has Zynga’s acquisition affected Morpheme’s development?
NaturalMotion has always kept a healthy separation between our middleware licensing and our proprietary games business, and that hasn’t changed since the acquisition by Zynga. However, being part of a larger company gives us access to experience and facilities that we haven’t had previously, such as motion capture suites, increasing the avenues we can explore in the future.

Adding facial animation is the biggest growth in scope of Morpheme since its initial release over seven years ago. We’re excited to see how people will use the powerful integration of full-body and facial animation, plus local dynamic simulation, to bring new levels of believability to their characters.
