Originally created as internal tech at facial animation outfit CaptiveMotion, Embody Animation is now available to all developers.
Recently released as a free trial, Embody is designed to let studios retarget any facial animation from one mesh to another while maintaining the detail and nuance of the original performance or data.
“While this may sound difficult, the entire process only takes about an hour,” says James Comstock, VP of engineering and production at CaptiveMotion.
“Embody Animation is mostly used to retarget facial mocap, but any animation can be retargeted. The tools are incredibly flexible, and you can retarget from very different meshes.”
That means a user can retarget a facial animation to a photo-real human model, a roughly humanoid shape, more varied forms like animals and aliens, or even animated 2D characters.
UNDER THE BONNET
Clearly designed to be flexible and adaptable, Embody Animation also enables developers to produce a standard bone animation optimised for use in-engine.
“Our most important features are that we support retargeting animations to either a bone rig or to a mesh. If you retarget to a bone rig, the animated source mesh will be used to drive the bones of your character,” explains Comstock.
He later added: “If you are looking for the highest quality animation, say for a pre-rendered cutscene, then you can retarget directly to a mesh. In this case, the animated source mesh will be used to drive the vertices of your character directly.
"We also include advanced skinning tools that make it quick and easy to produce high-quality facial skinning for rigged characters.”
A few years ago, technology like Embody Animation would have been almost exclusively the preserve of large triple-A studios.
But with the ongoing democratisation of both mocap hardware and techniques, CaptiveMotion has responded to change, making sure the tool is accessible to the new wave of smaller studios embracing motion capture.
“We designed Embody Animation so that it can be used on large projects that require hours of high-quality facial animation. In this capacity, it works great. However, it works just as well on small projects,” insists Comstock.
“It also handles data of any fidelity. You can retarget facial mocap data produced by an optical mocap system using 50 reflective markers, or by Embody using 1,500 markers, or anything in between.”
ON THE PULSE
The rise of accessible mocap isn’t the only trend considered in developing Embody. It was also created to address the fact that there is still no standard process for retargeting facial mocap from an actor onto an in-game character.
“You’ll either need to develop your own in-house solution, or give up creative control and hire specialised external resources,” says Comstock of a world without Embody technology.
“With Embody Animation, you can now handle retargeting internally. This means you won’t have to build your own solution and can still maintain creative control.
"And since it only takes about an hour to retarget a character, you’ll be able to do everything quickly and cheaply.”
Designed to work with any animated data, Embody Animation provides what its makers promise is a unique offering.
“We weren’t building software that was an iterative improvement over an existing product,” says Comstock of crafting Embody Animation.
“Rather, we were attempting to solve a new problem in a general case manner, and we had to invent completely new techniques to do so.”
It took CaptiveMotion almost four years to get Embody Animation to the point where it could stand on its own as a retail product.
Now that’s done, the challenge lies in helping developers understand how to leverage Embody Animation to transform the way they manage their facial animation pipelines.
That change is one that could save studios money and increase the quality of their work, and if that’s the case, the challenge for CaptiveMotion should be an easy one to overcome.