From the naturalistic movement of Assassin’s Creed hero Altaïr to the lumbering gait of BioShock’s Big Daddies, gamers have never been so spoilt for choice in new personas to take on or enemies to battle – all of them moving realistically.
Developers have never been so spoilt for tools and technology to create such beings, either – and it’s through a unique intersection of quality texturing, design, artistry, physics, AI, motion capture and processing power that these heroes and villains come to life.
We’ve covered each of those individual fields in Develop before, but the key, of course, lies in bringing all those components together at the animation stage. Here, we profile four of the key technologies powering character animation today…
HAVOK

Having become a key part of the next-gen development middleware landscape, you’ll find Havok’s logo on the box of a number of the year’s biggest game releases, from Assassin’s Creed to Halo 3.
When it comes to character animation, two of the firm’s tools are the key candidates to support it: Havok Animation and Havok Behavior.
While the former lets artists and animators construct animations of both characters and objects, it’s the latter’s event-driven behaviours that sequence them together, helping you construct sequences and transitions between states via the UI.
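The event-driven idea can be sketched in a few lines: gameplay events trigger transitions between animation states, and the current state determines which clip plays. This is a minimal illustrative sketch of the general technique – all names here are hypothetical, not Havok’s actual API.

```python
# Toy event-driven animation state machine, in the spirit of behaviour
# tools like Havok Behavior. States stand in for animation clips;
# events drive the transitions between them.

class AnimationStateMachine:
    def __init__(self, initial_state):
        self.state = initial_state
        self.transitions = {}  # (current state, event) -> next state

    def add_transition(self, state, event, next_state):
        self.transitions[(state, event)] = next_state

    def handle(self, event):
        """On an event, move to the mapped next state (i.e. next clip)."""
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

fsm = AnimationStateMachine("idle")
fsm.add_transition("idle", "move", "walk")
fsm.add_transition("walk", "sprint", "run")
fsm.add_transition("run", "stop", "idle")

fsm.handle("move")    # idle -> walk
fsm.handle("sprint")  # walk -> run
print(fsm.state)      # run
```

In a real behaviour graph each state would reference a clip or blend tree, and transitions would carry blend times rather than switching instantly.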
Recent upgrade Havok 5 includes what Havok says is the first unified authoring environment and SDK to combine the power of physics with procedural animation and traditional animation assets – a key development given the close attention many are paying to procedural content, especially at a time when a company like NaturalMotion (profiled right) offers a run-time procedural engine for character animation as an alternative. Havok claims that the kit fits into a space between pure asset creation in something like Max or Maya and coding, creating a crossover between artistry and programming.
Possibly the biggest draw, however, isn’t those updates but the fact that both Animation and Behavior are tightly integrated out of the box with Havok’s bread-and-butter Physics middleware.
NATURALMOTION

NaturalMotion’s character animation technology comes in two flavours: the animation engine Morpheme, and the dynamic motion synthesis (DMS) technologies Endorphin (the 3D tool for procedurally generating character animation) and Euphoria (which takes DMS to runtime, doing away with canned animations).
This three-pronged approach goes against the grain in some respects, with the DMS technology being what NaturalMotion is most famous for. Given that the technology aspires to do away with the lengthy process of keyframe animation and/or mocap, the sales pitch and allure are obvious.
Endorphin’s most recent update beefed the program up with a range of new features, such as fluid interactions, a control panel plug-in for Autodesk Maya, improved hold behaviours, more file format options and enhanced resources – but it’s the fact that runtime technology Euphoria has found key partners in LucasArts and Rockstar that proves the company is on the right track.
Morpheme, meanwhile, comprises a runtime animation engine, optimised for Xbox 360, PS3 and Wii, featuring support for blend nodes and a hierarchical animation state machine. Also included is morpheme:connect, a graphical authoring environment for designing and testing transition logic and blends with instant feedback in a full 3D environment.
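At its simplest, a blend node interpolates between two poses by a weight – the building block that engines like Morpheme compose into trees. A minimal sketch of the idea, with joint rotations simplified to single scalars and all names hypothetical rather than NaturalMotion’s actual API:

```python
# Toy blend node: linearly interpolate two skeletal poses by a weight.
# Real engines blend quaternion rotations per joint; scalars stand in
# here to show the principle.

def blend_poses(pose_a, pose_b, weight):
    """Per-joint lerp: weight=0.0 gives pose_a, weight=1.0 gives pose_b."""
    return {joint: (1.0 - weight) * pose_a[joint] + weight * pose_b[joint]
            for joint in pose_a}

walk = {"hip": 10.0, "knee": 20.0}  # illustrative joint angles, degrees
run  = {"hip": 30.0, "knee": 60.0}

halfway = blend_poses(walk, run, 0.5)
print(halfway)  # {'hip': 20.0, 'knee': 40.0}
```

Driving the weight from gameplay (say, character speed) is what turns a pair of clips into a smooth walk-to-run transition.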
AUTODESK MOTIONBUILDER

While Autodesk continues to claim the lion’s share of the asset-creation corner of games development with its Max and Maya art packages, it is the company’s MotionBuilder technology that matters most for character animation, pitched as the tool for taking assets and transforming them into moving characters using keyframe animation, mocap data and the like.
The biggest upgrade of late has been MotionBuilder 7.5’s Extension 2. A notable new feature is the Biped Template, which lets users quickly rig a biped character: forward and inverse kinematic full-body manipulation rigs are created automatically based on the size and proportions of a character, with automatic resizing options to match the proportions of the intended model.
There’s also an acknowledgment that artists now have to make hundreds of on-screen characters: speed-focused updates let animators create custom rigs which can be saved and then connected to another character, regardless of size or proportions, giving a head-start when replicating basic movement points in new models. The Extension also adds a finger solver for mocap data and other efficiency improvements.
Like Havok, Autodesk’s strength isn’t just one piece of software, but where it sits in a portfolio of art products. Alongside its 3D art tools, it also offers bespoke inverse kinematics middleware for character movement, HumanIK.
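Inverse kinematics is the problem such middleware solves: given where a hand or foot should end up, work backwards to the joint angles. The classic two-bone case (shoulder–elbow–wrist) has a closed-form answer via the law of cosines. This is an illustrative 2D sketch of that standard technique, not Autodesk’s implementation, and all function names are hypothetical.

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Given bone lengths l1, l2 and a 2D target (tx, ty), return
    (shoulder_angle, elbow_angle) placing the end effector on the target."""
    dist = math.hypot(tx, ty)
    dist = min(dist, l1 + l2)  # clamp: unreachable target -> fully extended
    # Law of cosines gives the interior angle at the elbow;
    # the elbow's bend is its supplement.
    cos_interior = (l1**2 + l2**2 - dist**2) / (2 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_interior)))
    # Shoulder: aim at the target, then subtract the offset the bent
    # elbow introduces.
    cos_inner = (l1**2 + dist**2 - l2**2) / (2 * l1 * dist)
    shoulder = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder, elbow

# Sanity check via forward kinematics: the chain should land on the target.
s, e = two_bone_ik(1.0, 1.0, 1.0, 1.0)
ex = math.cos(s) + math.cos(s + e)
ey = math.sin(s) + math.sin(s + e)
print(ex, ey)  # ~ (1.0, 1.0)
```

Full-body solvers generalise this to whole skeletons in 3D, with joint limits and multiple simultaneous targets.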
SOFTIMAGE XSI/FACE ROBOT
For Softimage, the big selling point is that XSI offers an all-in-one alternative to Autodesk’s multi-program approach, covering the whole art and animation process in a single package.
Rendering and animation tool XSI recently hit version 6.5, bringing a number of new enhancements. A key element of the update has been tailoring the software specifically to the needs of game studios – key partners such as Lionhead, Valve and NCsoft have reportedly helped offer feedback and champion the software.
But ask any artist or animator using XSI what the software’s main benefit is, and the answer will usually be its variety and flexibility: non-destructive character modelling, MOTOR – XSI’s motion rigging element – model manipulation tools, and the fact that it isn’t afraid to let users swap assets in and out of Max and Maya pipelines.
Further competitive edge, however, comes from Face Robot, because of course all characters need properly moving mouths and opening eyes. The latest version has been updated specifically for game production pipelines, slotting in between whatever your art app and game engine of choice may be. Finally, game engine export tools let users get their facial animations straight into games using current methods such as skinning and normal mapping.
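One common runtime representation for exported facial animation is morph targets (blendshapes): the final face is the neutral mesh plus weighted per-expression vertex offsets. This is a minimal sketch of that general technique – not Face Robot’s specific export format, and all names here are hypothetical.

```python
# Toy morph-target evaluation: blend weighted expression offsets onto a
# neutral face. Vertices are simplified to single floats for brevity;
# real meshes store 3D positions per vertex.

def apply_blendshapes(neutral, targets, weights):
    """neutral: per-vertex values; targets: name -> per-vertex offsets;
    weights: name -> blend weight in [0, 1]."""
    result = list(neutral)
    for name, weight in weights.items():
        for i, offset in enumerate(targets[name]):
            result[i] += weight * offset
    return result

neutral = [0.0, 0.0, 0.0]  # three 1-D "vertices"
targets = {"smile": [1.0, 0.0, 0.5], "blink": [0.0, 2.0, 0.0]}

face = apply_blendshapes(neutral, targets, {"smile": 0.5, "blink": 1.0})
print(face)  # [0.5, 2.0, 0.25]
```

Animating the weights over time – half a smile rising while a blink fires – is what turns a static set of sculpted expressions into a performance.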