Microsoft’s eagerly awaited controllerless input device, Kinect, will arrive in shops this week and offer the market what Microsoft deems ‘a whole new way to play games’.
It’s also, of course, a whole new way to develop them – something that, in an industry that habitually builds on its own tech and templates, can leave studios in a sweat.
NaturalMotion believes it has found an opportunity in the circumstances, and will soon launch a new Kinect-specific module for its animation system, Morpheme.
The company’s CEO Torsten Reil explains more about Kinect for Morpheme.
Why introduce your Kinect module?
Kinect for Xbox 360 represents a completely new way to interact with your game. To make this kind of technology really shine, you have to integrate it seamlessly into your game animation system.
This is what our Kinect module for Morpheme does. It means that animators and programmers can graphically author how Kinect data is used on their characters, and it allows them to use all the features in Morpheme like physics, IK, or advanced blends. What’s also cool is that you can prototype Kinect game ideas or controls quickly in Morpheme without having to get a dedicated engine going.
How does the tool work, in terms of Kinect detecting motion and that being translated to game animation?
The Kinect system runs live alongside our Morpheme runtime engine. The Kinect module in the Morpheme runtime retargets the live motion data onto your character, and from there it can be treated like any other animation data. For authoring, our Morpheme Connect tool presents a simple drag-and-drop Kinect node in the animation blend tree.
This means, for example, you can apply Kinect to only the upper part of your character, whilst the lower part is driven by a walk cycle. Or you can use the player’s shoulder tilt to drive the direction of the walk cycle.
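The upper/lower split Reil describes boils down to per-joint blend weights: joints flagged as upper-body take the live Kinect pose, while the rest keep the authored walk cycle. A minimal Python sketch of that idea follows — the joint names and data layout are hypothetical, not Morpheme's actual API:

```python
# Illustrative per-joint blending: live Kinect rotations drive the upper body,
# an authored walk cycle drives the lower body. Hypothetical data layout,
# not Morpheme's API.

UPPER_BODY = {"spine", "neck", "head",
              "l_shoulder", "l_elbow", "r_shoulder", "r_elbow"}

def blend_pose(kinect_pose, walk_pose, upper_weight=1.0):
    """Return a pose: upper-body joints blended toward the Kinect data by
    upper_weight, all other joints taken from the walk cycle unchanged."""
    out = {}
    for joint, walk_rot in walk_pose.items():
        if joint in UPPER_BODY:
            live = kinect_pose.get(joint, walk_rot)
            # plain linear interpolation stands in for a proper quaternion slerp
            out[joint] = tuple(w + upper_weight * (k - w)
                               for k, w in zip(live, walk_rot))
        else:
            out[joint] = walk_rot
    return out
```

A real engine would blend quaternions and feather the weights across the spine rather than using a hard joint set, but the masking principle is the same.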
Microsoft provides this technology itself. Why pay for more?
Microsoft provides the Kinect technology, SDK and core dev tools. NaturalMotion provides two important pieces to use Kinect in a game.
Firstly, dedicated Kinect algorithms for retargeting and noise reduction.
Secondly, there’s a tight integration into a graphically authorable animation engine. In our experience, both are required to get the most out of a live motion input system.
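Retargeting, the first of those pieces, means mapping the tracked player's skeleton onto a character with different proportions. NaturalMotion's actual algorithms are proprietary; the sketch below just shows the basic idea — rescaling each bone's direction from the sensor skeleton to the character's bone lengths — with a hypothetical data layout:

```python
import math

def retarget(source_pos, parents, target_lengths):
    """Minimal positional retargeting sketch (not Morpheme's algorithm).
    source_pos: (x, y, z) joint positions from the sensor.
    parents: parent index per joint (-1 for the root); joints must be
             listed parent-before-child.
    target_lengths: the character's bone length per joint (root unused).
    Keeps each bone's direction but applies the character's proportions."""
    out = [None] * len(source_pos)
    for j, p in enumerate(parents):
        if p < 0:
            out[j] = source_pos[j]          # keep the root where the sensor saw it
            continue
        dx, dy, dz = (s - t for s, t in zip(source_pos[j], source_pos[p]))
        norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
        scale = target_lengths[j] / norm    # stretch or shrink to the target bone
        px, py, pz = out[p]
        out[j] = (px + dx * scale, py + dy * scale, pz + dz * scale)
    return out
```

Production retargeting works in joint rotations rather than positions and handles mismatched hierarchies, but the proportion-mapping step above is the core of it.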
What has the feedback from Microsoft been like?
Great – Microsoft has been very supportive from day one, both in sending us hardware and in using and testing our technology over the past few months.
It’s clear that Kinect is crucial to Microsoft’s Xbox strategy, and it’s great to see how it’s supporting the development and tech ecosystem around it.
What is the key benefit of this tool for animators? Does it take workload away from programmers?
Yes, the key benefit is that animators can control how live body motion is integrated into their animation networks – a key component in creating immersive live motion experiences.
Equally, programmers benefit from ready-made motion processing algorithms, such as retargeting and noise reduction.
Is the module competitively priced? How many studios are you hoping will adopt the tech in two years?
Kinect for Morpheme is very cost-effective in the time and money it saves, both compared to creating a similar solution in-house and in terms of the sheer reduction in iteration time.
The tech has already been deployed in multiple studios, so we’re very optimistic for the months ahead.
Who’s your intended market? Independent studios?
It’s both publishers and independent studios. Right now, we probably have a few more of the former than the latter, but that mainly reflects hardware availability.
If developers use Morpheme’s animation module, do you think it can improve lag times in play?
Lag times in Kinect are just not a problem with the right filtering algorithms, and with judicious use of motion input in different situations.
Morpheme helps with both, and it lets animators and programmers adjust and experiment with all the necessary settings graphically.
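The filtering trade-off behind that claim can be illustrated with the simplest possible smoother — a one-pole low-pass filter. This is only a stand-in for whatever proprietary noise-reduction algorithms the module actually uses, but it shows the knob animators would be tuning: more smoothing removes sensor jitter at the cost of response lag, and vice versa.

```python
class ExponentialSmoother:
    """One-pole low-pass filter over a joint coordinate stream.
    alpha near 1.0 tracks the input closely (low lag, more jitter);
    alpha near 0.0 smooths heavily (less jitter, more lag)."""

    def __init__(self, alpha):
        self.alpha = alpha
        self.value = None

    def update(self, sample):
        if self.value is None:
            self.value = sample          # initialise on the first sample
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value
```

In practice you would run one such filter per tracked coordinate and pick alpha per use case — twitchy gameplay input wants a high alpha, slow cosmetic motion tolerates a low one.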