As perceptual computing advances and the human-computer interface breaks free from 2D screens into the 3D world around us, users are enjoying more natural, gesture-driven interactions with their devices. And with 3D cameras such as those built around Intel RealSense technology picking up increasingly subtle movements, facial recognition is opening up exciting new possibilities that have developers grinning at their laptops.
Gaming is an obvious arena where advances in facial recognition will enable new forms of interaction. By mapping and tracking users’ facial features, facial recognition APIs let gamers see their own expressions mirrored on their on-screen avatars in near-real time. One exciting avenue for future development is refining the technology to the point where even the subtlest signs of a user’s mood or level of engagement can be read, letting the application react to the slightest change.
And fear not, facial hair-wearers: tools like the Intel RealSense SDK can be used to implement face, pose and expression detection, even for the more hirsute user. There are plenty of resources for app-builders available on the Intel Developer Zone, including this face analysis tutorial (PDF) and this article about face and head tracking using the Intel RealSense SDK.
For a fun look at these processes in action, check out Blockhead, an application that tracks your face and mimics your expressions. And if you’re feeling adventurous, download the sample code and play around with it yourself.
• This blog post is written by Softtalkblog, and is sponsored by the Intel Developer Zone, which helps you to develop, market and sell software and apps for prominent platforms and emerging technologies powered by Intel Architecture.
For the latest Intel Developer Blogs, click here.