If you’re intrigued by the gaming possibilities presented by gesture and voice controls, take a look at this case study explaining how PlayBuff used the Intel RealSense SDK for its game Death Drive: Racing Thrill. The game can now use Intel RealSense cameras to detect players’ hand movements and voice commands, and respond accordingly.
To make the game a more visceral experience, PlayBuff enabled players to steer the car with their hands: clenching the left fist turns left, clenching the right fist turns right, and clenching both fists brakes. Voice controls were also added for firing. I can see how players who take advantage of these controls could feel more involved in the game, and more easily control motion and firing simultaneously.
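The fist-based control scheme described above is essentially a small state-to-action mapping. As a rough illustration (not PlayBuff’s actual code, and not the RealSense SDK API — the function and return values here are hypothetical), it might look like this:

```python
def steering_command(left_fist_closed: bool, right_fist_closed: bool) -> str:
    """Map the fist gestures to a control action.

    Both fists closed brakes; a single closed fist steers toward
    that side; open hands apply no steering input.
    """
    if left_fist_closed and right_fist_closed:
        return "brake"
    if left_fist_closed:
        return "left"
    if right_fist_closed:
        return "right"
    return "none"
```

In a real game loop, the boolean inputs would come from the camera SDK’s per-frame hand-tracking data, and the returned action would feed the car physics.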
To acclimatise players to the new controls, PlayBuff included a tutorial that introduces them gradually. Players’ hands would sometimes drift out of the detection zone during play, so the game identifies when this happens and asks players to bring their hands back into the detection area so the game can be recalibrated. Players also tended to oversteer, so the developers added damping to the steering input so that this wouldn’t spoil the game experience.
Read the full story to find out more about the technology used, the key lessons learned, and to read some sample code used for gesture detection. If you’re feeling inspired, read the RealSense UX Design Guidelines to explore how you can use gestures in your software to improve the user experience. You can download the Intel RealSense SDK for free here and find out more about it at the Intel Developer Zone.
• This blog post is written by Softtalkblog, and is sponsored by the Intel Developer Zone, which helps you to develop, market and sell software and apps for prominent platforms and emerging technologies powered by Intel Architecture.
For the latest Intel Developer Blogs, click here.