A new technology that lets players feel and interact with shapes and textures made from thin air is ready to help game designers make their worlds more tangible.

Getting to grips with UltraHaptics

For all that games have achieved in their history, they continue to serve the human senses with significant prejudice. While the eyes and ears of the player are catered for absolutely, the other tools we use to understand the world are largely ignored.

The reasons are obvious: catering for the senses of touch, taste and smell is practically and physically troublesome.

However, serving the ability to feel – otherwise known as somatosensation – is becoming increasingly interesting to developers, particularly as the post-Oculus vision of virtual reality dawns.

Thus far players’ experience of haptics is typically limited to device vibration, be it in control pads or through touch screens. Applied wisely, traditional vibration can serve games well, but in the immersive world of VR, where it is so easy to disconnect a player from the experience, something more is needed.

It’s the reason variously expensive and bulky physical haptics solutions have emerged, often in the form of gloves or suits that help players not only see and hear virtual worlds, but in some way, feel them. Yet at a time when there are already questions around just how many consumers will be willing to invest in a VR headset, the fortunes of such concepts in the productised realm remain a mystery. Even the most affordable of such solutions have struggled; see the ARAIG haptics suit, which fell some way short of its $900,000 Kickstarter goal.


What’s needed is an elegant, workable solution that doesn’t intimidate or financially exclude ordinary players; a concept Bristol outfit UltraHaptics may have nailed with a technology that uses ultrasound to form tangible shapes and textures from thin air, which the user can feel and interact with without the need for any worn equipment.

"Virtual reality is getting more and more immersive, and sometimes you can really lose yourself in the games," offers UltraHaptics co-founder and CTO Tom Carter, by way of introduction. "It’s becoming completely believable. But now that the visuals and the audio are that good, the jarring moments when you are using virtual reality are when something really obviously not real happens."

Carter, who took the UltraHaptics technology concept from undergraduate computer science project to a PhD and on to a functioning company, points to a problem that will be familiar to many who have tried virtual reality.

With a headset strapped to your face, you lift your real hands in front of your eyes. In the virtual world, your avatar’s hands stay firmly put. It’s a problem being solved by various tracking technologies, but even then there’s a challenge.

"The next problem is that now you can see your hands in virtual reality, but when you reach out and touch something you can’t feel it," states Carter. "That will be the next fall down of immersion, as far as I can see."

UltraHaptics’ technology, as such, is Carter’s retort to that problem. He first explored the concept when considering a research topic for the closing months of the aforementioned undergraduate course at Bristol University. That was around 2011, when he partnered with professor Sriram Subramanian, a human-computer interaction specialist with an innovative idea about the potential of ultrasound. Over time Subramanian and Carter worked on the technology, developing and iterating until they came up with the hardware currently causing something of a stir at games industry conferences, technology expos, and even a few lucky studios.

But how does it work?


"We use ultrasound to create vibrations on your hands’ skin," says Carter. "So you can hold your hand out to interact with any kind of device or game. What we do is, we have a small collection of ultrasound speakers and we focus their sound waves onto your hand in really precisely controlled ways. We can create different shapes and different patterns, and also create different vibrations so you can feel different textures and different sensations."
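Focusing an array of speakers "in really precisely controlled ways" is, at heart, a phased-array technique: each transducer is driven slightly out of phase with its neighbours, so that every wave arrives at the chosen focal point in step and the pressure there is strong enough to feel. The sketch below is purely illustrative – it is not Ultrahaptics’ code, and the grid geometry and 40 kHz frequency are assumptions based on typical airborne ultrasound hardware – but it shows the core calculation:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C
FREQUENCY = 40_000.0    # Hz; 40 kHz is typical for airborne ultrasound transducers

def phase_delays(transducers, focal_point):
    """For each transducer position (x, y, z) in metres, return the phase
    offset (radians, wrapped to [0, 2*pi)) its wave accumulates travelling
    to focal_point. Driving each transducer with the compensating (negated)
    phase makes all waves arrive at the focal point in step."""
    wavelength = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm at 40 kHz
    delays = []
    for tx in transducers:
        distance = math.dist(tx, focal_point)
        delays.append((2 * math.pi * distance / wavelength) % (2 * math.pi))
    return delays

# A hypothetical 4x4 grid of transducers spaced 1 cm apart,
# focusing on a point 20 cm above the centre of the grid.
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
delays = phase_delays(grid, (0.015, 0.015, 0.2))
```

Modulating the focal point’s position or strength over time is what then produces the different vibrations, textures and sensations Carter describes.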

The build tested for this feature featured a slab of PCB approximately eight inches square, covered with a neat grid of fingertip-sized ultrasound ‘speakers’ – ones very similar to the piezoelectric type used to produce the grainy ultrasound scan images expecting parents pull so delightedly from wallets.

In the tested rig, the UltraHaptics panel – or, to be more accurate, the transducer array – sat in front of a laptop on a desk, and was rigged up to a Leap Motion gesture sensor. As for the effect? On first trying UltraHaptics it feels close to magical. Playing a Pong clone on-screen, using the hand to control the paddle by gesture, the sensation was strikingly clear: you could feel the ball make ‘contact’ with your hand. Passing the hand through a force field, there was no doubting its presence, complete with an upward current of movement.

And Ultrahaptics has other demos too: textures you can feel, shapes that hover, and even interactive invisible UI – dials and switches made from thin air that the user can both feel and operate, as long as appropriate gesture-tracking hardware is present.


The ‘objects’, of course, are made of nothing more than vibrations in the air. You can certainly feel them, yet their presence is delicate. They are suggestions of form, if you like, rather than rigid shapes – but feel them you can.

"If you want to punch a wall in virtual reality our technology isn’t going to stop your hand moving at full force," says Carter. "You’re actually pretty strong. But what we can do is give you that tactile feedback to let you know that you’ve touched an object or passed through it. And in terms of light objects in the virtual world, if they have physics and you touch them and they move, we can do things around that."

That alone presents game designers and artists with significant opportunities for innovation. There’s the chance to communicate with the player through touch, letting them know they are close to a goal through ultrasonic vibration, for example. Or perhaps a way to improve the sensation of gesture controls by adding the tangibility so conspicuously absent from many Kinect experiences. And there’s the opportunity to let players interact through that ‘invisible UI’.
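The goal-proximity idea is simple to express in design terms: map the tracked hand’s distance from a target to a haptic intensity. This is a hypothetical design sketch, not any real device API – the function name, range and linear fade are all assumptions for illustration:

```python
def proximity_intensity(distance_m, max_range_m=0.3):
    """Map a hand's distance from a goal object (metres) to a haptic
    intensity in [0.0, 1.0]: full strength at contact, fading to nothing
    at max_range_m. Illustrative sketch only; the fade curve and range
    are arbitrary choices a designer would tune."""
    if distance_m >= max_range_m:
        return 0.0
    # Linear fade; a steeper curve could give a sharper "you're close" cue
    return 1.0 - distance_m / max_range_m
```

A game loop would feed this value, each frame, to whatever call the haptics API exposes for setting focal-point strength.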

But for Carter and his colleagues, there’s no rush to tell developers how to use the tech; they foresee a period of experimentation, much as game design conventions have emerged organically in previous hardware generations. The grammar of ultrasound as a means of sensory control and communication must now be established.

"We want to enable ideas, rather than dictate how it should be used," confirms Carter.


For now, the hardware is making its way to a few select collaborators. An API to program the device and communicate its feedback is established, and what Carter calls a "workable" Unity binding is close to ready. There are also tools underway that should allow designers and artists to rapidly prototype with UltraHaptics, and options with other game engines and middleware are being explored.

The team is also pondering the potential in terms of the product’s consumer form. As it currently exists, the UltraHaptics transducer panel could be used as a coffee-table or even desktop platform. But there is also talk of the possibility of a smaller, longer-range device that could sit on or by the TV, presumably jostling with Wii U sensor bars, PlayStation Cameras and the like. There’s even talk of alternative roles, perhaps in the kitchen, where a cook’s food-covered hands are too messy to use physical controls.

And, though one of UltraHaptics’ strengths is that it doesn’t encumber the player with on-body physical peripherals, Carter sees that a wearable version may work, either as a modest addition to a VR headset, or even as a distinct, small independent device.

"For now the main challenge for us is to establish exactly how the technology is going to be used in different devices, and, for example in gaming, just what the interactions should be," elaborates Carter. "Are we going to try and simulate every object and every texture in a game’s whole world, or will people interact through something like feedback to gestures within the virtual world? We have to work out what those dynamics are, and how this technology fits together perfectly."

The months ahead, then, are reserved for iteration, moderation and experimentation. Exactly how – and indeed if – UltraHaptics will become part of modern gaming remains something of a mystery, but the platform certainly boasts great potential.

It is absolutely functional, striking to use, and in terms of introducing a new sensory element to gaming, a fascinating opportunity for today’s developers.

