The promise of a game world you can touch

"Going to the Feelies this evening, Henry?"

I'm led into a room containing a bank of computers. I'm ushered, cheerfully enough, to the first station. There's a mouse, a keyboard, and something keyboardesque, but square, and without symbols on each of its black keys. On the monitor is a basic, '90s-looking floating-bubble screensaver. I'm not sure this is what I expected. I'm instructed to put my hand in the air in front of me.

As the bubbles on-screen begin to pop, I simultaneously feel the bubbles popping on my skin. My bare skin; I'm not wearing gloves or any other equipment. More precisely, I feel each individual bubble roll against my skin, pause there delicately, wobble, then vanish. It's not like a random puff of air. This isn't Pirates 4D. I hold my hand out, chuckling like... well, like a child first experiencing bubbles. It's great.

Time's up and a swift lateral bum-shift into the next chair and the next station. There are more little miracles to be found. "Am I meant to feel nervous?" I ask, "because I am, a little bit."

I put my hand out in mid-air. Onscreen, it's pretty clear what I should do. It's like an in-car entertainment system interface. Music, navigation or driving setup. I swipe my hand in the direction of "music", and music starts playing. And what's more, I feel the different panels - even down to the satisfying little clicks when a function is selected. I press buttons, delighted at how delightful pressing buttons is. And there are no buttons!

Whizzing through the demos now: onscreen arcade top-down shmup, hold out my hand, feel the ship under my palm, experience each enemy missile hit as a shudder across my skin. Next one's a bit different - a movie poster, not a game. I'm not allowed to talk about the movie (non-disclosure agreements are strong), but I can tell you that I feel sparkling electricity spreading from my wrists to my fingers, thrumming with building energy, pulsing. Then, by outstretching my fingers, I release this pent-up energy into the screen and watch in real fear and regret as the electricity zaps the little green character in front of me. "I'm so sorry, I didn't want to really hurt him!" I say. The people guiding me round nod, smile knowingly at each other. They tell me it's alright, the little green character is completely unhurt.

In other words, I'm at Ultrahaptics, in the company's offices in Bristol, learning more about how haptics can add layers of immersion to games.

Haptics are really interesting, but you could be forgiven for not knowing exactly what they are. Essentially, haptics are any technology that stimulates your sense of touch. They've become synonymous with VR experiences, but while it's true that the two technologies play very well together, they aren't always used in tandem.

The most recognizable example would have to be your mobile phone: when it buzzes in your pocket, you've got haptics to thank for how it feels. Haptics first came to consoles with the N64's Rumble Pak, and today almost every console has some kind of rumbling controller feature. The haptics in our console controllers are pretty basic iterations of the idea, but can have profound effects on your perception of gameplay. And what's more, our brains seem primed to receive this physical information and do magic things with it. In racing games, for example, notice how the haptic motors in the controller vibrate at different frequencies when you're driving around the course. Your brain takes this information and makes it into the contrasting textures of gravel and tarmac. The rumble feature, without context, certainly doesn't feel like tarmac. But when added to the visuals, the sound, the experience, it's satisfyingly realistic.
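To give a sense of just how little information the controller is actually getting, here's a minimal sketch using pygame 2's joystick rumble call. The surface names and intensity values are purely illustrative - nothing here comes from a real racing game - but the point stands: a faint, even hum for tarmac, a heavier shake for gravel, and your brain fills in the texture.

```python
import time
import pygame

# A rough illustration of how little data a controller rumble really is:
# just two motor intensities (0.0 to 1.0) and a duration in milliseconds.
# The surface names and values here are invented for the example.
SURFACE_RUMBLE = {
    "tarmac": (0.10, 0.05),   # faint, even hum
    "gravel": (0.70, 0.40),   # heavier, rougher shake
}

def rumble_for_surface(pad, surface, duration_ms=500):
    low, high = SURFACE_RUMBLE[surface]
    # pygame 2's Joystick.rumble(low_frequency, high_frequency, duration)
    # drives the controller's two motors; returns False if unsupported.
    return pad.rumble(low, high, duration_ms)

if __name__ == "__main__":
    pygame.init()
    pygame.joystick.init()
    pad = pygame.joystick.Joystick(0)   # assumes a rumble-capable pad is plugged in
    for surface in ("tarmac", "gravel"):
        print(f"Driving over {surface}...")
        rumble_for_surface(pad, surface)
        time.sleep(1.0)
    pad.stop_rumble()
```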

So, what could your brain do with more complex stimuli? Modern media has explored this idea in detail, with the novel and film Ready Player One being a great example. In the film, the player interacts with a virtual world using a full bodysuit stuffed with sensors and haptics. The idea is that the player can feel every sensation their avatar experiences in the virtual world, from heat to texture to actual physical blows. It's an enthralling concept - full immersion - and the reality is that we're edging ever closer to these haptic dreams.

And that's why I've come to Ultrahaptics, and found myself poking and prodding at invisible buttons in the air. Ultrahaptics are one of the pioneers in mid-air haptic feedback systems, which is slightly different from the technology depicted in Ready Player One. Basically, a camera tracks your hand in space, then small "speakers" in an array emit ultrasound (which is just like regular sound, only pitched higher than our ears can detect), timed so the waves converge near your hand and vibrate the air against your skin to create a range of feelings - no bodysuit needed.
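Curious what "just the right frequency" actually involves? The core trick behind this kind of mid-air ultrasound haptics is focusing: each little speaker is driven slightly out of step with its neighbours so that all their waves arrive at one point in mid-air at the same moment, and the pressure at that point is then pulsed at a rate slow enough for skin to notice. Below is a rough numerical sketch of the focusing geometry in Python - the array layout, carrier frequency and modulation rate are my own illustrative assumptions, not Ultrahaptics' actual numbers.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at room temperature
CARRIER_HZ = 40_000      # a typical ultrasonic transducer frequency (assumed)
MODULATION_HZ = 200      # on/off rate that skin can actually feel (assumed)

# An illustrative 16x16 grid of transducers, 1 cm apart, lying flat in the z=0 plane.
xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
transducers = np.stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)], axis=1)

def focusing_phases(focal_point):
    """Phase offset (radians) for each transducer so that its wave
    arrives at focal_point in step with everyone else's."""
    wavelength = SPEED_OF_SOUND / CARRIER_HZ              # ~8.6 mm at 40 kHz
    distances = np.linalg.norm(transducers - focal_point, axis=1)
    # Emit each wave "early" in proportion to its travel distance so the
    # peaks line up at the focus; phase wraps around every wavelength.
    return (-2 * np.pi * distances / wavelength) % (2 * np.pi)

# A focal point 20 cm above the centre of the array - roughly where a
# hand hovering over the device might be.
phases = focusing_phases(np.array([0.075, 0.075, 0.20]))
print(phases.reshape(16, 16).round(2))

# The focused 40 kHz pressure is far too fast to feel directly; pulsing it
# at around MODULATION_HZ is what the skin's receptors actually register.
```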

It's smooth as you like during gameplay, but apparently this hasn't always been the case. Adam Harwood, the team lead for capabilities, tells me that in the tech's first version, before he joined the company, it took a day to do all the mathematics needed to render just a single point in space. "A lot of optimisation has gone into the product. In the latest platform, Stratos, all of that computation happens on the board itself, so it's just a couple of milliseconds on top of whatever else you have running."

The sensations in the demo are impressively varied, and it's strangely comforting to know that it's my brain, not just the fancy tech, that does a lot of the heavy lifting. Tessa Urlwin, who works in marketing and PR at Ultrahaptics, tells me that people who have used the equipment report all sorts of sensations that are physically impossible. She describes how, while playing a demo involving fireballs, her own hands started feeling hot, even though the technology doesn't emit heat. "It's a really powerful effect," Harwood adds. "If two senses say different things, it can be uncomfortable, like car-sickness, or one sense can win out. We tend to believe our eyes over other senses, so our hands sometimes feel what our eyes are telling them to feel."

As well as the range of sensations available, I'm interested in their strength, too. When I ask, Harwood smiles self-consciously. "We haven't been able to do pain. I did briefly look into it, because there are some interesting potential applications, like in scientific research. But we haven't tried very hard on this subject yet."

On a lighter note, I'm pleased to hear that the team's also been working on using the technology to help people with different physical needs. Urlwin explains: "We've been doing some accessibility work with 'Accessible Ollie', an adapted bus. We made a bus stop button that came to you when you put your hand out. It was great. We had Stevie Wonder come and use it."

And when I ask about games? Actual play-at-home games? The self-conscious smiles return. "We have some ongoing partners," Urlwin says, "but watch this space, because there are lots of things we're not allowed to talk about."

"We're currently applying for [grant funding] for a project with a very large European game developer," Harwood adds.

And it's here that I should admit something. Throughout this visit, Urlwin and Harwood are both warm, engaging people and they are clearly putting a lot of effort and thought into their responses. And I'm not a rude person, generally. But towards the end of our interview, my thoughts keep being pulled back into the demo room next door. I apologise if I seem distracted. But that final demo!

For the final demo, I am fully immersed in a VR headset. There's a table in front of me, a spellbook and a crystal ball. I have to mix ingredients and hold my hands in front of the crystal ball to power up the spell. Then I can use the spell to kill the horrible little insects that crawl over the desk in waves. Sometimes it's an electricity spell, sometimes it's fire. It's totally entrancing - and surprisingly emotional. My brain takes this extra layer of haptic input and from it gives me joy, and delight, and even disgust. VR plus mid-air haptics are clearly more than the sum of their parts. I've never been happier to turn a page, to select a bottle, to mix a potion. Could this be the end of 'hold x to do y'?

Obviously, physical touch doesn't always create emotion towards an object. But numerous studies have shown that physical touch is a fundamental tool our brains use when forming emotional relationships. The World Health Organisation advises that babies born by C-section be given skin-to-skin contact with their mothers as soon as is safely possible, to start maternal bonding. Friendly dogs are routinely brought to care homes because it's widely acknowledged that stroking a pet is good for your mind and body. Doctors recently researched the optimal speed to stroke a baby's skin to reduce pain (it's 3cm a second, and it really does reduce brain activity in pain centres). When something hurts, we put our hands on it, instinctively. I can't help but wonder what incredible therapies could be unearthed by mixing calming, mindful gameplay with soothing touch sensations.

But I'm a bit worried, too. Imagine spending hours interacting with a haptic-enabled virtual pet. Your eyes are fed VR information on what the pet looks like. Your ears pick up what your pet sounds like. And now, your skin receives stimulus that tells your brain what it feels like to touch. There aren't many senses left to remind your brain that your virtual pet is indeed virtual. Your physically-enhanced interaction with this pet has piggybacked onto thousands of years of evolutionary wiring that connects our bodies and our brains to the world around us. How would it feel if that pet were programmed to show hunger, or distress? If this pet were to be placed in an even mildly-threatening gameplay situation? But thankfully, on balance, these worries are more than offset by the myriad of positive possibilities that haptics bring.

In an age of story-driven blockbusters like The Last of Us, The Witcher and Red Dead Redemption, we're used to the fact that games can reach out of the screen and make us really feel something. But someday soon, if haptics are embraced fully by the industry, games might be able to reach out of the screen and make us really really feel something.