Meta’s new VR hand tracking feature almost feels like you’re touching the future

Meta is testing what could become a fundamental upgrade to its Quest VR headsets: a way to tap and scroll over virtual elements with just your hands, without the need for controllers. The idea is that you can perform actions you may already know from your smartphone, such as swiping up and down a page, pressing a button to activate it, or typing on an on-screen keyboard, with just your fingers in the air.

The new experimental feature is called ‘Direct Touch’ and is part of the Quest v50 software update now rolling out. After weeks of waiting, the update finally arrived for me, so of course I turned it on right away.

With hand tracking enabled, the Quest 2 uses its outward-facing cameras to track your hands, which appear in VR as dark, hand-shaped shadows. (CEO Mark Zuckerberg’s Direct Touch video, which appears to have been captured on a Quest Pro, shows more hand and arm detail.) You can use those shadows to estimate when your hand will “touch” a menu or window in front of you. With Direct Touch, things start to scroll or light up when you “make contact.” Scrolling is jerky, but it’s usually more responsive than I expected.

However, typing with Direct Touch is worthless. When you tap any part of the UI where you can enter text, the Quest’s on-screen keyboard appears below the window, and you can “press” individual keys to spell things out. But since there’s no physical surface to rest your hands or fingers on, it’s hard to have any idea of where (or what) you’re actually typing. (Imagine the lack of feedback you get from the iPad’s on-screen keyboard, then imagine there’s no glass.) Even when I resort to hunting and pecking in VR to laboriously spell out a single word, the UI sometimes registers a different key than the one I meant. Fortunately, the keyboard does suggest words as you type, which can help in a pinch.

Poor typing and decent scrolling make the Quest web browser arguably the best showcase of Direct Touch’s strengths and weaknesses. If I mangle the spelling of a search query, the search engine will probably fix it. Scrolling up and down works well enough, as does tapping links. Weirdly, The Verge’s homepage won’t scroll past our Top Stories list in the Quest browser for some reason, but tapping one of the six stories I can see works better than I expected.

If you want to see me actually trying to use the browser, I filmed it for you:

Most of the other built-in Quest apps I’ve tried were at least usable with Direct Touch, but many apps from the Quest Store, including Meta’s Horizon Worlds VR social network, haven’t been updated to work with just your hands. They wouldn’t even open unless I had a controller. I certainly wasn’t expecting apps like Beat Saber to be better without a controller, but I was hoping I’d at least be able to mess around with them.

At this point, it’s clear why Direct Touch is called an experiment. With every poke at the air, I can’t quite trust my hand to actually “touch” a virtual piece of the Quest’s UI, so using it for more than a few minutes at a time quickly becomes frustrating. Holding my arms out in space to navigate the UI also gets tiring after a while. Meta’s other controller-free hand gestures, which involve pinching, are generally more reliable, though I find them less intuitive.

The idea of Direct Touch is extremely cool

That said, I still think the idea behind Direct Touch is extremely cool. Scrolling and tapping on virtual surfaces in my VR headset makes me feel like I’m living out some sort of sci-fi dream, even as my words per minute plummet by 99 percent and I can’t trust any of my taps to work the way I expect. And when Direct Touch works as intended, using my hands is much more convenient than using the Quest controllers. I know that’s a big asterisk, but simply putting on the headset and scrolling through something with my hands removes a lot of the friction I normally associate with setting up the Quest. (That said, because Direct Touch is so finicky, I still have to make sure the controllers are nearby.)

It’s also easy to see where this technology could go, especially if Meta’s still-years-away AR glasses actually come to fruition. While wearing those glasses out in the world, you probably wouldn’t want to carry a controller or two when you could just use your hands. And Meta’s devices may not be the only ones we operate with our hands in the air; Apple’s long-rumored mixed reality headset may let users type on on-screen keyboards, so it seems possible that Apple is exploring these kinds of interactions, too.

For now, I’ll stick with using the Quest controllers for the most part. But if I need to quickly check something on my headset, I can leave the controllers on the table and try to do it with my hands instead. It might take three times as long, but it’s a lot cooler.