I Witnessed the Future of Smart Glasses at CES. And It’s All About Gestures


In a corner of the bustling showroom floor at CES 2025, I felt like an orchestra conductor. As I subtly waved my arm from side to side, notes rang out on the cello displayed on the giant screen in front of me. The faster I moved my arm, the faster the bow slid across the strings. I even earned a round of applause from fellow booth attendees after a particularly speedy performance.

This is what it felt like to use the Mudra Link wristband, which lets you manipulate devices using gesture controls. Motion controls aren’t new; I remember using touchless controls as far back as 2014 with devices like the Myo armband. What’s different now is that gadgets like these have a bigger reason to exist thanks to the arrival of smart glasses, which were seemingly everywhere at CES 2025.

Startups and major tech firms alike have been trying to make smart glasses happen for more than a decade. However, the arrival of AI models that can process speech and visual input simultaneously has made them feel more relevant than ever. After all, digital assistants could be much more helpful if they could see what you’re seeing and answer questions in real time, much like the idea behind Google’s Project Astra prototype glasses. Shipments of smart glasses are expected to grow by 73.1% in 2024, according to a September IDC report, further indicating that tech-equipped spectacles are starting to catch on. 

Read more: Nvidia’s CEO Explains How Its New AI Models Could Work With Future Smart Glasses


Last fall, Meta demonstrated its own prototype pair of AR glasses, called Orion, which are controlled by gestures and a neural-input wristband. At last year’s Augmented World Expo, a conference dedicated to augmented reality, other startups showed off similar experiments.

At CES, it became clear that companies are putting a lot of thought into how we should navigate these devices in the future. In addition to the Mudra Link bracelet, I came across a couple of other wearables meant to work with glasses. 

Take the Afference Ring, for example, which applies neural haptics to your finger to provide tactile feedback when using gesture controls. It’s intended for devices like smart glasses and headsets, but I got to try a prototype of it paired with a tablet just to get a feel for how the technology works. 

In one demo, I played a simple mini golf game that required me to pull my arm back to wind up and then release to launch the ball. The more I pulled back, the stronger the haptics on my finger felt. The experience of toggling brightness and audio sliders was similar; as I turned up the brightness, the sensation on my finger felt more prominent.


Afference’s ring provides haptic feedback on your finger. 

Nic Henry/CNET

It was a simple demo, but one that helped me understand how companies may approach applying haptic feedback to menus and apps in mixed reality. Afference hasn’t mentioned any specific partners it’s working with, but it’s worth noting that Samsung Next participated in Afference’s seed funding round. Samsung launched its first health-tracking smart ring last year and announced in December that it’s building the first headset to run on Android XR, the newly announced platform for upcoming mixed reality headsets.

The Mudra Link wristband works with the newly announced TCL RayNeo X3 Pro glasses, which are launching later this year. I briefly tried the Mudra Link wristband to scroll through an app menu on the RayNeo glasses, but the software wasn’t finalized yet. 

I spent most of my time using the wristband to manipulate graphics on a giant screen used for demo purposes at the conference. The cello example was the most compelling demo, but I was also able to grab and stretch a cartoon character’s face and move it around the screen just by waving my hand and pinching my fingers.

Halliday’s smart glasses, which were also unveiled at CES, work with an accompanying ring for navigation. Though I didn’t get to try the ring, I used the glasses briefly to translate language in real time, with text translations instantly showing up in my field of view even on the noisy showroom floor.


The Halliday smart glasses put a small screen in your field of view, and you can navigate the device with a companion ring. 

James Martin/CNET

Without gestures, there are two primary ways to interact with smart glasses: touch controls on the device itself and voice commands. The former is ideal for quick interactions, such as swiping through a menu, launching an app or dismissing a call, while the latter is useful for summoning and commanding virtual assistants.

Gesture controls could make it easier to navigate interfaces without having to bring your hand up to your face, speak out loud or hold an external controller. However, there is still a degree of awkwardness that comes with using gestures to control a screen that’s invisible to everyone but the person wearing the glasses. I can’t imagine waving my hands around in public without any context. 

Meta is already moving toward gesture-controlled glasses, and its CTO, Andrew Bosworth, recently told CNET that gestures would most likely be needed for any future pair of display-enabled glasses.

If CES is any indication, 2025 is shaping up to be a big year for smart glasses — and gesture control will undoubtedly play a role in how we navigate these new spatial interfaces in the future. 





