Meta’s smart glasses with AI: More hype than help

Interacting with an AI assistant through smart glasses feels like stepping into the future, but for Meta’s Ray-Ban glasses, the promise of Live AI is marred by limited practicality and an awkward learning curve.


Meta’s new Ray-Ban smart glasses, featuring a Live AI assistant, promise a futuristic way to interact with the world. Users can ask questions about their surroundings, with the AI offering answers in real time. From recipe ideas to decorating advice, Live AI aims to be a virtual assistant that sees what you see and responds conversationally.

Despite its intriguing potential, Live AI struggles in everyday use. Its responses often state the obvious, like suggesting scrambled eggs when a fridge holds two eggs and no milk. Users also tend to forget the feature exists, reaching for a smartphone search that frequently feels more practical and efficient. Moreover, the AI’s suggestions often lack the depth needed to be genuinely useful.

Making Live AI effective requires users to master the art of asking precise, specific questions, a skill that doesn’t come naturally to everyone. This, combined with issues like misinterpreted conversations and short battery life, makes the technology feel less magical in real-world scenarios. While the glasses point to a vision of hands-free AI, they currently struggle to provide a compelling alternative to existing devices.