Meta adds live translation to Ray-Ban smart glasses

Ray-Ban glasses can now interpret visuals and offer context-aware answers with the help of Meta AI.

Meta is enhancing its Ray-Ban smart glasses with new capabilities, including live language translation and broader AI integration.

Users can now access real-time translations in English, French, Italian, and Spanish, even without internet access if language packs are downloaded beforehand. The feature is available across all markets where the glasses are sold, aiming to break down communication barriers during travel and everyday use.

In the United States and Canada, Meta AI is gaining powerful new functionality, allowing the assistant to interpret visuals from the glasses’ built-in camera. This update lets users receive context-aware responses, such as identifying landmarks or translating menus.

Previously in beta testing, the ‘see what you see’ feature is moving to a full release. Messaging tools are also expanding, with support for Instagram in addition to WhatsApp, Messenger, and native SMS.

Aesthetic updates include new colour and lens options for the Skyler frame. Music lovers will benefit from expanded support for services such as Spotify, Apple Music, and Shazam in more countries.

Meta also confirmed upcoming launches of the glasses in Mexico, India, and the UAE, while EU users will soon gain access to Meta AI’s visual search tools.
