Real-time guidance for visually impaired users

NaviSense uses AI to help visually impaired users find objects in real time, offering vibration and audio guidance for a more intuitive experience.

Researchers at Penn State have developed a smartphone application, NaviSense, that helps visually impaired users locate objects in real time using AI-powered audio and vibration cues.

The tool relies on vision-language and large language models to identify objects on the fly, so it does not need preloaded 3D models of the items it searches for.

Tests showed it reduced search time and increased detection accuracy, with users praising the directional feedback.
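In its simplest form, directional feedback of the kind users praised could map an object's horizontal position in the camera frame to a coarse spoken or vibration cue. The sketch below is purely illustrative and is not NaviSense's actual implementation; the function name and thresholds are assumptions:

```python
def direction_cue(center_x: float, frame_width: float) -> str:
    """Map a detected object's horizontal position in the frame to a
    coarse guidance cue. Thresholds are illustrative assumptions."""
    ratio = center_x / frame_width
    if ratio < 0.4:
        return "move left"
    if ratio > 0.6:
        return "move right"
    return "straight ahead"

# Example: an object detected left of centre in a 1080-pixel-wide frame
print(direction_cue(300, 1080))  # "move left"
```

A real system would combine cues like this with depth estimates and haptic patterns, but the core idea of translating a detection into an actionable instruction is the same.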

The development team continues to optimise the application’s battery use and AI efficiency in preparation for commercial release. Supported by the US National Science Foundation, NaviSense represents a significant step towards practical, user-centred accessibility technology.
