Apple unveils ‘Visual Intelligence’ feature, hinting at a future AR glasses push

Apple introduced a new feature called ‘Visual Intelligence’ at the iPhone 16 event.


Apple’s ‘Visual Intelligence’ feature is exciting and seems to set the stage for future AR glasses. By letting users point the camera to identify objects, copy text, and pull up information on the go, it offers a glimpse of what could become an integral part of AR glasses.

The idea of using AR glasses to get real-time information about your surroundings without pulling out your phone is very appealing. If Apple successfully carries Visual Intelligence over to future AR glasses, it could be a significant advantage.

Given that Apple is known for refining technology before launching it, the Visual Intelligence feature on the iPhone could be an essential piece of a broader AR strategy. Building and polishing this technology now is a smart move, so that when AR glasses do arrive, they can offer a seamless, mature experience.

The potential for AR glasses is enormous. Other companies like Meta and Google have already invested in this space, so Apple will need a standout product to compete. Hopefully, by the time those glasses are ready, Visual Intelligence will be a well-developed feature that enhances the overall user experience.