Apple plans to add cameras to future Apple Watch models

Visual Intelligence could allow Apple Watch and AirPods to identify locations, extract event details, and offer real-time insights using AI-powered camera technology.

The standard Apple Watch Series may feature a display-integrated camera, while the Ultra model could have one near the Digital Crown for AI-driven functionality.

Apple is reportedly planning to introduce cameras to its Apple Watch lineup within the next two years, integrating advanced AI-powered features like Visual Intelligence.

According to Bloomberg’s Mark Gurman, the standard Apple Watch Series will have a camera embedded within the display, while the Apple Watch Ultra will feature one on the side near the Digital Crown.

These cameras will allow the smartwatch to observe its surroundings and use AI to provide real-time, useful information to users.

Apple is also exploring similar camera technology for future AirPods, aiming to enhance their functionality with AI-driven capabilities.

The concept builds on the Visual Intelligence feature introduced with the iPhone 16, which allows users to extract details from flyers, identify locations, and more using the phone’s camera.
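To give a sense of the kind of on-device analysis such a feature relies on, the sketch below uses Apple's publicly documented Vision framework and Foundation's data detector to pull text and dates out of a flyer photo. It is an illustrative approximation only, not Apple's actual Visual Intelligence pipeline; the function name and its inputs are hypothetical.

```swift
import Foundation
import Vision

// Illustrative sketch: recognise text in a flyer image and surface any dates,
// roughly the kind of extraction a "get event details from a flyer" feature needs.
// Hypothetical helper; not Apple's Visual Intelligence implementation.
func extractEventDetails(fromImageAt url: URL) throws -> (lines: [String], dates: [Date]) {
    var recognizedLines: [String] = []

    // Vision text recognition runs fully on device.
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        recognizedLines = observations.compactMap { $0.topCandidates(1).first?.string }
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])

    // Foundation's data detector picks out dates (e.g. an event time printed on the flyer).
    let fullText = recognizedLines.joined(separator: "\n")
    let detector = try NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
    let matches = detector.matches(in: fullText, options: [],
                                   range: NSRange(fullText.startIndex..., in: fullText))
    let dates = matches.compactMap { $0.date }

    return (recognizedLines, dates)
}
```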

While the current system relies on external AI models, Apple is developing in-house AI technology that is expected to power these features by 2027, when the camera-equipped Apple Watch and AirPods are likely to be released.

The move is part of Apple’s broader push into AI, led by Mike Rockwell, who previously spearheaded the Vision Pro project.

Rockwell is now overseeing the upgrade of Siri’s language model, which has faced delays, and contributing to visionOS, the operating system expected to support AI-enhanced AR glasses in the future. Apple’s increasing focus on AI suggests a shift towards more intelligent, context-aware wearable devices.