
How Apple’s Visual Intelligence could change everything

iPhone 16 using Visual Intelligence.
Photo: Apple

Apple CEO Tim Cook has a well-established habit of dropping subtle hints about where the company is headed. This time, the breadcrumbs all point toward Visual Intelligence. And the impression they leave is of a company preparing to reshape how humans interact with the world around them.

Cook established his track record for dropping hints years ago. In 2013, he talked up the coming explosion in sensor technology — well before the Apple Watch debuted. And before the Vision Pro launched, he spoke extensively about the promise of augmented and virtual reality. Now he appears to be doing it again, this time with Visual Intelligence.

Visual Intelligence is Apple’s AI-powered feature that lets devices “see” the physical environment and respond to it intelligently. Currently available on iPhone 16 models and newer (and iPhone 15 Pro models via iOS 18.4), it allows users to point their cameras at objects, text or places and get contextual information in return — essentially a reverse image search and AI query system powered under the hood by OpenAI’s ChatGPT and Google search. It can read and summarize text, identify objects, translate languages and more.

But as Bloomberg‘s Mark Gurman reported in his Power On newsletter Sunday, Apple is actively developing its own visual models to reduce its dependence on third-party services. The implications of that shift — and the hardware Apple is planning to pair it with — could be significant.

Tim Cook’s not-so-subtle hints

Apple’s CEO is “very happy with the collaboration with Google” on AI.
Image: Apple/Cult of Mac

Cook has singled out Visual Intelligence in two high-profile places. On Apple’s holiday quarter earnings call, he called it one of Apple Intelligence’s most popular features, describing it as something that “helps users learn and do more than ever with the content on their iPhone screen.” He then brought it up again at an all-hands meeting with employees, touting Apple’s 2.5 billion-device installed base as a “huge advantage” in AI.

Gurman sees a clear pattern. He notes that Cook wouldn’t be championing a feature so publicly if Apple weren’t planning to significantly accelerate work in that area. The precedent supports that view. Cook’s early enthusiasm for sensors preceded the Apple Watch, and his AR commentary foreshadowed the Vision Pro.

Visual Intelligence, it seems, is the prelude to something much bigger.

3 new wearable devices

Apple is now accelerating development of three new wearable devices as part of a shift toward AI-powered hardware, according to Bloomberg: smart glasses, a pendant that can be pinned to a shirt or worn as a necklace, and AirPods with built-in cameras for expanded AI capabilities. All three are being built around a smarter, more capable version of Apple’s Siri voice assistant that relies on visual context to take action.

Smart glasses

Apple reportedly placed a high priority on bringing smart glasses to market.
AI concept: ChatGPT/Cult of Mac

Smart glasses are the most ambitious of the three. They will reportedly feature an advanced camera system — a high-resolution camera for capturing photos and videos, and a second camera that feeds visual information and environmental context to Siri.

The glasses will support interacting with Siri, making phone calls, listening to music, taking photos and capturing video. Users will be able to look at an object and ask questions about it, and get detailed navigation directions while walking. Unlike Meta’s Ray-Ban smart glasses, which use a single camera that switches between functions, Apple will dedicate a camera to each function — a distinction Apple employees reportedly view as a key differentiator. Production could begin as soon as the end of this year, with a launch to follow in 2027.

AI pendant

Apple’s got an ambitious plan for new AI wearables.
Illustration: Midjourney/Cult of Mac

A possible AI pendant is the wildest entry in the lineup, and perhaps the most intriguing. It will function as an always-on camera for the iPhone that also includes a microphone for Siri input, with some Apple employees already calling it the “eyes and ears” of the phone.

Unlike the ill-fated Humane Ai Pin — which tried and failed to replace the smartphone entirely — Apple’s pendant will serve as a companion device. It will offload processing to the iPhone rather than functioning as a standalone device.

Gurman makes the distinction clear: The Humane Ai Pin didn’t fail because the form factor was inherently bad. It failed because its AI was slow, its battery life was poor, and it overreached by trying to replace a device people love. Apple won’t make that mistake.

Camera-equipped AirPods

Next-gen AirPods Pro could come with built-in cameras that see around the user.
Photo: Apple

Beefed-up AirPods round out the trio of Apple AI devices, and they could be the product that brings Visual Intelligence to the widest audience first. A version with built-in cameras is expected to arrive in late 2026.

Those cameras would be low-resolution or infrared, intended less for photography and more for giving Apple Intelligence a view of the world. Given how many people already wear AirPods for hours each day, these could be the most practical near-term entry point for persistent visual AI.

What could it actually do?

The practical applications of Visual Intelligence range from the mundane to the genuinely transformative. At the simple end, you might point your glasses or pendant at a plate of food and instantly get a breakdown of the ingredients and nutritional content. A step further, and you could receive turn-by-turn walking directions that reference real-world landmarks — “turn left at the coffee shop on the corner” — rather than abstract distances.

The technology could also surface context-aware reminders. You could walk up to your car and be prompted to check your tire pressure, or enter a grocery store and be reminded of what you need.

For people with visual impairments, the possibilities go deeper. Gurman flagged Meta’s reported plans to add facial recognition to its smart glasses, noting that while privacy concerns are real, properly implemented recognition of people within your own contacts could be a genuine win for accessibility.

The road ahead isn’t smooth

The wait for AI-enabled Siri could stretch past iOS 26.4.
Image: ChatGPT/Cult of Mac

Real technical and software challenges stand in the way of Apple achieving its vision for ambient AI. Miniaturization remains a constant constraint — fitting cameras and the necessary electronics into AirPods or a lightweight glasses frame is no small engineering feat.

And all three devices are ultimately dependent on a next-generation Siri that Apple has not yet delivered. The more advanced chatbot version of Siri won’t come until iOS 27, and will rely on Google-developed AI models.

There’s also the question of privacy. Cameras embedded in everyday wearables — glasses you wear all day, earbuds, a pendant around your neck — raise legitimate questions about consent and surveillance that Apple, regulators and the public will all need to grapple with.

A third act in the making

Of the new hardware categories Apple has launched since the iPhone, the Apple Watch succeeded spectacularly, evolving into a genuine health device over time, while the Vision Pro is still searching for an audience. The AI wearables now taking shape appear to represent another new category for which Cook has been quietly, methodically laying the groundwork.

Whether smart glasses, camera AirPods and an AI pendant resonate with consumers the way Apple Watch eventually did is impossible to say at this stage. But the strategic logic is clear. With more than 2.5 billion active Apple devices in users’ hands, and a growing suite of AI capabilities, the company has both the infrastructure and the incentive to put intelligence into everything people wear.

Visual Intelligence — today a relatively modest feature that leans on other companies’ technology — is the seed of that ambition. What Apple grows from it is the story worth watching.
