“Apple is all-in on augmented reality. But where will it lead?” That’s a pretty standard view of Apple’s experiments with visual AR, aka overlaying virtual objects onto the real world, via the iPhone’s camera and screen.
But Apple is already providing a fully immersive AR overlay onto the real world, to millions of users — only it isn’t using a screen. AirPods are augmented reality. They are also a part of a new computing paradigm that Apple snuck into the world without telling anybody. This paradigm currently consists of AirPods, iPhone, Apple Watch and the HomePod. And it is as discreet and low-key as it is important.
PAN: The ultimate in personal computing
There’s an established concept known as the PAN, or personal area network. According to Techopedia, the term refers to “the interconnection of information technology devices or gadgets within the environment of an individual user (typically within 10 meters or 33 feet).” That’s Bluetooth range. PANs include all kinds of devices, including laptops. But today I want to focus on the iPhone, the Apple Watch and AirPods and how they fit into Apple AR.
The Apple PAN
Many people already use these three Apple devices in combination every day. They carry the brain (the iPhone) in their pocket or bag, a secondary screen on their wrist, and the speakers in their ears. The tasks you can accomplish with this setup aren’t much different from what you can do with an iMac, but because of the physical setup, there’s one huge difference — the PAN is part of the world around you, and not contained inside a glass and aluminum box.
This is Apple augmented reality, and it comes from two linked directions. One is that your devices are aware of the world around you. Your iPhone and Apple Watch both know exactly where you are. They know which direction you’re pointing in. Between them, they know if you are in a noisy or a quiet environment (assuming you’re wearing a recent Apple Watch with the Noise app in the latest version of watchOS). These devices are connected to the internet, so they know if it’s raining, or hot, or if you’re stuck in traffic.
The Apple Watch also measures your heart rate, your step count, how well you sleep, and whether you spend the whole day slumped in a chair.
Apple ambient computing
The other direction is the one that gives you, the user, information. Ever since I got my Apple Watch, my phone stays in my pocket most of the time. If I’m on my bike, I can get spoken directions through one AirPod (wearing two while riding seems too dangerous to me). In iOS 13, Siri can read out incoming messages automatically. When following map directions, the watch uses coded haptic taps to tell you whether to turn left or right. At any time, you can glance at your wrist to see instantly relevant data. If you’re still confused by, say, the map directions, you just pull out your iPhone and use that.
By making our devices ultra-portable — and connecting them so deeply that they act more like one device with multiple parts than multiple devices networked together — Apple has stealthily wrapped us in wearable, whole-body computers. It’s similar to how cellphones led to about half of all humans carrying powerful pocket computers.¹
Apple AR: A device for every task
These days, I only pull out the iPhone for photos, to write more in-depth messages than the watch allows, or to read something while I’m in the metro, on the bus or wherever.
If I really want to do some serious reading, or writing, or more in-depth photo editing, I’ll use an iPad. And for a few tasks, the Mac is still a better fit.
The Apple Watch fits into this scenario in the exact same way. A wrist-mounted computer is better at certain things, and worse at others. You’d never edit a photo on an Apple Watch, any more than you’d monitor your heart rate with an iPad, or pull out a MacBook to Apple Pay for groceries at the supermarket. As these devices improve, the range of tasks shifts — the iPad is now a credible MacBook replacement for many people — but the hierarchy of suitability remains.
As the Apple Watch gets more powerful, and more independent, perhaps you’ll only need your iPhone for tasks that require a bigger screen or a better camera. AirPods could eventually assume some of the roles of the watch, streaming audio directly from the internet or running audio versions of the Reminders and iMessage apps, for example. And surely Apple will add new devices to the Apple PAN.
And all the while, some features flow between devices, depending on context. Notifications appear on the most relevant device. Apple really nailed this part of the experience. Alerts and notifications almost always appear on the correct screen (or the correct speaker).
Which brings us to one big question…
Where does visual augmented reality fit into Apple AR plans?
It’s pretty clear that Apple is working on visual AR. Every keynote brings a new demo featuring games that project objects onto the real world. And Apple’s ARKit — the suite of tools that lets developers more easily add augmented reality to their apps — grows ever more impressive. The iPhone and iPad can render color, lighting and shadows in real time, so any projected object really does look like it exists in real space.
But why? It looks good, for sure, but so far we’ve seen no killer use for it. Maybe Apple will make glasses. But who would wear them? A watch is one thing, but Apple Glasses? And for AR, those glasses need cameras. That’s going to be a hard sell, privacy-wise.
I don’t see it. But neither do I have any idea why Apple would be so focused on great AR unless it had something else coming. Ambient visuals — information projected over the real world — are a clear next step for ambient computing, but how will it work? I’m excited to see how Apple manages to surprise us here. Maybe there’s some amazing new Apple AR application that nobody else has thought of yet.
If it’s just a pair of gimmicky AR glasses that show Pokémon scampering about city streets, I’m not buying it. But if we’ve learned anything over the years, it’s that Apple knows how to surprise us. And if AR really is as important to Apple as it seems, we’re in for something both subtle and astonishing.
¹ Worldwide, there are 7.7 billion cellular subscriptions. According to the same source, 3.6 billion people have internet connections. An estimated 5 billion people have mobile devices, of which roughly half are reckoned to be smartphones.