The Apple iWatch and Google Glass are both coming soon, apparently.
We don’t have all the details on either product. And we can’t even be 100% sure that the Apple wristwatch is going to happen at all. But most knowledgeable tech fans are expecting both and looking forward to seeing, buying and using them.
Excitement is warranted. No, I mean serious, pure geek joy is definitely called for. But not because of the iWatch and Google Glass products themselves.
There’s a much, MUCH bigger reason to be excited.
The single most culture-changing moment in the history of consumer electronics happened on June 29, 2007.
That day, of course, was the day the Apple iPhone first shipped.
It’s hard to imagine the world before the iPhone. More than 99.99% of the population had never seen a multi-touch phone, multi-touch tablet, multi-touch eReader or any other mobile device with on-screen keyboards and buttons, app stores, touch gestures and all the rest.
Today, nearly all major phones are multi-touch devices, and the tablet market is dominated by multi-touch tablets, which now sell in unit volumes equal to roughly 50% of the PC market. Gaming controllers, dedicated digital picture frames and even car dashboards are coming out with multi-touch gesture control. Microsoft Windows is now a hybrid desktop and multi-touch operating system. And OS X is multi-touch friendly, with Magic Trackpad gestures that mimic the iPad’s and an interface that is clearly evolving toward touch.
Understand that none of this would exist if Apple hadn’t launched the iPhone.
Research for multi-touch devices had been going on since the 1980s and possibly before. University and computer research labs had been developing multi-touch user interface ideas for many years before the iPhone hit.
Commercially available multi-touch devices existed long before the iPhone, including the Microsoft Surface table (the original tabletop product that is now called PixelSense).
In fact, Apple wasn’t even first to market with a multi-touch mobile phone. The LG Prada KE850 was being demonstrated before Steve Jobs even announced the iPhone.
But all this research and both these products didn’t do jack squat for the vast majority of gadget users. All this multi-touch activity was taking place behind closed doors and on the fringes of the market. Phones, including the pre-release Android prototypes, had tiny screens and physical keyboards, and the user interfaces worked more like PCs, with icons and menus and file management. This is what mobile phones looked like before Apple mainstreamed multi-touch.
Apple didn’t invent the multi-touch device — not by a long shot. They did invent something far more important — the multi-touch device market.
Through Apple’s unique ability to make thousands of really great decisions about design, engineering and marketing, the company brought the multi-touch user interface out of the labs and out of the shadows and into the bright light of mainstream, everyday life.
Today, the iPhone is a great phone, and an important one. But the iPhone by itself is unimportant in comparison to the multiple industries that have been transformed by Apple’s mainstreaming of multi-touch — including the phone industry.
The world has been transformed in favor of multi-touch interfaces thanks to the iPhone. But if you hunted down and destroyed every iPhone, that world would still be transformed.
So here come the Apple iWatch and Google Glass
Geeks who pay attention to product rumors and alpha products are really pumped about Apple’s coming wristwatch and, for lack of a better term, Google’s in-developer-preview augmented reality glasses. (We learned this week that Google Glass will be compatible with the iPhone.)
These very different products both fall into the “wearable computing” category.
Like the multi-touch interface, “wearable computing” technology has been in development for literally decades, in university and industry labs. “Wearable computing” gadgets, too, have been on the market, but like multi-touch phones before the iPhone, on the fringes where the larger culture is unaffected.
Dozens or hundreds of companies have tried, without success, to get a significant number of buyers and users excited about their wearable computing products.
The probability of an Apple wristwatch and the apparent certainty of Google Glass mean something much more than two new toys for geeks to play with. They mean hundreds of new toys.
It means wearable computing is about to go mainstream.
We’re on the brink of a world in which not only will Apple sell a smart watch that works with the iPhone and probably other Apple devices, but dozens or hundreds of companies will make such watches.
Not only will Google sell augmented reality glasses, but many companies will.
And, more importantly, a majority of consumers will buy augmented reality glasses. They’ll also buy wearable computing prescription glasses, wearable computing shoes, wearable computing necklaces, wearable computing shirts.
To me, the gadgets themselves are nice, but the sauce that makes them possible — stuff like Siri, Google Now and a hundred other supporting software and online innovations — is what will really change everything.
It’s these supporting technologies that will let us simply talk, without pulling a device out of our pockets, and get answers, directions, information and, eventually, advice, insights and suggestions. It’s the artificial intelligence on a remote server somewhere, plus a new and now-unimaginable industry in wearable computing services, that will make the iWatch and Google Glass awesome. And those same technologies will make many other products awesome.
The thing is, they’ll make life awesome.
The mainstreaming of wearable computing is a transformative event in the history of human culture. It means a far more seamless integration between man and machine.
Mainstream wearable computing means we will upload our personal memories to the cloud to be retrievable instantly. It means the ability to “lifelog” — to record every moment of every day, plus the ability to rewind and have instant replay of our personal experiences and a literally photographic memory. It means the ability to share your personal experiences (sights, sounds and more) with others in a way that makes them feel like they’re experiencing it, too. It means the end of knowledge as something valuable.
All this will happen eventually. And when it does, it will be trivial to trace the transformation back to the launch of the Apple wristwatch and Google Glass — the two products most likely to flip that switch and mainstream wearable computing devices.
So let’s all get excited about the coming ability to buy wristwatches from Apple and glasses from Google.
What really matters isn’t so much two awesome new products. What really matters is the amazing new world these products will bring into existence.