How iPhone 4s and iOS 5 Reveal the Mac of the Future

Planted in your shiny new iPhone 4s and in iOS 5 are the seeds of the Mac of the future, and indeed the future of all computers. You can find them if you know where to look. (And I’ll tell you where below.)

It’s not supposed to be this way. In the Microsoft world, at least, new technology starts at the top and “trickles down” from bigger and more powerful computers over time to mobile devices and eventually cell phones. If you’re focused on the machines, this makes sense, as larger computers are more capable of handling powerful new features.

But if you’re focused on the user, as Apple is, this approach doesn’t make sense. Apple has developed what I believe is a unique strategy: introduce new interfaces and new ways to interact with computers and the Internet on the smallest devices first, then scale them up over time, eventually ending up as desktop features.

This started with the iPhone.

In 2007, both Microsoft and Apple introduced the foundation of tomorrow’s computers — interfaces that featured multi-touch, physics and gestures (MPG).

Microsoft introduced MPG on big computers because the machine could handle it. Apple introduced MPG on tiny computers because the user could handle it.

Beyond the interface, the iPhone ushered in the App Store idea and other innovations that would start on the phone, move up the food chain to tablets and eventually to the desktop. OS X Lion, for example, has a touch-like interface, multi-touch gestures and other elements first introduced on the iPhone. The next major generation of iMacs, of course, will be touch-screen devices, either optionally or exclusively.

This is a brilliant strategy, and I’ll tell you why: People have lower expectations on phones, and are willing to make sacrifices for the sake of mobility.

One controversial aspect of the all-screen cell phone — well, it used to be controversial — is the idea of using an on-screen keyboard instead of a physical one. Had Apple introduced this first on, say, a MacBook, replacing the lower half of the clamshell with a touch-screen virtual keyboard, nobody would have bought it. Apple would have been criticized, and the idea of screen-based keyboards would have been set back by a decade.

Instead, Apple did it on the smallest computer — the iPhone. The proposition was that, yes, the keyboard is harder to type on. But in exchange for that sacrifice, we’ll give you a much bigger screen than older generations of phones, without the bulkiness of slide-out keyboard phones.

There was some grumbling, but eventually we all accepted the idea of typing on screens. When on-screen keyboards showed up on iPads, the complaints were fewer.

Apple mainstreamed on-screen touch keyboards by introducing them on phones first.

Still, nobody likes the idea of using only on-screen keyboards on tablets, clamshell laptops and desktops. In fact, that appears to be the main objection to the idea of big-screen touch-based desktop computing.

What’s missing from this analysis is that the touch-screen keyboards of tomorrow will be supplemented by other technologies that both improve the experience of typing on them and reduce the need to type in the first place.

In fact, it will be possible to do all your work without typing at all.

Here’s what’s truly exciting: These supplemental technologies were introduced in the iPhone 4s and in iOS 5. Now that they exist, Apple will make them increasingly sophisticated until they become part of the core interface of the iMac of the future.

The technologies are: 1) better keyboards; 2) artificial intelligence; and 3) haptics.

1. Better keyboards

Physical keyboards are great. The problem is that they are untethered from Moore’s Law — they don’t get better over time. In fact, the best keyboard ever sold, according to many, became available in the 1980s — the IBM Model M. I personally like the Apple-style flat-key keyboards, but even these represent only a minor improvement on the old-style keyboards.

Physical keyboards are practically the only element of the desktop computer that doesn’t really improve anymore.

But on iOS and other platforms with on-screen keyboards, the keyboard can improve constantly because it’s software.

You see minor improvements to the keyboard experience in the iPhone 4s and iOS 5.

For example, you’ll find a new feature called Shortcuts, which enables you to add your own custom auto-correct words and phrases.

Just find the Keyboard setting under General in Settings. Add your word or phrase, along with the shortcut that triggers it. For example, you could tell the phone to suggest “Thanks for everything! Talk to you soon.” whenever you type “thnx.”

This can save you a lot of typing.
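
To make the mechanism concrete, here is a minimal Swift sketch of the idea behind text shortcuts: a lookup table maps short triggers to full phrases, and the keyboard swaps in the expansion as you type. This is an illustration of the concept, not Apple’s implementation, and the shortcuts shown are just examples.

```swift
import Foundation

// Conceptual sketch of shortcut expansion (not Apple's implementation).
// A lookup table maps short triggers to full phrases.
let shortcuts: [String: String] = [
    "thnx": "Thanks for everything! Talk to you soon.",
    "omw": "On my way!"
]

// Return the expansion if one exists, otherwise the word unchanged.
func expand(_ typedWord: String) -> String {
    return shortcuts[typedWord.lowercased()] ?? typedWord
}

print(expand("thnx"))  // Thanks for everything! Talk to you soon.
print(expand("hello")) // hello (no shortcut defined)
```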

Although new to the iPhone, this capability is not all that exciting or ground-breaking. However, it shows one very important aspect of on-screen keyboards: Constantly improving auto-correct can greatly reduce the amount of typing you do.

In the future, auto-correct will become both more “auto” and also more “correct.” Eventually, you’ll have to actually type only a fraction of what is written. Auto-correct will do the rest.

On the iPad, iOS 5 offers a new keyboard trick: The keyboard can split in two, so that when typing in landscape mode, you can hold the iPad with both hands and type with your thumbs.

Software keyboards on desktops will be radically flexible, configurable and customizable, which will make most people actually prefer the on-screen variety.

2. Artificial intelligence

As millions of users are discovering this weekend, Siri’s artificial intelligence spares you a lot of screen touching. Instead of typing a text, you just say something like: “Text my wife and tell her I’ll be late.” Instead of replying to email, just talk. Instead of writing a long note, just dictate it. Instead of typing a long URL to find, say, my online bio, just say: “Open Elgan dot com.”

If you can imagine an advanced version of Siri running on a full-powered desktop, you can see how you’d never really have to type anything if you didn’t want to.
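
To get a feel for the underlying idea, here is a toy Swift sketch that routes a transcribed phrase to an action. This is not how Siri actually works internally (real assistants use far more sophisticated language understanding), and the phrases and actions here are hypothetical.

```swift
import Foundation

// Toy illustration of routing a transcribed phrase to an action.
// Not Siri's actual design; the commands below are hypothetical.
enum Action {
    case sendText(to: String, body: String)
    case openURL(String)
    case unknown
}

func interpret(_ phrase: String) -> Action {
    let lower = phrase.lowercased()
    if lower.hasPrefix("text my wife") {
        // A real assistant would look up the contact and compose the message.
        return .sendText(to: "wife", body: "I'll be late.")
    }
    if lower.hasPrefix("open ") {
        // "Open Elgan dot com" -> "elgan.com"
        let site = String(lower.dropFirst("open ".count))
            .replacingOccurrences(of: " dot ", with: ".")
        return .openURL("https://\(site)")
    }
    return .unknown
}

print(interpret("Open Elgan dot com")) // openURL("https://elgan.com")
```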

3. Haptics

The main reason people don’t like on-screen keyboards is that they can’t feel the keys.

Physical keyboards let you use your sense of touch to learn where the keys are, and they give you feedback on whether or not a key was actually pressed.

Surprisingly, on-screen keyboards can do this, too, using haptics.

Of course, all phones have haptics. When you turn the ringer off, your phone will buzz instead of ring. And that’s the most rudimentary kind of haptic feedback.

However, anyone who’s played “Call of Duty” on Microsoft’s Xbox 360 knows that an enormous amount of hyper-realistic feedback can be achieved through haptics. Xbox controllers enable you to experience all manner of violence, from air strikes to bullets to grenades. It’s all quite convincing.

The haptic touch interfaces of the future will buzz and vibrate depending on where you touch. They’ll re-wire your brain and enable you to “touch type” with confidence. You’ll “feel” the edges of the keys, and other interface elements. They’ll “click” when you type.
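
As a rough sketch of what that could look like in code, the snippet below uses UIKit’s UIImpactFeedbackGenerator to make a hypothetical on-screen key “click” the moment a finger lands on it. That API arrived years after iOS 5 (it requires iOS 10 or later), so treat this purely as an illustration of per-touch haptic feedback, not something iOS 5 could do.

```swift
import UIKit

// Illustration only: a hypothetical on-screen key that "clicks" under
// your finger. UIImpactFeedbackGenerator requires iOS 10 or later,
// long after iOS 5, but it shows the kind of per-touch haptic
// feedback the article anticipates.
class KeyView: UIControl {
    private let haptic = UIImpactFeedbackGenerator(style: .light)

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        haptic.prepare()        // warm up the haptic hardware to cut latency
        haptic.impactOccurred() // a short, light "click" under the finger
    }
}
```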

Apple introduced improved haptics in the iPhone 4s and iOS 5, albeit in a typically rudimentary way.

iOS 5 offers custom haptic alerts that can be associated with individual people. So when your sound is turned off, vibrations will not only tell you that you’re getting a call or text, but also exactly who is contacting you.

To use iOS 5’s custom haptics feature, select “Accessibility” under General in Settings, then turn “Custom Vibrations” on.

Then, go to Contacts, choose a contact and touch Edit.

Tap on the word “Default” next to “Vibration.” You’ll be given the option to choose one of the canned patterns, which include “Alert,” “Heartbeat,” “Rapid,” “S.O.S.” (the actual Morse code) and “Symphony,” which vibrates the opening of Beethoven’s Fifth Symphony.

The coolest part is a “Custom” option. By choosing that, you’ll be given a fun interface for tapping out your own vibration patterns. When you tap, ripples fly out like on the surface of a pond.

Note that one limitation of this feature is that it won’t tell you if you’ve got a text or a call, only who is trying to contact you.
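
For the curious, here is a rough Swift sketch of the idea behind tapping out your own pattern: record the timing of the taps, then replay them by triggering the vibration motor at the same offsets. This is not Apple’s implementation; AudioServicesPlaySystemSound with kSystemSoundID_Vibrate is simply the long-standing public way for an app to fire a single vibration, and the pattern below is made up.

```swift
import AudioToolbox
import Foundation

// Conceptual sketch (not Apple's implementation) of a tap-recorded
// vibration pattern: store when each tap happened, then replay the
// pattern by firing a vibration at each recorded offset.
struct VibrationPattern {
    var tapOffsets: [TimeInterval] = []   // seconds from pattern start

    mutating func record(tapAt offset: TimeInterval) {
        tapOffsets.append(offset)
    }

    func play() {
        for offset in tapOffsets {
            DispatchQueue.main.asyncAfter(deadline: .now() + offset) {
                // kSystemSoundID_Vibrate triggers a single buzz.
                AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
            }
        }
    }
}

// A made-up, heartbeat-like pattern: two quick buzzes, a pause, one more.
var pattern = VibrationPattern()
pattern.record(tapAt: 0.0)
pattern.record(tapAt: 0.3)
pattern.record(tapAt: 1.0)
pattern.play()
```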

Over time, all these bare-bones technologies — better keyboards, artificial intelligence and haptics — will become ever more sophisticated as the whole iOS 5-style interface moves all the way up the Apple food chain to the iMac.

Within five years, your Mac will be a giant iPhone, set at a drafting-table angle. You’ll be able to use a physical keyboard if you want to, but you probably won’t.

And the reason is that typing itself will simultaneously become better suited to screens and less necessary. The seeds planted in the iPhone 4s and iOS 5 will grow into mighty trees, enabling a futuristic computing experience with less pointing, clicking and typing and more touching, swiping and talking.

 

Image courtesy of Tuvie.
