How Apple Made the World Safe for the Future of Keyboards

By IronMan_Keyboard

It’s hard to recall now, but the number-one complaint about the iPhone when it first came out was the on-screen keyboard.

Engadget’s Ryan Block asked: “Will the iPhone be undone by its keyboard?” People talked about how on-screen typing would destroy the iPhone in the same way that handwriting recognition helped kill the Newton.

Even more incredibly, one of the main criticisms of the iPad when it first came out was how visible finger smudges are on the screen once the power is off.

These concerns seem quaint now, textbook examples of the limited human-ape mind trying to grapple with novelty. It’s like people complaining about their new “motor car” a hundred years ago by saying the infernal contraption fails to slow down when they say, “whoa, Nellie!” and won’t speed up when they whip the fender with a riding crop. “It’ll never catch on!”

Many annoying tech pundits (including and especially Yours Truly) bitched and moaned about Apple’s global ban on the sale of third-party physical keyboards and its refusal to create one of its own.

I believe Apple deliberately used its red-hot iPhone product to force the world to accept and learn to appreciate on-screen keyboards, and to break us of our physical keyboard habit. When Apple released the iPad a year ago, it was usable with two Apple keyboards (the standard Bluetooth keyboard and a new cradle keyboard). But no matter. The on-screen keyboard idea had already been accepted by a critical mass of users.

Despite widespread acceptance, people are still divided on whether on-screen keyboards are good or bad, and most still prefer a physical keyboard. But let’s look at the big picture.


In the PC world, old-school purists say the IBM Model M keyboard, released in 1985, is the best keyboard ever built. Others prefer the original Macintosh keyboard, which came out the year before. Personally, I like the flat, square keyboards found on Apple systems and Sony laptops nowadays. But no matter what your preference, you have to admit that keyboards haven’t come very far in the past quarter-century.

Think about screen resolution, processor performance and other aspects of personal computers. Every other part of the PC has improved by orders of magnitude, while the average keyboard available for both desktop and laptop systems has actually declined in quality.

The evolution of keyboards essentially stopped decades ago, having reached the limits of what’s possible with springs, plastic, wire and the mechanics of the human hand.

But on systems that use on-screen, software-based keyboards, evolution has begun again.

One exciting option with software keyboards is the ability to make subtle changes depending on the application. One limited version of this capability is found on the iPad. When you select the address bar in the Safari browser, you get a helpful .com button (pressing it gives you all four characters). In the future, we’re likely to see all kinds of buttons coming and going depending on what we’re trying to do – and even complete changes to the keyboard layout.
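
For the curious, this kind of context switching is something iOS already exposes to developers through UIKit’s keyboardType property. Here’s a minimal sketch in Swift (the field names are made up for illustration): asking for the URL keyboard is what summons the .com-friendly layout, while a plain text field keeps the ordinary one.

```swift
import UIKit

// Hypothetical fields for illustration: one for a web address, one for notes.
let addressField = UITextField()
let notesField = UITextField()

// Requesting the URL keyboard is what makes the ".com"-friendly layout appear,
// just as it does in Safari's address bar. The system swaps layouts automatically.
addressField.keyboardType = .URL
addressField.autocorrectionType = .no

// A plain text field keeps the ordinary alphabetic keyboard with autocorrect on.
notesField.keyboardType = .default
notesField.autocorrectionType = .yes
```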

Another useful direction is the growing availability of alternative keyboards. One small example shipped today from a developer named Tal Shumski. His 99-cent Keyboard 2 app gives you a keyboard that takes over the entire iPhone screen. The keys are bigger and more widely spaced, which improves typing speed and accuracy. The text of what you’re typing is overlaid on top. After you’ve typed what you want, you pick the application to send it to (Facebook, e-mail, Google Search, etc.).

You may like Keyboard 2, or you may not. The point is: Now that keyboards are software, we’ve all got a universe of personal choice in how our keyboards look, feel and function.

These small advantages are nice, but the real power of on-screen keyboards is yet to come.

I don’t think there’s any question that, for most people, on-screen keyboards are slower than physical ones, and that may always be true — sort of.

The power of on-screen keyboard evolution is that the giant-iPad desktop computers of the future will combine custom keyboards, gestures, haptics and other input (voice, in-air gestures, etc.) to speed up not only typing but all aspects of computing. So while straight-up typing may always be slower on a screen, overall computing may be faster because the keyboard will no longer be a physical, mechanical contraption separated from the location of other input.

One immediate improvement is that the typing we now do with keyboards and the pointing and clicking we now do with mice or track pads will all take place in the same space, and become one high-speed set of gestures. Both Apple and Microsoft have multiple virtual keyboard patents, which revolve around the computer being aware of user hand position and other contextual cues. Microsoft, for example, holds a patent that places the keyboard anywhere your fingers are, so the keyboard is always in the right place.

Another point to remember is that on-screen keyboards work better with auto-correct systems than physical keyboards do, because the auto-correct system can be integrated with the typing movements themselves. Auto-correct can more than compensate for the errors of an on-screen keyboard.
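
To see why, here’s a rough, hypothetical sketch in Swift (not any shipping keyboard’s actual algorithm) of how a software keyboard can fold raw touch positions into correction. A physical keyboard only reports which key was struck; a software keyboard also knows where each finger landed, so it can score candidate words by how close the touches fell to each candidate’s keys.

```swift
import CoreGraphics

// Hypothetical key centers for a few top-row keys (points on screen).
let keyCenters: [Character: CGPoint] = [
    "q": CGPoint(x: 20, y: 40), "w": CGPoint(x: 60, y: 40),
    "e": CGPoint(x: 100, y: 40), "r": CGPoint(x: 140, y: 40),
    "t": CGPoint(x: 180, y: 40), "y": CGPoint(x: 220, y: 40),
]

// Score a candidate word against the raw touch points: smaller is better.
// Near-misses cost a little; wild misses cost a lot.
func score(candidate: String, touches: [CGPoint]) -> CGFloat {
    guard candidate.count == touches.count else { return .greatestFiniteMagnitude }
    var total: CGFloat = 0
    for (char, touch) in zip(candidate, touches) {
        guard let center = keyCenters[char] else { return .greatestFiniteMagnitude }
        let dx = center.x - touch.x
        let dy = center.y - touch.y
        total += (dx * dx + dy * dy).squareRoot()
    }
    return total
}

// Pick whichever dictionary word best explains the touch sequence.
func bestCorrection(from dictionary: [String], touches: [CGPoint]) -> String? {
    dictionary.min { score(candidate: $0, touches: touches) < score(candidate: $1, touches: touches) }
}

// Example: three touches that wobble around "w", "e" and "t".
let touches = [CGPoint(x: 75, y: 42), CGPoint(x: 95, y: 38), CGPoint(x: 178, y: 41)]
let guess = bestCorrection(from: ["wet", "ret", "yet"], touches: touches)  // "wet"
```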

The biggest problem with on-screen keyboards, of course, is the lack of tactile feedback. Users are shy about attempting to touch-type on an iPad, for example, because they can’t feel where the keys are. (You should get in the habit of iPad touch typing anyway if you want to type really fast.) But all that will change soon. We’re currently on the brink of a revolution in haptic feedback, which computers with on-screen keyboards will use as a substitute for physical feedback.

If you think haptics are just zaps and generic buzzing, think again. The new haptics are rich, detailed and convincing. I’ve demoed systems so amazing they actually simulate the texture of moving objects.

One demo by a company called Immersion showed how all the feelings of a pinball game could be programmed into a software game. You actually feel the metal of the ball and the swing of the paddles.

Touch-screen devices of the future will simulate the feel of running your fingers across physical keys. Specific “anchor keys” will feel different from the others, so you’ll be able to run your fingers across the screen and feel where your hands should go.

“High definition” touch-screen haptics have arrived on the Android platform before iOS; Immersion recently released a haptics SDK for Android that will let handset makers and app developers advance the use of haptics. Nobody knows when Apple will support rich haptics – iPhone 5, maybe?

The thing to take away from all this is that it hardly matters whether today’s on-screen keyboards are better or worse than physical ones. The important thing is that keyboards are software now, and therefore evolving rapidly along with other software-based technologies. It’s only a matter of time before all aspects of the typing experience are better on screen than on physical keyboards.

Our kids and grandkids will look back at keyboards from the 80s, 90s and 2000s the same way the iPod generation looks at Depression-era radio.

But this transformation would not have begun when it did without Apple’s visionary strategy of forcing the world to accept on-screen keyboards with the iPhone in 2007. Apple crammed on-screen keyboards down our throats, and I complained about it. But I was wrong. Training the world to accept touch screens was the beginning of the software keyboard revolution.

(Photo courtesy of Paramount Pictures and Iron Man II)
