Future iPhones Will Have Natural Language UIs, Says Bill Atkinson [Macworld 2011]


SAN FRANCISCO, MACWORLD 2011 — As one of the key architects of the original Macintosh, programming legend Bill Atkinson is in a good position to make sensible predictions about the future of computer interfaces.

And he says the future of computers is smartphones with natural language interfaces. We won’t be tapping on our iPhones’ screens; we’ll be talking to them in natural language. And they’ll be talking back.

We’ll wear a tiny video-equipped earpiece that will see, hear and record everything we do. On the other end, in the cloud, will be a virtual personal assistant that will act as a cognitive prosthesis.

“I think what’s going to happen will be driven by the mobile market,” said Atkinson during his talk here at Macworld on future UIs. “The UI for mobile will never be keyboard and mouse.”

Instead, we’ll have a phone in our pocket or purse and will wear an earpiece with a microphone, speaker and camera.

The earpiece will let us talk to a virtual personal assistant running in the cloud. We’ll be in constant conversation with our virtual PAs, which will keep us well informed because they’re jacked into the net. The virtual personal assistant will see, hear and record everything we see and hear.

“You point at a building, the camera will see where you’re pointing, and say, ‘That’s Bank of America.’”

Personal memories will be stored and recalled. Every conversation we’ve had, every place we’ve ever visited, will be stored in the cloud.

“A memory prosthesis will be the killer app of these virtual personal assistants,” Atkinson said.

The technology required is deep natural language understanding, not just speech recognition. And natural language understanding depends on knowledge of both language and how the world works.

Atkinson is completely convinced that natural language is the computer UI of the future.

“I can tell you for sure, they’re going to happen,” said Atkinson. “I can’t tell you when.”

Nonetheless, Atkinson doesn’t think it will be long — certainly within 10 years. The best evidence, he says, is an upcoming episode of Jeopardy featuring an IBM supercomputer.

On February 14, Watson, the computer built by IBM’s DeepQA Project, will compete on the TV quiz show Jeopardy, listening to and answering questions in natural language.

Atkinson thinks it will be a watershed moment.

“When we see computers interacting with a natural language interface, people are going to want it,” he said.

“I’d ask you all to watch Jeopardy on February 14,” he said. “It may be a momentous occasion.”