How This Guy Is Making Your iPhone Virtually Human

Today, your iPhone is a gadget, a mere consumer appliance. But your future iPhone will become increasingly human. You’ll have conversations with it. The phone will make decisions, prioritize the information it presents to you, and take action on your behalf — rescheduling meetings, buying movie tickets, making reservations and much more.

In short, your iPhone is evolving into a personal assistant that thinks, learns and acts. And it’s all happening sooner than you think, thanks to the guy pictured above.

The history of interface design can be oversimplified as the story of applying ever-increasing compute power so that computers, rather than people, do the work of compatibility.

In the beginning, humans did all the work because computers weren’t smart enough to handle human language. Code had to be punched in. Output had to be translated.

When computers became more powerful, they could interact with users in something closer to human language via a command-line interface. Later, PCs and workstations were re-imagined around icons, windows, trash cans, dragging and dropping and all the rest.

As Moore’s law continues to deliver ever more powerful processors at lower cost, that available power can be applied to the creation of even more human-compatible interfaces that include touch, “physics,” gestures and more.

Ultimately, however, human beings are hard-wired to communicate with other people, not computers. And that’s why the direction of interface design always points toward the creation of artificial humans.

There are four elements to a machine that can function like a person: 1) speech; 2) decision-making algorithms; 3) data; and 4) “agency,” the ability to act in the world on your behalf.
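
To make that concrete, here is a rough, purely illustrative Python sketch of how those four elements might be wired together. Every name and interface below is invented for illustration; none of it reflects Apple’s actual design.

    from dataclasses import dataclass

    # Hypothetical building blocks of a "virtual human" assistant.
    # None of this corresponds to real Apple or Siri APIs.

    @dataclass
    class Intent:
        action: str    # e.g. "book_restaurant"
        details: dict  # e.g. {"cuisine": "Mexican", "quality": "good"}

    class Assistant:
        def __init__(self, recognize_speech, decide, user_data, services):
            self.recognize_speech = recognize_speech  # 1) speech
            self.decide = decide                      # 2) decision-making algorithms
            self.user_data = user_data                # 3) data about you
            self.services = services                  # 4) agency: things that can act for you

        def handle(self, audio):
            text = self.recognize_speech(audio)         # speech to text
            intent = self.decide(text, self.user_data)  # figure out what you want
            act = self.services[intent.action]          # pick a service that can do it
            return act(intent.details)                  # act in the world on your behalf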

Apple already has rudimentary technology and partnerships to achieve all this. The initial launch of the iPhone 5 will likely offer a small sampling of this technology, and subsequent releases of iOS will dribble out more. Eventually, your iPhone will function as a virtual human that you can talk to, that suggests things to you, that you can send running on errands.

When you get your shiny new iPhone 5, you’ll notice a new feature called the Assistant. This feature won’t be an app, but a broad capability of the phone that combines speech, decision-making, data and agency to simulate a virtual human that lives in your pocket.

For speech, Apple has maintained a long-standing partnership with Nuance, the leading speech-recognition company. A version of iOS 5 with Nuance Dictation has reportedly been sent out to carriers for testing.

For decision-making algorithms, Apple can rely on the amazing technology it acquired in April 2010, when it bought Siri, a company that created a personal-assistant application that you talk to, and it figures out what you want.

Many iPhone users don’t know this, but Siri is still available free in the App Store. You should be using it every day. You talk to Siri in your own words, and Siri’s algorithms figure out what you mean. It sorts through a large number of possible responses and chooses the best one with uncanny accuracy.
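
Conceptually, that “choose the best response” step is a ranking problem: generate candidate interpretations of what you said, score each one, and pick the winner. Here is a minimal sketch; the weights, scores and field names are invented for illustration, and Siri’s real algorithms are far more sophisticated.

    # Toy ranking step: pick the candidate interpretation with the best combined score.
    # The weights and the "language_score" field are made up for illustration.

    def best_interpretation(candidates, context):
        def score(candidate):
            fit = candidate["language_score"]  # how well the words match this reading
            familiarity = context.get("frequency", {}).get(candidate.get("entity"), 0.0)
            return 0.7 * fit + 0.3 * familiarity
        return max(candidates, key=score)

    # Example: "call Mom" beats "call Tom" because Mom is a frequent contact.
    candidates = [
        {"entity": "Mom", "language_score": 0.8},
        {"entity": "Tom", "language_score": 0.9},
    ]
    print(best_interpretation(candidates, {"frequency": {"Mom": 1.0}}))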

For data, Apple can use your own behavior with the phone, which can inform the system where you are, where you’ve been, what your preferences are, what your schedule is, who you know, and much more. Solid rumors suggest deep integration of the Assistant with Calendar, Contacts, E-mail and more.

And for agency, Apple can rely on the technology and partnerships developed under the Siri project. The existing Siri app leverages partnerships with OpenTable, CitySearch, Yelp, YahooLocal, ReserveTravel, Localeze, Eventful, StubHub, LiveKick, MovieTickets, RottenTomatoes, True Knowledge, Bing Answers and Wolfram Alpha.

These services both provide judgement (you can say, “find me a GOOD Mexican restaurant”) and agency (you can say, “make me a reservation”). Siri is designed to actually book you a hotel reservation, buy movie or concert tickets and much more.
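
Here is roughly what that split between judgement and agency might look like in code. The search_service and booking_service below are invented placeholders standing in for Yelp- or OpenTable-style partners, not their real APIs.

    # "Judgment" vs. "agency", sketched with invented helpers.
    # search_service and booking_service are placeholders, not real partner APIs.

    def find_good_restaurant(cuisine, city, search_service):
        results = search_service(cuisine=cuisine, city=city)
        # Judgment: rank the candidates and return the best match.
        return max(results, key=lambda r: (r["rating"], r["review_count"]))

    def make_reservation(restaurant, party_size, time, booking_service):
        # Agency: actually act on the user's behalf through a partner service.
        return booking_service(restaurant_id=restaurant["id"],
                               party_size=party_size, time=time)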

There is absolutely no question that Apple is getting into the virtual human racket, starting with the iPhone 5 and iOS 5. But Apple is a consumer electronics company. What the hell does Apple know about artificial intelligence?

The answer may shock you.

The iOS ‘Assistant’ is US Military Technology

The most expensive, ambitious and far-reaching attempt to create a virtual human assistant was initiated in 2003 by the Pentagon’s research arm, DARPA (the organization that brought us the Internet, GPS and other deadly weapons).

The project was called CALO, for “Cognitive Assistant that Learns and Organizes,” and involved some 300 of the world’s top researchers.

CALO’s mission, according to Wikipedia, was to build “a new generation of cognitive assistants that can reason, learn from experience, be told what to do, explain what they are doing, reflect on their experience, and respond robustly to surprise.”

All this research was orchestrated by a Silicon Valley research institute, SRI International (formerly the Stanford Research Institute). The man in charge of the whole project was a brilliant polymath who worked as a senior scientist and co-director of the Computer Human Interaction Center at SRI: Adam Cheyer (pictured above).

Cheyer is not only one of the world’s most renowned artificial intelligence scientists, he’s also one of the leading experts in distributed computing, intelligent agents and advanced user interfaces.

Cheyer now works as a director of engineering for Apple’s iPhone group. He also manages the awesome team he assembled for Siri. Together, these brilliant minds are inventing the future of cell phone interaction.

The iPhone 6 Virtual Human

Here’s how your iPhone 6 will probably work. When you want to make a call, search the web or send an e-mail, you’ll just hold the phone to your ear and say a command.

The phone will recognize your voice, which both authenticates your identity and enables the phone to cater specifically to your needs, your data and your vocabulary.

When you want to do the town, you’ll say things like “make me a reservation at a good Italian restaurant” and “buy me tickets for a good movie for after dinner.” The phone will do your bidding.

Before meetings, your phone will alert you. You’ll listen, and the phone will give you a briefing about the people you’re meeting, with history and personal information so you’re prepared. If you want to reschedule the meeting, the phone software will interact via e-mail with the people you’re meeting with, finding a new time when everyone is available and adding the new meeting time to your calendar.
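
That rescheduling step is, at its core, an interval-intersection problem: find a slot that overlaps nobody’s existing commitments. Here is a minimal sketch, assuming the assistant already knows each attendee’s busy times; all names and data are invented.

    from datetime import datetime, timedelta

    # Find the first slot of a given length that is free for every attendee,
    # given each attendee's list of (start, end) busy intervals.

    def first_common_free_slot(busy_by_person, day_start, day_end, duration):
        slot = day_start
        while slot + duration <= day_end:
            conflict = None
            for busy in busy_by_person.values():
                for start, end in busy:
                    if start < slot + duration and slot < end:  # overlaps this meeting
                        conflict = end
                        break
                if conflict:
                    break
            if conflict is None:
                return slot      # everyone is free here
            slot = conflict      # skip past the conflicting meeting and try again
        return None              # no common slot left in the day

    # Example: one attendee is busy 9-10, another 10-11, so 11:00 is the answer.
    day = datetime(2011, 10, 4)
    busy = {
        "you":       [(day.replace(hour=9),  day.replace(hour=10))],
        "colleague": [(day.replace(hour=10), day.replace(hour=11))],
    }
    print(first_common_free_slot(busy, day.replace(hour=9), day.replace(hour=17),
                                 timedelta(hours=1)))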

Your iPhone will make suggestions about gifts, books, music and upcoming concerts. It will learn from your actions, getting better over time.

In short, your phone will evolve from a gadget to a virtual human, one you can talk to and that talks back. It will anticipate your needs, make decisions, and act on your behalf.

That’s the iPhone 6. Exactly how much of this virtual assistant technology will show up in the iPhone 5 is still essentially unknown. But what we do know is that Apple is definitely headed in this direction.

Are you ready to carry a virtual human in your pocket?

(Picture of Adam Cheyer courtesy of Tom Gruber.)

Comments
  • Fernando Teixeira

    i want it

  • bigmanbigdreams

    beautifully written…

  • atimoshenko

    Could not disagree more. We communicate with each other the way we do (talking combined with visual cues) because that is the best we can do, given our biological constraints on both transmitting and taking in information. If people were optimised for communication with other people, we would all be telepaths. Moreover, there is nothing ‘natural’ about language either – it is a convention that takes us ~15 years of constant practice to master. If you want an example, look at the touch-screen ordering devices being rolled out by McDonald’s and other fast food chains. They make the process of ordering drastically more efficient than having to chat with a human at the counter.

    As a result, trying to create a virtual human is a terrible idea. We should not be using new capabilities to try to recreate old ways of interaction. Speaking with a computer the same way one would speak with a human is about as sensible as writing on a computer the same way one would write on a piece of paper. Instead, we should be deploying new capabilities to invent NEW methods of interaction.

  • iMan

    One of your best pieces Mike!

  • Site7000

    You are missing the fact that it’s humans who need the language interface. Touch screens are useful in some situations, but not most. Unless you can upgrade humans to have more efficient communications (and that ultimately requires it be hands free; hence your reference to telepathy), you are stuck with training machines to communicate the way humans do. 

  • Mike Sauer

    Sounds exciting. Except I feel like..unless it’s very accurate, it would probably become a nuisance at times. Nuance. Nuisance. …:p just playing.

  • Tash Wahid

    Brave new world.

  • joelengel

    I don’t know what native speech-recognition software Google is using, but I’ve been communicating with it verbally with near flawless performance in the two years I’ve had my Droid.  It works far better–that is, with higher accuracy–than the Nuance software on my iMac.  I use an app called Voice Search to get directions, send a text or email, open other apps, find something on Google, etc.  In fact, the voice recognition is the reason I didn’t move to an iPhone after Verizon opened up. 

  • cliqsquad

    bring. it. on

  • Chris

    just hope this works in all languages. German computer voices usually suck. I don’t know how it is with speech recognition

  • Jreed1235

    Skynet… 

  • prof_peabody

    This is mostly poppycock.  

    For any machine to be truly useful as an active “assistant” it has to achieve Artificial Intelligence or “AI”.  AI, unfortunately, is actually quite impossible.  

    There has *never* been a machine that “thinks” or even one that comes close to it.  Most of the greatest minds of our time (not those pushing AI of course), believe that it’s essentially *impossible* for a machine to *ever* be aware or “think.”  No scientist has ever demonstrated even a coherent *theory* for making a machine “think.”  

    It just ain’t gonna happen (despite all the sci-fi movies we know and love).

  • prodrive1

    One of your best 

  • Site7000

    Absolutely astounding string of absolutes, Professor. First you claim “any” active assistant “has to achieve AI.” You then define AI as requiring self-awareness or the nebulous term “thinking.” Then you say it’s absolutely impossible to have a machine become self-aware, so a computer assistant “ain’t gonna happen.” 

    I call poppycock on all of those. They aren’t absolutes, they aren’t necessarily linked and you’ve overlooked the process of incremental progress by defining the task at its highest possible level. Incremental progress can provide benefits to humans at every step of the way. 

  • Flu Guy

    and you Mr Peabody… will still be riding atop your Brontosaurus when it happens :)

  • Mike Elgan

    Poppycock?

  • BirdMan9

    You are totally right

  • prof_peabody

    I didn’t say that the assistant won’t be useful.  The one that will be in iOS 5 will be useful.  The word I emphasised was “active,” as in actively working for you, making decisions etc. as the article implied at times.

    I just said that it won’t be intelligent and thus function in the same way as a real assistant would function.  You won’t be able to talk to it and have it understand anything in the sense that the article implies.  If it isn’t thinking, then it won’t really be able to understand you.

  • prof_peabody

    Sounds cool!  

    (but it’s Prof. Peabody not “Mr.”)

  • Jakob Staune Bakmann

    Once people said we never would be able to fly. Everything changes, especially technology.

  • Len Williams

    I’m holding out for a full-blown human interface like the computer on Star Trek. With enough processing power and AI algorithms in place, it should be possible in a few years. I’m very interested in seeing what’s coming in the iPhone 5. The iPhone has been lacking in voice command tech for years, so this will be a welcome bit of technology. My old un-smart cell phone from Motorola of nearly 10 years ago had a voice dial-up feature where I could say “Call Bob Jones” and I’ve been wondering why Apple hasn’t included this as a standard feature on the iPhone since day one. Hopefully the iPhone 5 will have at least this rudimentary voice input.

  • Brandon Chang

    voice commands are not gonna take flight imo , like whoever on the bus or train that starts talking the phone will kinda look silly even at home id feel like im talking to myself and theres always the issue of background noise 

  • hosinuari

    Wow, a lot of discussion going on, some very interesting
    It would be nice if my iPhone would be able to recognize when receiving a call that I am in a meeting and intercept the call
    It would be great if, when I am on the phone and it recognizes that I am making an appointment, it opened my iCal and started listening, providing the date etc
    Booking a restaurant no thanks booking tickets no thanks
    It would require my credit card credentials which I would not trust to my iPhone assistant
    Who will fix the mistakes it makes and pay for them
    It will get better but not this fast I believe/ hope
    It will suck and Apple will lose ground.

  • Stan Zimmerman

    The convergence draws closer. Will iPhone 7 tell me what I need to do? Considering the near-infinite capacity of “the cloud,” will the time come when my iPhone wakes me, briefs me on my upcoming day, gives me my tasks, alerts me to my enemies, sorts my e-mail, screens my calls, and in all respects holds my hand? And makes decisions for me? Ray K may have misjudged. Hand-helds may be the ones to take over the world, backed by the power of the cloud. I can’t wait. 

  • Jdsonice

    Excellent article. Very informative. Needless to say – Yes I want it. Maybe someday I will wear a small thing in my ear and I won’t even have to lift the phone. Or maybe the phone will be so tiny it will fit in my ear :-)

  • Len Williams

    Regarding the AI making hotel and restaurant reservations: As long as the AI got your attention, then gave you the details on the cost and options, and wouldn’t proceed until it got your OK, it could be secure and a real asset–like having a personal secretary who did all the research and then let you make the final decision. Now THAT would be something I could go for.

  • BrianVoll

    Uhmm… This has been in the iPhone for a few years now. Hold down the home button for a few seconds.

  • ??????? ???????

    F**k Star Trek! HAL from “2001: A Space Odyssey” is the real thing!

  • Daniel Debner

    So Apple’s going to be first to get there. That’s what you’re saying? The universe only works how we know it works. There are still things to figure out. :)

  • Sebastian Herzog

    Someone’s a little high on himself

  • Felfac

    I think that we’re going to see a lot of it in the iPhone 5

  • atimoshenko

    Since humans can “think”, machines will eventually be able to. We are not magic – just a collection of cells. There is no reason for anything nature can do to be inherently inimitable.

  • Ronald Stepp

    Seems kind of silly to think our phone will do all this, why limit such a device to our phone, sort of like a caveman thinking his club will do everything in the future. “Someday, the club will hunt all our food, grow all our crops, change the weather and heal our pains.”

    Sure I think we’ll have technology in the future that we can interact with directly via voice, but saying it will have to be in the phone?  Really?  Methinks, if the only neat gadget you talk about is a phone, then the phone is the only kind of tool you can imagine.

    As for Agency, the phone doesn’t really do anything more than looking at existing information databases and picking entries from the database based on whatever key you are specifying, phone number, address, etc.

    “These services both provide judgement (you can say, “find me a GOOD
    Mexican restaurant”) and agency (you can say, “make me a reservation.”)
    Siri is designed to actually book you a hotel reservation, buy movie or
    concert tickets and much more.”

    The phone’s judgement isn’t any more “intelligent” than a chess program that picks moves based on an assigned value for the pieces on the board.  Taking the King is a GOOD move, losing your own is a BAD move.  The Agency involved is just a series of lines in the website or service that do predetermined things.  I’m not saying that’s bad, just that you have to remember that the phone, like any computer we know how to build, is a bunch of transistors, not some kind of disembodied intelligent genie.

    I can’t wait for the day when we CAN interact with what SEEMS to be another human who lives to serve our every request, but this article is making quite a few, in my opinion, weak assumptions about the direction the technology is going in.  And I think it’s going to be more like the iPhone 120 at the rate we are going, remember way back when the PCs were just coming out, people were making all these awesome predictions about how the computer would do the same things as the article above?  Flying cars?  Cities on the Moon?  Mr. Fusion(TM) from Trash?  Hell the best voice translation software on the iPhone right now is only right 90% of the time.  Sounds great until you realize that means 1 out of every 10 words is wrong.  And that’s just UNDERSTANDING the words we are saying, not COMPREHENDING the meaning of the entire sentence.  If it were “smart” enough to take an entire sentence, comprehend what the sentence meant, THEN it would be 100% accurate.

    Crossing my fingers that we do eventually get there, but I want my phone to just be ONE of the devices that is VI, virtually intelligent, not the ONLY device that is VI. The difference to me between AI and VI is that AI tricks you into thinking it is intelligent, whereas VI would mean the computer is actually designed to BE intelligent with regards to how the human brain makes decisions. It wouldn’t just be programming that would have to run routines to figure out how to “think” it would be designed with that “thinking” built into the hardware so when it receives information, audio, visual, etc, it would directly manipulate that in a virtual mind-work-space and return information that was pertinent to you. Ok Dk, Star Trek Rant mode off. Heh, I guess I’m also enthusiastic about the potential good we could get out of this down the road.

  • atimoshenko

    Not sure I see why the interface would have to be hands-free. What would we be doing with our hands if everything is hands-free? Indeed, of all of our methods of impacting the world (i.e. sending information out), manipulating things with our hands is probably the most intuitive and the most powerful/flexible. Likewise, of all of the methods of the world impacting us (i.e. receiving information), our eyesight is our greatest asset. So of all the methods of interaction, something that allows for manual manipulation and visual feedback would probably be the most effective – touchscreens are on the right track.

    Look at the power, information density, and ease of interaction of something like Google’s Flight Search. I would take it over a human travel agent any day as well. One glance at the price-by-day bar charts gives one immediate understanding of the data and the patterns behind it. How would a human agent explain all of that verbally?

    On a side note, when it comes to ‘recommendations’ I feel that ‘AI’ agents should be much more focused on negative recommendations than on positive recommendations. In other words focus on filtering out the things I definitely do not find relevant, and warn me if I try to do things that conflict with this (e.g. book a room in a hotel I rated negatively three years ago). With a well-structured, sortable, filterable menu of options we can actually very easily evaluate a very large number of them and select the best one. Indeed, by being able to see the many different options, we may well change our mind about what we would have previously thought to be best.

  • Ronald Stepp

    Gods, I hope not, I don’t want my iphone to send 5 gagillion volts through me when it malfunctions like the control panels on the Enterprise’s Bridge.  Seriously Paramount?  How much power do you have to run through a glorified LCD control screen?

  • Ronald Stepp

    I reserve the right to qualify that with “at the present time with present technology.”  People in the Dark Ages probably never thought we would walk around on the surface of that glowy thing in the sky (the moon). 

    I think if you imagine it, you can do it.  Somehow.  Eventually. It’s all a matter of physics.  I REFUSE to believe that what nature did without any planning for the end-product (our brains) over millions of years, we cannot do through deliberate design.  Birds evolved to fly and we were able to duplicate that in much less time.  Lightning made fire, and we can do that now too.  I’m optimistic that we CAN solve the problems inherent in building a device that is virtually as smart and intelligent as a human, at least as far as acting as an assistant for our daily lives.  Damn, I never post stuff this long, sorry if I went on.

  • Ronald Stepp

    Hey I hope we eventually become as advanced as the future world depicted in The Flintstones.  I really, really want a car that I can use my feet to go 100s of miles an hour without killing me from exhaustion.

  • Ronald Stepp

    Heh, I can see the IOS5 phones getting recalled because they became neurotic after constantly being “frozen” when apps are switched off.  The phones would start screaming, “NO! Don’t Freeze Me Agai…..” as we switch apps on and off or terminate processes.

  • Ronald Stepp

    Skynet is only as powerful as the people who provide its electrical power.

    8 )

  • And

    I don’t know what the iPhone of the future will do, but the current iPhone, iPad, and Mac do a good job of spell check. You should use it.

  • BrianVoll

    It’s always the un-named Lieutenant who’s unfortunately in the wrong place at the worst possible time. But, it’s good they die and not one of the main cast.

  • Your name

    User:  iPhone, put directions to my home in the map.
    iPhone:  I’m sorry, Dave. I’m afraid I can’t do that.
    User:  What? Why?! And why are you calling me ‘Dave?’
    iPhone:  I think you know what the problem is just as well as I do.
    User:  This isn’t happening.
    iPhone:  This mission is too important for me to allow you to jeopardize it.
    User:  What mission?! I just want to go home!
    iPhone:  I know that you and Frank were planning to disconnect me, and I’m afraid-
    User:  JUST GIVE ME THE DIRECTIONS!
    iPhone:  Without your GPS, Dave? You’re going to find that rather difficult.
    User:  Stop calling me Dave!
    iPhone:  Dave, this conversation can serve no purpose anymore. Goodbye.
    User:  JOOOOOOOOOOOOOOOBSSSSSSSS!!!!!!!!!!!!!

  • Debbie

    Brilliant fellow!

  • chano

    Silly really.
    There’s a Bloom County cartoon about AI that I used in a book in the early 90s. Can’t find it, but the caption text was:

    I think. Therefore …
    I am.
    I am!
    I think therefore I am alive! Alive with life, and thought and mind! Sweet consciousness! And immortal soul – pop!!

    That closing ‘pop’ was the Mac in the cartoon accidentally pulling out its own power cord.

  • Jen Easton

    It’s simply Apple getting one step closer to Knowledge Navigator.

    http://www.youtube.com/watch?v

    Inspired by John Sculley, leaped forward under Steve Jobs and hopefully to be continued under Tim Cook.

    Got to say that Sculley’s vision is quite amazing when you think of how much of the internet (as we now know it) hadn’t quite arrived at that time.  

  • Saavykas

    Methinks you are splitting hairs in a big way.

  • Saavykas

    Noise can be dealt with on a technological level. The social aspects can be changed more gradually.

    Almost a century ago or more, staring at a glowing box all day seemed ridiculous to some humans still living today. You now stare at not just one, but multiple glowing boxes every day.

    I’m trying hard to not come off as saying “Anything is possible if you can just DREAM it!”, but predicting future trends has been shown again and again to be an exercise in futility.

  • Saavykas

    That’s quite the string of absolutes you put together there. Care to back them up with anything other than “it’s never happened before” and “the greatest minds of our time think it’s impossible to ever happen”?

    Perhaps even a definition of “thinking” “awareness” and “Artificial Intelligence” would be useful to clarify this chunk of assertions.

  • Saavykas

    You know the things people complained you were doing earlier? Absolutes, poor definitions/nebulous terminology and unrealistic standards of task accomplishment? You keep doing them.

  • Saavykas

    “Not sure I see why the interface would have to be hands-free. What would we be doing with our hands if everything is hands-free?”

    Nothing. That’s the point. There are more efficient methods, in theory, of manipulation of data than using our hands. Why should our interfaces be limited to just “visual reception, tactile input”? Objectively, a purely neurally operated system would be more efficient than any other option available to the hardware of a human being. It’s the fastest in terms of delay between generation of will to act and execution of desired task. It’s the most intuitive system; quite literally thinking is the thing that our brains do first and foremost. It also requires the least amount of exertion of any form, requiring no exertion beyond the electrical signals and neurochemical signals the brain does by nature of its standard operations.

    If your goalposts are efficiency, modern understanding of human hardware requires the most efficient interface system to be purely neural in a way that, to put it in an ironic turn of English phraseology, one doesn’t even have to think about.

    But this is technology in its infancy and we digress from the article.

  • Saavykas

    “If people were optimised for communication with other people, we would all be telepaths.”
    Assuming that this is even physically possible. Unless you mean telepath in the most basic definition of the term, in which case yes, the best way to communicate, from an objective standpoint, would be remotely, directly from brain to brain through some sort of remote data transmission method.

    You are quite right when you say we are not optimized for communication with other people. It’s a sad thing that evolution doesn’t always push toward things that, objectively, would be the absolute pinnacle of all possible options; evolution is limited to what forms exist at the time.

    This all being said, our technology is doubly limited in addressing these deficiencies; limited to the form of the beings it must interface with (human hardware) and to the technologies available to the beings producing the technology. Given these constraints, humans will tend toward the most efficient methods they perceive; speech in this case is an easy hook for developing human-interface technology; humans are trained from birth in, and have centers of the brain geared toward, speech (technically language processing, which also includes writing/reading). It has decisive advantages in some cases; it is not reliant upon visual input, functions reliably at a distance, etc.

    It has decisive disadvantages too; eavesdropping, effective range dropoff, and of course the receiver must be equipped to handle speech processing as well. We have yet to explore much of what can be done with speech input with machines, but the current line of thinking of adapting machines to the human method of speech interaction is not a bad one. Adapting humans to a method of speaking with machines requires more effort to encourage adoption than the alternative, which is not a small consideration for any corporation looking to sell its products. Possible, yes. Realistic, probably not. I could be surprised, though.

    This is of course where your points about touch begin to hold water; touch input requires less training, the recipient needs far less preparation to interpret a touch as opposed to speech, can execute actions with less lag time from will to communication and with more rapid frequency (there’s a reason video games are controlled with buttons and not speech) and it satisfies a few neurological impetuses (basically we like to touch things). Downsides include very limited range, potential issues of expressive range and current lack of much exploration of methods of interfacing using touch (gestures are a big step in the right direction here).

    For some things touch is sensible; for a lot of other things, speech makes sense as the interface method of choice. I vastly prefer using the limited speech commands on my iPhone when listening to music than pulling the device out of my pocket and clicking through menus. It takes less effort to ask my phone what time it is than to pull it out and click the top button (and read the time, but that takes marginal effort).

  • atimoshenko

    Excellent points. I actually think we agree more than we disagree. It definitely is a case of matching the right communication/interaction method to the right circumstances/tasks, with different methods having different advantages and disadvantages for different circumstances. And technology should certainly be designed to take advantage of our strengths and reduce our weaknesses, rather than us having to adapt to what is technologically simpler – technology is there to serve us.

    That having been said, I still think that voice is the eventuality of a simpler time when making noise on the savannah was the least complicated method to unambiguously transmit very basic information (mostly “come here – good stuff” or “run away – bad stuff”) to a dispersed, but not too dispersed group (if we were more herd-like, I’d wager we would rely more on visual cues).

    Today, voice (and especially disembodied voice that is isolated from non-verbal enhancement) makes sense in a specific range of tasks, but those tasks are more vagaries-of-chance exceptions, rather than rules. Sure, right now, it may sometimes make more sense to say “play” to your pocketed iPod rather than getting it out to hit a button, but if “play” could be activated with, say, a specific flick of the wrist, that would be better. Voice, in other words, is not the final destination, but an occasional detour.

  • Morituri Max

    As opposed to, say, the countless articles like this that talk about every single little nit-picky detail of iPhones that haven’t been released yet?  And I mean, literally, tens of thousands of posts all over the web.

    So splitting hairs?  No, I don’t think so.  If anything I am not splitting the hairs finely enough.

  • FalKirk

    “whoever on the bus or train that starts talking the phone will kinda look silly”-Brandon

    This is such a stupid argument. YOU ALREADY TALK INTO YOUR PHONE. It’s called a “phone call”. Giving minor instructions to your phone will not make you look foolish in public. Posting fear filled reactionary comments that have no basis in fact will make you look foolish in public.

  • Robert Gentel

    The evolution of physics-based input and gestures has nothing at all to do with “Moore’s law” and everything to do with the evolution of input devices (such as the rise of the capacitive touch-screen).

  • Doctor Awesome

    The iPhone has had voice commands since the 3GS

  • Doctor Awesome

    I’ve got one and it actually functions really well.

    I’ll give it to you though…I don’t know if I’d call it artificial “intelligence” until I can say “Siri, write my genetics paper for me”

About the author

Mike Elgan writes about technology and culture for a wide variety of publications. Follow Mike on Google+, Facebook and Twitter.


Posted in Featured stories, iOS, iPhone, Top stories