Trying to thumb-type a search query into your iPhone on the run sucks, and it’s slow to boot. Google knows it, which is why they have the Google Search app, allowing you to just dictate your search query when typing is otherwise inconvenient.
But it looks like Apple might have noticed it too. New job postings indicate Apple is looking to improve the native voice recognition capabilities of iOS.
Apple is now searching for four new engineers to focus on iOS’s voice technologies, broken down into the following titles: one iOS Speech Application Engineer, two Speech Recognition Engineers, and a Senior Speech Research Scientist. Candidates must have real-world experience, and Apple specifically calls out those who have worked on “Nuance Recognizer, IBM WebSphere Voice, Google Voice Search.”
There are a lot of obvious possibilities for this tech, but voice recognition input won’t just ease the burden on the onscreen virtual keypad… it’ll also make iOS even more accessible to the visually impaired, an area where it’s already best in class.
Exciting stuff. I can’t wait to see what sort of voice recognition magic Apple bakes into iOS 5.