
Apple acquires Q.ai in $2 billion bet on next-gen AI ‘silent speech’

Q.ai technology could make Siri much easier to talk to.
Image: ChatGPT/Cult of Mac

Apple just paid $2 billion for Q.ai, a stealth-mode Israeli artificial intelligence startup whose technology could dramatically enhance the Siri voice assistant. The company develops advanced technologies that improve human/computer communication beyond basic speech recognition.

It’s the second-largest acquisition in Apple history.

Siri could become the home of Q.ai speech recognition 

Apple is working hard to upgrade Siri for the new AI era. The company reportedly plans to add a chatbot to the voice assistant later this year, allowing users to have conversations with it.

But using voice to interact with AI quickly gets frustrating if the computer can’t hear or follow what the person is saying. That’s where Q.ai comes in. Its machine-learning tech helps devices understand whispered speech and better interpret sound in noisy or difficult environments.

“In an age where human communication is everything, we found a way to take it to the next level, enabling super high bandwidth, unprecedented privacy, accessibility, multilingualism and much more,” Q.ai said of itself. “Biology can only take us so far. Q will do the rest.”

Apple liked the tech so much that it bought the company, paying $2 billion for it, the Financial Times reported on Thursday. That’s not far behind the $3 billion Apple paid for Beats Electronics in 2014, still the most it has ever spent on an acquisition.

‘Silent speech’ could be critical for Apple’s AI pin and smart glasses

Tech that lets Siri understand speech in noisy environments will be great for iPhone users, but it could be an even bigger boon for the AI pin that Apple is reportedly working on. With no screen at all, that device will depend entirely on voice communication. And being able to follow whispers would let users keep interacting with the AI when they can’t speak loudly, such as in a classroom.

Q.ai also worked on “silent speech”: systems that detect and interpret the tiny facial skin movements and micro-expressions that occur when a person mouths words, even without audible sound. These signals can be decoded into spoken content or commands.

This tech could be ideal for the AI-powered smart glasses that Apple is also reportedly developing. The glasses could watch the wearer’s face and let them interact with the AI without using their hands or making a sound.

