Malicious Siri voice commands can hide in regular music

Instead of helping you, Siri could be obeying voice commands concealed in the song you're playing.
Photo: Apple

Suppose you’re listening to some music, then glance over and realize your iPhone has loaded a porn site all on its own. That’s the nightmare scenario researchers say is possible after they proved voice commands can be concealed in songs.

It’s also possible to get Siri to recognize commands given at frequencies outside the range of human hearing.

Researchers at Berkeley just published a paper showing that they successfully concealed voice commands in music. Proof of their CommanderSong concept can be found online, including several “before and after” audio examples for the curious to listen to.
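For the technically curious, here’s a minimal sketch of the general idea, not the researchers’ actual method: the attack mixes a tiny perturbation into the song, kept quiet enough to be hard to hear but shaped (in the real attack, by optimizing against a speech-recognition model) so the recognizer hears a command. The signals and the loudness budget below are made up for illustration.

```python
import numpy as np

# Toy illustration of the adversarial-audio idea behind attacks like
# CommanderSong: mix a small perturbation into a song so the change is
# hard to hear. In the real attack the perturbation is found by
# gradient descent against a speech-recognition model; here we only
# show the amplitude constraint, using stand-in signals.

SAMPLE_RATE = 16_000   # samples per second
EPSILON = 0.005        # max perturbation amplitude (illustrative budget)

# Stand-in for one second of a song (a 440 Hz tone here).
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
song = 0.5 * np.sin(2 * np.pi * 440 * t)

# Stand-in for the perturbation an optimizer would produce.
rng = np.random.default_rng(0)
perturbation = rng.normal(scale=0.01, size=song.shape)

# Clip the perturbation to the audibility budget, then add it in.
perturbation = np.clip(perturbation, -EPSILON, EPSILON)
adversarial = np.clip(song + perturbation, -1.0, 1.0)

# Signal-to-noise ratio of song vs. added noise, in decibels.
snr_db = 10 * np.log10(np.mean(song**2) / np.mean(perturbation**2))
print(f"Perturbation SNR: {snr_db:.1f} dB")  # higher = harder to hear
```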

This is a follow-up to an earlier project that embedded commands in seemingly innocent spoken text. One of the researchers, Nicholas Carlini, posted several examples, including audio samples, on his website.

Hidden voice commands

And the commands to Siri don’t even have to be audible to you. Researchers at Princeton and China’s Zhejiang University created “DolphinAttack,” which they used to order an iPhone to do anything Siri is capable of, from making phone calls to taking pictures.
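Based on the published description of DolphinAttack, the trick is amplitude modulation: the voice command rides on an ultrasonic carrier above the roughly 20 kHz limit of human hearing, and the microphone’s own nonlinearity demodulates it back into audible speech. Here’s a rough sketch with illustrative stand-in signals and parameters:

```python
import numpy as np

# Rough sketch of DolphinAttack-style modulation: amplitude-modulate a
# voice command onto an ultrasonic carrier. Humans can't hear the
# result, but a microphone's nonlinear response recreates a baseband
# copy of the command that Siri can recognize.

SAMPLE_RATE = 192_000   # high rate needed to represent ultrasound
CARRIER_HZ = 25_000     # above the ~20 kHz limit of human hearing

# Stand-in for a one-second voice command waveform in [-1, 1].
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.8 * np.sin(2 * np.pi * 300 * t)  # pretend this is speech

# Classic AM: shift the command up around the ultrasonic carrier,
# so all of the transmitted energy sits near 25 kHz.
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
ultrasonic = (1 + command) / 2 * carrier

# Model the microphone nonlinearity as a squaring term: squaring the
# AM signal produces a baseband copy of the original 300 Hz "speech"
# alongside harmless high-frequency components.
demodulated = ultrasonic**2
```

Squaring the modulated signal expands to a term proportional to the original command at audible frequencies, which is why the phone “hears” speech that the person holding it cannot.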

The group demonstrated DolphinAttack in this video:

While there have been no reports of either of these voice-command exploits happening in the real world, Carlini told The New York Times, “My assumption is that the malicious people already employ people to do what I do.”

But at least now that Apple, Amazon, and Google know that this is possible, they can act to prevent it.
