Apple wants you to know that, at least for now, it has stopped listening to Siri queries made by users. It’s the right move to make. But it’s the unnecessary result of a backlash Apple brought upon itself.
The Siri eavesdropping controversy perfectly illustrates why Apple needs to be more transparent with users — even if that means sacrificing some ease of use.
Apple came under fire after a whistleblower revealed that contractors listened to recordings of users’ interactions with Siri. The recordings included “confidential medical information, drug deals, and recordings of couples having sex,” according to The Guardian.
The revelations rattled some users and flew in the face of Apple’s pro-privacy stance. The company subsequently reversed its policy and issued a statement saying it remains “committed to delivering a great Siri experience while protecting user privacy.”
The limits of ‘just working’
How did this happen at a company that increasingly uses privacy as a selling point?
For starters, simplicity is baked into Apple’s DNA. It’s the tech company whose products “just work,” where nothing takes two button presses when one will do. Apple makes computing easy.
So what does that have to do with Apple’s latest Siri problem?
Apple was one of the first companies to jump on voice assistant technology. The company realized how compellingly simple it was for people to speak to their computers. No menus, tabs or opening apps. Just speaking in everyday language. There’s something magical about it. Apple knew this back in the 1980s and worked to make it a reality. Siri as we know it today finally shipped with the iPhone 4s.
Before Siri, the idea of talking to a computer seemed faintly threatening. The most common reference point was HAL 9000, the murderous A.I. from Stanley Kubrick’s 2001: A Space Odyssey.
To counter that image, Apple worked to make Siri as likable and straightforward as possible. Telling users that certain random queries would be listened to and transcribed in a shadowy room somewhere did not fit with Apple’s fun approach to Siri. In fact, it’s scary.
Still, Apple didn’t exactly cover up the fact that this was going on. Anyone with any knowledge of modern A.I. systems knows they use copious amounts of training data. Those people could likely figure out what Apple was up to. Apple even mentioned it in a white paper (.pdf). Heck, maybe Apple buried a coded version of the revelation somewhere in the mountains of small print in the various user agreements we accept without reading. But it did not put this information front and center.
Apple made a call on behalf of its users, attempting to better their overall experience, without spilling too many details of the messy part of the “sausage-making” process.
Remind you of anything?
When I first heard about the Siri revelations, I was immediately reminded of another Apple controversy from a couple of years ago: the iPhone-throttling accusations. In both cases, Apple’s mistake arguably wasn’t the technological solution it initially put into place. It was Apple’s misjudged lack of transparency.
In December 2017, we learned that Apple intentionally slowed the processing speed of iPhones with older batteries. While designed to prevent unwanted crashes, the throttling fed into the narrative that Apple engages in skullduggery to push users to upgrade their iPhones. Lawsuits followed.
In the end, Apple introduced a cheaper battery-replacement program, along with a new software feature for switching throttling on and off.
Apple got a bunch of bad publicity and looked secretive. If the company had given users full information from the start, it wouldn’t have faced the same problems.
Apple needs to do better
In the Siri case, Apple is behaving no differently from Google and Amazon, both of which listen to some user requests. This provides training data that, ultimately, makes it possible to improve the services.
The problem is that Apple didn’t do a good enough job of explaining what it was doing. When a program crashes on your Mac, the computer asks if you want to send a crash report. Upon installing an app, Apple asks if you want to receive notifications. When you call a helpline, the company tells you explicitly that some calls are recorded for training and quality purposes.
Customers get a clear choice. When it comes to speaking to a robot assistant, however, people seem surprised about actual human involvement. Finding out that people might be listening in is not a good look for a company as privacy-conscious as Apple.
Apple stops Siri eavesdropping — for now
According to Apple’s statement, it will suspend its current practice of listening to select Siri queries for “grading” purposes. It will also release a future software update giving users the ability to choose whether they participate.
Since stopping humans from listening to (and grading) Siri requests will result in an inferior service, this second approach seems the smartest one. But you know what would have been even smarter? Offering this option from day one.
Hopefully, Apple will learn from this experience. Apple’s pro-privacy stance is great. But so is transparency. If Apple’s really going to be the “force for good” that CEO Tim Cook wants it to be, this is just as important. Even if it means the occasional disruption to a streamlined user experience.