Edward Snowden, privacy advocates speak out against Apple’s photo scanning plan

Apple photo scanning: a "slippery slope" that could lead to mass surveillance. Photo: @Privacyfan2021

Whistleblower Edward Snowden and other privacy advocates are speaking out against Apple’s plan to scan user photos for child abuse imagery.

The move will turn everybody’s iPhones into “iNarcs,” Snowden said on Twitter. “If they can scan for kiddie porn today, they can scan for anything tomorrow.” The Electronic Frontier Foundation (EFF) is also against the plan.

Apple confirmed on Thursday that it will roll out new child safety features later this year, including scanning for Child Sexual Abuse Material (CSAM) in users’ iCloud Photos libraries. In the interests of privacy, the matching happens locally on the user’s device, using technology called NeuralHash.
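Apple has published only a high-level description of how NeuralHash works, but the general technique it describes, perceptual hashing compared against a database of fingerprints of known images, can be sketched. The Swift snippet below is a minimal illustration under that assumption, not Apple’s implementation: `perceptualHash(of:)`, `knownBadHashes` and `shouldFlag(_:)` are hypothetical names, and the simple byte-level hash stands in for a real perceptual hash, which would map visually similar images to matching values even after resizing or re-encoding.

```swift
import Foundation

// Hypothetical stand-in for a perceptual hash value. Real systems (like
// NeuralHash) derive the hash from image features, so visually similar
// images produce the same or nearby values.
typealias PerceptualHash = UInt64

func perceptualHash(of imageData: Data) -> PerceptualHash {
    // Placeholder: a simple djb2-style hash over raw bytes. A genuine
    // perceptual hash would be robust to resizing, cropping and re-encoding.
    var hash: PerceptualHash = 5381
    for byte in imageData {
        hash = hash &* 33 &+ PerceptualHash(byte)
    }
    return hash
}

// Fingerprints of known abuse imagery, as supplied by NCMEC. In Apple's
// described system this database is blinded on-device; plain values here
// are for illustration only.
let knownBadHashes: Set<PerceptualHash> = [0xDEAD_BEEF]

func shouldFlag(_ imageData: Data) -> Bool {
    // Flag the photo only if its hash matches a known fingerprint,
    // e.g. as it is queued for upload to iCloud Photos.
    knownBadHashes.contains(perceptualHash(of: imageData))
}

// Usage: check a (dummy) photo before upload.
let photo = Data([0x01, 0x02, 0x03])
print(shouldFlag(photo))  // false for this dummy data
```

Apple’s technical summary adds considerably more machinery than this, including a blinded on-device hash database and threshold cryptography, so that matches are only revealed once an account crosses a threshold; none of that is modeled here.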

If CSAM is detected, Apple will report it to the National Center for Missing and Exploited Children (NCMEC), a nonprofit organization that works with U.S. law enforcement agencies. On the face of it, this seems like a good idea: who doesn’t want to keep children safe? But some aren’t happy.

Snowden, the EFF, and others have expressed concerns about Apple’s plan, calling it “mass surveillance” and warning that “a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”

Turning billions of Apple devices into ‘iNarcs’

Of course, nobody is defending those who store or share child abuse images. The concern is that, although Apple’s intentions may be good, scanning user photos is a “slippery slope” that could lead to external pressure for wider surveillance measures.

“No matter how well-intentioned, Apple is rolling out mass surveillance,” Snowden warned. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs — without asking.”

A statement published by the EFF echoed these concerns and said that Apple’s decision “will come at a high price for overall user privacy.”

Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor…

“To say that we are disappointed by Apple’s plans is an understatement,” the statement continued. The EFF called the move a “shocking about-face” from a company that users rely on to be the leader in privacy and security.

What next?

Also from the EFF statement:

We’ve said it before, and we’ll say it again now: It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.

The EFF warned that photo scanning could enable governments and law enforcement agencies to look for a whole host of other content, such as images that could be used as evidence in a criminal case against someone, or content that might be deemed “counterspeech” under certain regimes.

Others speak out

Many others are speaking out against Apple’s photo-scanning plan. “Yikes. This type of surveillance is really Orwellian and destroys Apple’s claim to be a privacy-conscious tech company,” tweeted Ben Spielberg.

“I hate going all slippery-slope, but I look at the slope, and governments around the world are covering it in oil, and Apple just pushed its customers over the edge,” said Sarah Jamie Lewis.

“No one is defending explicit pictures of minors but this is category 5 insane,” tweeted Daniel Bostik. “How long until this is used to scan your phone for anti-government photos? How long till authorities in the Middle East use this to track down LGBT people?”

Some Apple users say they will avoid the company’s products in the future, choosing Android or Linux devices instead. However, others have welcomed Cupertino’s plan.

Not everyone is against it

“I’m against general surveillance, but if they can stick to hunting for child abusers then I’m all for it, if it saves just one child it’ll be worth it,” said Maria Dowler.

“As a father, I think it’s fantastic,” said Justin Davis. “Apple has proven their chops time and time again on encryption and privacy. To find a way to do this safely without compromising privacy is great.”

“Sounds good to me,” added Bill Leonard. “Don’t do weird shit.”

Some users also point out that you can avoid Apple’s photo-scanning measures simply by disabling iCloud Photos. Although the process happens locally on your device, it only checks images that are uploaded to your iCloud Photos library.
