Apple plans to scan iPhones and iCloud for child abuse imagery [Updated]

By

Photo: Kevin Dooley/Flickr CC

Apple plans to scan photos stored on people’s iPhones and in their iCloud accounts for imagery suggesting child abuse, according to news reports Thursday. The effort might aid in law-enforcement investigations, but it also could invite controversial access to user data by government agencies.

Apple’s update to its web page “Expanded Protections for Children” — see under the “CSAM Detection” subheading — appears to make the scanning plan official. CSAM stands for “child sexual abuse material.”

The system Apple is putting in place, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times reported Thursday.

Apple told academics in the United States about the plan this week and could share more “as soon as this week,” said two security researchers briefed on the matter, according to the Financial Times.

Suspect photos compared to database images

The newspaper added that experts “trained” neuralMatch using 200,000 images from the National Center for Missing & Exploited Children. Suspect photos will be hashed and then compared with images of child sexual abuse in a database.
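To give a rough sense of what hash-based matching looks like, here is a minimal sketch in Swift. It assumes SHA-256 as a stand-in for whatever perceptual hash Apple actually uses (the real algorithm has not been published), and the `knownHashes` set and function names are illustrative, not Apple’s API.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for Apple's undisclosed perceptual hash.
// A real CSAM system uses a perceptual hash so that near-identical
// copies of an image still produce a matching value.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical database of known-abuse-image hashes
// (in practice supplied by an organization such as NCMEC).
let knownHashes: Set<String> = []

// A photo is flagged as suspect if its hash appears in the database.
func isSuspect(_ imageData: Data) -> Bool {
    knownHashes.contains(imageHash(imageData))
}
```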

The system will roll out first in the United States and expand to other countries later, FT reported.

“According to people briefed on the plans, every photo uploaded to iCloud in the U.S. will be given a ‘safety voucher,’ saying whether it is suspect or not,” FT said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
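The threshold behavior described above could be sketched like this in Swift. The `SafetyVoucher` struct, its fields and the threshold value are assumptions for illustration; Apple has not published the voucher format or the number of matches that triggers a review.

```swift
import Foundation

// Hypothetical "safety voucher" attached to each uploaded photo.
struct SafetyVoucher {
    let photoID: UUID
    let isSuspect: Bool   // result of the on-device hash match
}

// Assumed value for illustration only; the real threshold is not public.
let reviewThreshold = 10

// Escalate the account for decryption and human review only once
// enough photos have been marked as suspect.
func shouldEscalateForReview(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter { $0.isSuspect }.count >= reviewThreshold
}
```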

Concerns over the system

The Verge noted that Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”

“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”

Apple and others already do something similar

Apple and other major cloud providers already check files against known images of child abuse. But the neuralMatch system goes beyond what they do, as it allows centralized access to local storage, The Verge said.

Observers note that it would be easy to extend the system to other crimes. In a country like China, where Apple does considerable business, that kind of access and legal application could prove worrisome.

Apple on privacy

Apple has made much of its devices’ privacy protections, including recent statements by CEO Tim Cook in a video directed at the privacy-conscious European Union.

The company briefly dominated the news when it resisted the FBI’s demand that it provide the bureau access to an iPhone belonging to a shooter in a 2015 attack in San Bernardino, California.

News outlets reported Thursday that Apple has not responded to requests for comment on its plan to scan images.
