Apple explains why it dropped controversial plan to scan iCloud Photos for CSAM

An Apple executive went in depth on why images in iCloud Photos are not scanned for child sexual abuse material.
Image: Apple

Apple gave a more complete explanation of why it cancelled a plan to scan iCloud Photos libraries for child sexual abuse material. It’s the same reason it gave back in 2022, but with more detail.

It all comes down to user privacy, and the potential for the system to be abused by hackers and repressive governments.

Why Apple’s CSAM scanning plan never happened

Apple’s original plan, announced in 2021, was to use a system called neuralMatch to unearth suspected child abuse images in user photo libraries uploaded to iCloud. It also planned to employ human reviewers to verify that the material was illegal. Any CSAM images located would have been reported to relevant local authorities.

But in late 2022, it dropped this plan. At the time, it said simply, “Children can be protected without companies combing through personal data.”

Recently, a child safety group called Heat Initiative protested the decision and asked Apple to reverse it. Erik Neuenschwander, Apple’s director of user privacy and child safety, responded at length.

In an email also sent to Wired, Neuenschwander said, “Child sexual abuse material is abhorrent.” But he went on to say:

“We decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago for a number of good reasons. After having consulted extensively with child safety advocates, human rights organizations, privacy and security technologists, and academics, and having considered scanning technology from virtually every angle, we concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users.

“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit.

“It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories.

“Scanning systems are also not foolproof and there is documented evidence from other platforms that innocent parties have been swept into dystopian dragnets that have made them victims when they have done nothing more than share perfectly normal and appropriate pictures of their babies.”

The full text of the email exchange between Heat Initiative and Apple is available from Wired.

Apple’s alternative child safety plan

But dropping CSAM scanning of iCloud images does not mean Apple gave up on fighting child exploitation. It has continued to improve the Communication Safety feature that debuted in 2021.

This enables an iPhone to detect when a child receives or sends sexually explicit photos through the Messages app and then warn the user. The process happens entirely on the handset, not on a remote server, and the messages remain encrypted.

Instead of examining pictures stored in iCloud for CSAM, Apple went in the opposite direction, giving users the ability to end-to-end encrypt the pictures in their iCloud Photos libraries via Advanced Data Protection.
