UPDATE 12/16: Apple has told The Verge that its CSAM photo-scanning plan is still on hold, and that plans to roll it out later haven’t changed.
Apple has quietly removed all references to its controversial plan to scan iCloud Photos libraries for child sexual abuse material from its website. Back in August, Cupertino announced its intention to trawl through users’ pictures to detect CSAM.
However, after encountering significant criticism from experts, rights groups and even its own employees, Apple shelved the feature. The company said in September that it had “decided to take additional time” to collect input and make improvements to the feature. But it’s now unclear whether it will go ahead with CSAM photo scanning at all.
Apple’s controversial CSAM plan
Apple’s original plan was to use a system called NeuralHash to unearth suspected child abuse images in user photo libraries uploaded to iCloud. It also planned to employ human reviewers to verify that the material was illegal.
Once a match was made and verified, Apple planned to report it to the relevant authorities. The company’s intentions were obviously good. But it turns out people weren’t happy about the idea of having their private photos scanned.
Soon after the CSAM plan announcement, Apple faced a barrage of criticism from privacy advocates, rights groups and organizations like the Electronic Frontier Foundation. Even its own employees quietly joined the backlash.
Apple quickly published a more detailed guide to CSAM photo scanning in an effort to quell the concerns, but it made little difference. Just a month after announcing the plan, Apple put it on hold.
Apple’s Child Safety page no longer mentions CSAM
“We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in a statement in September.
It seemed as though Apple still intended to go ahead with the feature, originally supposed to roll out in an iOS 15 update, eventually. Now, all mentions of CSAM are gone from Apple’s Child Safety webpage.
It remains unclear what this means for CSAM scanning. Although Apple seemed determined to push forward with the feature, the latest developments suggest the company might have quietly scrapped the idea entirely.
We asked Apple for clarification and will update this post if we get a response.