UPDATE 12/16: Apple has told The Verge that its CSAM photo-scanning feature is still on hold and that its plan to roll it out at a later date hasn’t changed.
Apple has quietly removed from its website all references to its controversial plan to scan iCloud Photos libraries for child sexual abuse material (CSAM). Back in August, Cupertino announced its intention to trawl through users’ pictures to detect such material.
However, after encountering significant criticism from experts, rights groups and even its own employees, Apple shelved the feature. The company said in September that it had “decided to take additional time” to collect input and make improvements to the feature. But it’s now unclear whether it will go ahead with CSAM photo scanning at all.