Apple plans to scan photos stored on people’s iPhones and in their iCloud accounts for imagery suggesting child abuse, according to news reports Thursday. The effort might aid law-enforcement investigations, but it could also invite controversial government access to user data.
An update to Apple’s “Expanded Protections for Children” web page, under the “CSAM Detection” subheading, appears to make the scanning plan official. CSAM stands for “child sexual abuse material.”