Apple delays plan to scan user photos for child abuse material

Apple will take time to "collect input and make improvements."
Photo: Kevin Dooley/Flickr CC

Apple on Friday confirmed it has delayed controversial plans to start scanning user photos for child sexual abuse material, aka CSAM.

The feature was originally scheduled to roll out later this year. Apple now says it will take time to “collect input and make improvements” before deploying the changes. However, the feature is far from canceled.

Apple looks to ease CSAM photo scanning concerns with new FAQ

Clearing up the confusion.
Photo: Apple

Apple defends its plan to scan user photos for child sexual abuse imagery in a newly published FAQ that aims to quell growing concerns from privacy advocates.

The document provides “more clarity and transparency,” Apple said, after noting that “many stakeholders including privacy organizations and child safety organizations have expressed their support” for the move.

The FAQ explains the differences between child sexual abuse imagery scanning in iCloud and the new child-protection features coming to Apple’s Messages app. It also reassures users that Apple will not entertain government requests to expand the features.

Edward Snowden, privacy advocates speak out against Apple’s photo scanning plan

A "slippery slope" that could lead to mass surveillance.
Photo: @Privacyfan2021

Whistleblower Edward Snowden and other privacy advocates are speaking out against Apple’s plan to scan user photos for child abuse imagery.

The move will turn everybody’s iPhones into “iNarcs,” Snowden said on Twitter. “If they can scan for kiddie porn today, they can scan for anything tomorrow.” The Electronic Frontier Foundation (EFF) has also come out against the plan.