Apple delays plan to scan user photos for child abuse material

Apple will take time to "collect input and make improvements."
Photo: Kevin Dooley/Flickr CC

Apple on Friday confirmed it has delayed controversial plans to start scanning user photos for child sexual abuse material, aka CSAM.

The feature was originally scheduled to roll out later this year. Apple now says it will take time to “collect input and make improvements” before deploying the changes. However, the feature is far from canceled.

Apple’s initial plan, announced last month, was to scan all images uploaded to iCloud Photos for known CSAM. When a match was found, the image would be reviewed by a human before being reported to authorities.
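
For a rough sense of how that kind of hash matching works, here is a minimal Swift sketch. It is not Apple's implementation: the type and property names are invented, an ordinary SHA-256 digest stands in for Apple's NeuralHash perceptual hash, and a plain in-memory set stands in for the encrypted database of known-CSAM hashes. The announced system also involved on-device matching, safety vouchers and a match threshold, none of which is modeled here.

```swift
import Foundation
import CryptoKit

// Minimal illustrative sketch only, not Apple's actual system.
// SHA-256 stands in for a perceptual hash; a local Set stands in
// for the database of known-image hashes.
struct PhotoMatcherSketch {
    let knownHashes: Set<Data>   // hypothetical known-image hash database

    // Hash the photo bytes and report whether they match a known entry.
    func matches(_ photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        return knownHashes.contains(Data(digest))
    }
}

// A positive match would only flag the image for human review,
// not trigger an automatic report.
let matcher = PhotoMatcherSketch(knownHashes: [])
let needsHumanReview = matcher.matches(Data("example photo bytes".utf8))
print("Flag for human review:", needsHumanReview)
```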

Apple also announced features for devices used by children that would warn them before they opened or attempted to share content identified as sexually explicit. But the plans did not sit well with many Apple fans.

Apple delays CSAM plan

Users, privacy advocates and some Apple employees voiced their concerns over CSAM detection — and the potential for it to be expanded to detect other content later under outside pressure.

Apple now says the plan is on hold while it takes time to improve it. A statement issued to Cult of Mac on Friday read:

Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Apple also updated its original press release announcing the feature to include this statement.

The fight against CSAM scanning

Earlier this month, a group of more than 90 organizations wrote an open letter to Apple CEO Tim Cook, urging him to cancel the CSAM scanning plan. It also urged Apple to enter into discussions with civil society groups and vulnerable communities before rolling out similar features.

Prior to this, Apple attempted to ease concerns about CSAM scanning. The company published a lengthy FAQ that described its new child-safety features in more detail and laid out exactly how each one would work.

That document did little to quell the concerns, however. Many users still threatened online to boycott Apple devices, while experts continued to warn about the plan’s potential pitfalls.

Apple’s decision to delay the plan confirms the company heard those concerns. It seems, however, that its plan to scan photos for CSAM in the future is far from dead.
