Apple on Friday confirmed it has delayed controversial plans to start scanning user photos for child sexual abuse material, aka CSAM.
The feature was originally scheduled to roll out later this year. Apple now says it will take time to “collect input and make improvements” before deploying the changes. The feature, however, is far from canceled.