The European Commission's draft law could force companies to detect, remove and report CSAM. Photo: European Commission
According to a new report, the European Commission could release a draft law this week requiring tech companies like Apple and Google to better police their platforms for child sexual abuse material, known as CSAM.
The law would require tech companies to detect, remove and report images to law enforcement.
UPDATE 12/16: Apple has told The Verge that its CSAM photo-scanning plan is still on hold, and that plans to roll it out later haven’t changed.
Apple has quietly removed all references to its controversial plan to scan iCloud Photos libraries for child sexual abuse material from its website. Back in August, Cupertino announced its intention to trawl through users’ pictures to detect CSAM.
However, after encountering significant criticism from experts, rights groups and even its own employees, Apple shelved the feature. The company said in September that it had “decided to take additional time” to collect input and make improvements before rolling it out. But it’s now unclear whether it will go ahead with CSAM photo scanning at all.
UK Home Secretary Priti Patel, who this week announced the Safety Tech Challenge Fund, called on “Big Tech” to take responsibility for public safety and to find ways to monitor online platforms protected by encryption.
The Electronic Frontier Foundation (EFF) has called on Apple to completely abandon its child safety features after their rollout was delayed.
The group says it is “pleased” Apple’s move is on hold for now. But it calls the plans, which include scanning user images for child abuse material (CSAM), “a decrease in privacy for all iCloud Photos users.”
The EFF’s petition against Apple’s original announcement now contains more than 25,000 signatures. Another, started by groups like Fight for the Future and OpenMedia, contains more than 50,000.
Apple will take time to "collect input and make improvements." Photo: Kevin Dooley/Flickr CC
Apple on Friday confirmed it has delayed controversial plans to start scanning user photos for child sexual abuse material, aka CSAM.
The feature was originally scheduled to roll out later this year. Apple now says it will take time to “collect input and make improvements” before deploying the changes. However, that doesn’t mean the feature has been canceled altogether.
iCloud Mail accounts are banned for sharing CSAM. Photo: Apple
Many Apple fans are upset about the company’s plan to start scanning for child abuse material (CSAM) in iCloud Photos uploads later this year. But did you know that Cupertino has already been scanning for CSAM in your emails?
Apple has confirmed that it started detecting CSAM using image matching technology in iCloud Mail back in 2019. It says that accounts with CSAM content violate its terms and conditions and will be disabled.
The largest campaign so far against Apple's new child safety features. Photo: Benjamin Balázs
An international coalition of more than 90 policy and rights groups is urging Apple to drop plans to scan user photos for child abuse material (CSAM).
In an open letter addressed to Apple CEO Tim Cook, published on Thursday, the coalition said it is concerned the feature “will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for children.”
The move comes despite Apple's claims that it supports independent security research. Photo: Corellium
Just when it looked like Apple’s lengthy legal battle with Corellium was finally over, Cupertino on Tuesday appealed a copyright case it previously lost in an effort to take down the firm’s iPhone virtualization platform.
The news is somewhat surprising after Apple last week settled other claims against Corellium, in what experts called a significant win for security research. And it contradicts Apple’s stated support for independent security research.
Corellium is offering funding and free access to its iPhone virtualization platform. Photo: Corellium
Security research firm Corellium on Monday revealed its new Open Security Initiative, which will support independent research into the privacy and security of mobile apps and devices. Its first target is Apple’s controversial CSAM scanning feature, set to roll out to iPhone users later this year.
Corellium said it applauds Apple’s commitment to holding itself accountable, and believes its own platform of virtual iOS devices is best suited to supporting any testing efforts. It hopes that researchers will use it to uncover “errors in any component” of Apple’s feature, which could be used to “subvert the system as a whole, and consequently violate iPhone users’ privacy and security.”
Some inside Apple aren't happy with the move. Photo: Cult of Mac
Apple employees have begun voicing their concerns over the company’s plan to scan user photos for child abuse material (CSAM), according to a new report. Many are said to have taken to internal Slack channels to express worries over how the feature could be exploited by governments.
“More than 800 messages” have been shared on one channel during a “days-long” discussion about the move. It comes after a number of privacy advocates this week spoke out against Apple’s announcement, calling it mass surveillance and warning that it could set a dangerous precedent.