According to a new report, the European Commission could release a draft law this week requiring tech companies like Apple and Google to better police their platforms for child sexual abuse material, known as CSAM.
The law would require tech companies to detect, remove and report images to law enforcement.
EU draft law set to force tech companies to remove and report CSAM
According to a leaked copy of the proposed law obtained by Politico, the EC believes the voluntary measures some digital companies have put in place to curb CSAM have “proven insufficient.”
So the commission wants to make detection, removal and reporting of such material mandatory.
Groups representing tech companies and children’s rights organizations, after months of lobbying, are waiting to see how stringent the proposed law’s rules will be.
A question also remains about how the rules could work without requiring tech companies to scan all user content, a practice the Court of Justice of the European Union ruled illegal in 2016.
In addition, privacy groups and tech companies worry that the EU legislative action could result in the creation of backdoors into end-to-end encrypted messaging services, where content sits outside the control of the hosting platform.
EC Home Affairs Commissioner Ylva Johansson has said technical solutions exist to find illegal content while keeping conversations safe, but cybersecurity experts disagree.
“The EU shouldn’t be proposing things that are technologically impossible,” Ella Jakubowska told Politico. Jakubowska is policy adviser at European Digital Rights (EDRi), a network of 45 non-governmental organizations (NGOs).
“The idea that all the hundreds of millions of people in the EU would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented,” said Jakubowska.
Members of the European Parliament (MEPs) express differing views on the issue. Centrist Renew Europe MEP Moritz Körner told Politico the EC proposal would mean “the privacy of digital correspondence would be dead.”
Debate recalls Apple’s controversial CSAM scanning plans
The debate recalls last year’s controversy around Apple’s plan to search for CSAM on iPhones and iPads.
In August 2021, Apple announced a suite of new child safety features, including scanning users’ iCloud Photos libraries for CSAM and a communication safety feature that warns children and their parents when they receive or send sexually explicit photos. The communication safety feature is already live on Apple’s iMessage platform, but Apple has not deployed CSAM scanning.
Apple’s plans — especially the CSAM scanning — drew criticism from security researchers, the Electronic Frontier Foundation (EFF), Facebook’s former security chief, politicians, policy groups, university researchers and even some Apple employees. The scanning was derided as a form of surveillance and its effectiveness at identifying images was also called into question.
After initially trying to reassure critics, Apple decided to delay the rollout of CSAM scanning.
Apple said its decision to delay was “based on feedback from customers, advocacy groups, researchers and others … we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
In December 2021, Apple quietly removed all mentions of CSAM from its Child Safety webpage. But Cupertino said its plans for CSAM detection had not changed since September, suggesting the feature could still be coming.