CSAM

Read Cult of Mac’s latest posts on CSAM:

Apple explains why it dropped controversial plan to scan iCloud Photos for CSAM

An Apple executive went in depth on why images in iCloud Photos are not scanned for child sexual abuse material.
Image: Apple

Apple gave a more complete explanation of why it canceled its plan to scan iCloud Photos libraries for child sexual abuse material. It's the same reason the company gave back in 2022, but with more detail.

It all comes down to user privacy, and the potential for the system to be abused by hackers and repressive governments.

EU may force tech giants to remove and report child sex abuse images

The European Commission's draft law could force companies to detect, remove and report CSAM.
Photo: European Commission

According to a new report, the European Commission could release a draft law this week requiring tech companies like Apple and Google to better police their platforms for illegal images of child sexual abuse, known as CSAM.

The law would require tech companies to detect, remove and report images to law enforcement.

Apple deletes all mentions of controversial CSAM plan from its website [Update]

What's going on with CSAM scanning?
Photo: Apple

UPDATE 12/16: Apple has told The Verge that its CSAM photo-scanning plan is still on hold, and that plans to roll it out later haven’t changed.

Apple has quietly removed all references to its controversial plan to scan iCloud Photos libraries for child sexual abuse material from its website. Back in August, Cupertino announced its intention to trawl through users' pictures to detect CSAM.

However, after encountering significant criticism from experts, rights groups and even its own employees, Apple shelved the feature. The company said in September that it had “decided to take additional time” to collect input and make improvements to the feature. But it’s now unclear whether it will go ahead with CSAM photo scanning at all.

UK backs Apple’s CSAM plans, offers rewards for new safety measures

Home Secretary Priti Patel wants tech firms to step up and be responsible for child safety.
Photo: Number 10 CC

The U.K. government has backed Apple's plan to scan user photos for child sexual abuse material (CSAM) and is offering rewards of up to £85,000 ($117,600) to other technology firms that can develop new tools to keep children safe.

Home Secretary Priti Patel, who this week announced the Safety Tech Challenge Fund, called on “Big Tech” to take responsibility for public safety and find ways to monitor online platforms protected by encryption.

EFF urges Apple to completely abandon delayed child safety features

'Delays aren't good enough.'
Photo: Wiyre Media CC

The Electronic Frontier Foundation (EFF) has called on Apple to completely abandon its child safety features after their rollout was delayed.

The group says it is “pleased” Apple’s move is on hold for now. But it calls the plans, which include scanning user images for child abuse material (CSAM), “a decrease in privacy for all iCloud Photos users.”

The EFF’s petition against Apple’s original announcement now contains more than 25,000 signatures. Another, started by groups like Fight for the Future and OpenMedia, contains more than 50,000.

Apple delays plan to scan user photos for child abuse material

Apple will take time to "collect input and make improvements."
Photo: Kevin Dooley/Flickr CC

Apple on Friday confirmed it has delayed controversial plans to start scanning user photos for child sexual abuse material, aka CSAM.

The feature was originally scheduled to roll out later this year. Apple now says it will take time to “collect input and make improvements” before deploying the changes. However, the feature is far from canceled altogether.

Apple is already scanning your emails for child abuse material

iCloud Mail accounts are banned for sharing CSAM.
Photo: Apple

Many Apple fans are upset about the company’s plan to start scanning for child abuse material (CSAM) in iCloud Photos uploads later this year. But did you know that Cupertino has already been scanning for CSAM in your emails?

Apple has confirmed that it started detecting CSAM using image matching technology in iCloud Mail back in 2019. It says that accounts with CSAM content violate its terms and conditions and will be disabled.
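Apple hasn't said exactly how the iCloud Mail check works, but "image matching" in this context generally means fingerprinting an attachment and comparing that fingerprint against a database of hashes of known CSAM maintained by child-safety organizations. The sketch below is purely illustrative: the `KNOWN_HASHES` set, `hash_attachment()` and `scan_attachments()` are hypothetical stand-ins, not Apple's implementation, and a real system would use a perceptual hash (so resized or re-encoded copies still match) rather than the cryptographic hash used here to keep the example self-contained.

```python
import hashlib

# Hypothetical database of fingerprints of known CSAM images, as would be
# supplied by a child-safety organization. Production systems use perceptual
# hashes so that near-duplicate images still match; SHA-256 is used here only
# to keep the sketch runnable and self-contained.
KNOWN_HASHES = {hashlib.sha256(b"example known-bad image bytes").hexdigest()}


def hash_attachment(data: bytes) -> str:
    """Fingerprint an email attachment (illustration only)."""
    return hashlib.sha256(data).hexdigest()


def scan_attachments(attachments: list[bytes]) -> bool:
    """Return True if any attachment matches the known-hash database."""
    return any(hash_attachment(data) in KNOWN_HASHES for data in attachments)


# An account whose mail contained a matching attachment would be flagged for
# review and, per Apple's terms and conditions, disabled.
if scan_attachments([b"example known-bad image bytes"]):
    print("Match found: flag account for review")
```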

90+ organizations urge Tim Cook to drop Apple’s photo scanning plan

The largest campaign so far against Apple's new child safety features.
Photo: Benjamin Balázs

An international coalition of more than 90 policy and rights groups is urging Apple to drop plans to scan user photos for child abuse material (CSAM).

In an open letter addressed to Apple CEO Tim Cook, published on Thursday, the coalition said it is concerned the feature “will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for children.”

Apple isn’t done fighting Corellium’s virtual iOS devices just yet

Despite Apple's claims that it supports independent security research.
Photo: Corellium

Just when it looked like Apple’s lengthy legal battle with Corellium was finally over, Cupertino on Tuesday appealed a copyright case it previously lost in an effort to take down the firm’s iPhone virtualization platform.

The news is somewhat surprising after Apple last week settled other claims against Corellium, in what experts called a significant win for security research. And it seems at odds with Apple's stated support for independent security research.

Corellium will support security testing of Apple CSAM scanning feature

It is offering funding and free access to its iPhone virtualization platform.
Photo: Corellium

Security research firm Corellium on Monday revealed its new Open Security Initiative, which will support independent research into the privacy and security of mobile apps and devices. Its first target is Apple’s controversial CSAM scanning feature, set to roll out to iPhone users later this year.

Corellium said it applauds Apple's commitment to holding itself accountable, and it believes its own platform of virtual iOS devices is best suited to supporting any testing efforts. It hopes that researchers will use it to uncover “errors in any component” of Apple’s feature, which could be used to “subvert the system as a whole, and consequently violate iPhone users’ privacy and security.”

Apple employees reportedly join backlash over CSAM photo scanning

Some inside Apple aren't happy with the move.
Photo: Cult of Mac

Apple employees have begun voicing their concerns over the company’s plan to scan user photos for child abuse material (CSAM), according to a new report. Many are said to have taken to internal Slack channels to express worries over how the feature could be exploited by governments.

“More than 800 messages” have been shared on one channel during a “days-long” discussion about the move. It comes after a number of privacy advocates this week spoke out against Apple’s announcement, calling it mass surveillance and warning that it could set a dangerous precedent.

Apple looks to ease CSAM photo scanning concerns with new FAQ

Clearing up the confusion.
Photo: Apple

Apple defends its plan to scan user photos for child sexual abuse imagery in a newly published FAQ that aims to quell growing concerns from privacy advocates.

The document provides “more clarity and transparency,” Apple said, after noting that “many stakeholders including privacy organizations and child safety organizations have expressed their support” for the move.

The FAQ explains the differences between child sexual abuse imagery scanning in iCloud and the new child-protection features coming to Apple’s Messages app. It also reassures users that Apple will not entertain government requests to expand the features.

Edward Snowden, privacy advocates speak out against Apple’s photo scanning plan

A "slippery slope" that could lead to mass surveillance.
Photo: @Privacyfan2021

Whistleblower Edward Snowden and other privacy advocates are speaking out against Apple’s plan to scan user photos for child abuse imagery.

The move will turn everybody’s iPhones into “iNarcs,” Snowden said on Twitter. “If they can scan for kiddie porn today, they can scan for anything tomorrow.” The Electronic Frontier Foundation (EFF) is also against the plan.