Apple is already scanning your emails for child abuse material

iCloud Mail accounts are banned for sharing CSAM.
Photo: Apple

Many Apple fans are upset about the company’s plan to start scanning for child abuse material (CSAM) in iCloud Photos uploads later this year. But did you know that Cupertino has already been scanning for CSAM in your emails?

Apple has confirmed that it started detecting CSAM using image matching technology in iCloud Mail back in 2019. It says that accounts with CSAM content violate its terms and conditions and will be disabled.

Much has been said about Apple’s CSAM photo scanning plan. Privacy advocates don’t like it. Some of Apple’s own employees have voiced their concerns about it. Rights organizations have urged Tim Cook to kill it before it even starts rolling out to iPhone users in the U.S.

It seems, however, that CSAM scanning isn't new at Apple. Although the company hasn't scanned our iCloud Photos in the past, it has been quietly checking our iCloud Mail for child abuse material.

Apple confirms it checks iCloud Mail for CSAM

“Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019,” writes Ben Lovejoy for 9to5Mac. “Apple also indicated that it was doing some limited scanning for other data, but would not tell me what that was.”

The vast majority of iCloud Mail users likely had no idea this was happening, but Apple didn’t keep it a secret. An archived version of its child safety website states that “Apple uses image matching technology to help find and report child exploitation.”

“Much like spam filters in email, our system uses electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”
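Apple has never detailed how those “electronic signatures” work, but the description suggests comparing attachments against a database of signatures derived from known child exploitation imagery, then routing any hits to a human reviewer. The sketch below is purely illustrative, not Apple's actual system: it stands in SHA-256 digests for the signatures, and the `knownSignatures` set and `attachmentMatchesKnownSignature` function are hypothetical names invented for this example.

```swift
import Foundation
import CryptoKit

// Hypothetical blocklist of known-image digests. In a real system, the
// signatures would come from a child safety organization's database and
// would likely be perceptual hashes, not plain SHA-256 digests.
// (The entry below is the SHA-256 of empty data, so the placeholder
// attachment further down will match it.)
let knownSignatures: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Returns true if an attachment's digest matches a known signature.
func attachmentMatchesKnownSignature(_ attachment: Data) -> Bool {
    let digest = SHA256.hash(data: attachment)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownSignatures.contains(hex)
}

// Scan a mailbox's attachments and flag matches for human review,
// mirroring the "validate each match with individual review" step
// Apple describes.
let attachments: [Data] = [Data()] // placeholder attachment data
let flagged = attachments.filter(attachmentMatchesKnownSignature)
print("Attachments flagged for review: \(flagged.count)")
```

The key design point in Apple's description is that a signature match alone doesn't disable an account; each match is checked by a person before any action is taken.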

Apple chief privacy officer Jane Horvath also confirmed the practice at CES in January 2020. “We are utilizing some technologies to help screen for child sexual abuse material,” she said during a panel discussion, without providing further details on the technology Apple was using at the time.

Apple steps up CSAM scanning

The reasons behind Apple’s decision to expand CSAM scanning still aren’t completely clear. But according to a conversation between Apple employees in 2020, uncovered as part of the company’s ongoing legal battle against Epic Games, anti-fraud chief Eric Friedman described Apple as “the greatest platform for distributing child porn.”

Despite that statement, it is believed that the total number of CSAM cases Apple uncovers in iCloud Mail each year is “measured in the hundreds.” Considering the billions of Apple devices in use globally, that doesn't seem especially significant, although even a single case is entirely unacceptable.

The expansion could have something to do with the fact that a number of Apple's rivals, including Microsoft, already scan for CSAM. Apple may have felt it wouldn't look good if other platforms worked to stamp out CSAM while Apple turned a blind eye.
