Apple scans photos uploaded to iCloud to check whether they contain child abuse imagery

Apple's chief privacy officer discussed the tech in a CES panel.
Photo: Jim Merithew/Cult of Mac

Apple’s chief privacy officer says Apple scans photos uploaded to iCloud to check whether they contain child abuse material. Jane Horvath discussed the use of the technology during a Tuesday panel on user privacy at CES.

Horvath didn’t reveal exactly how Apple does this. Many companies, including Facebook, Twitter and Google, already use a Microsoft-developed tool called PhotoDNA, which converts each image into a digital fingerprint and checks it against a database of fingerprints from previously identified child abuse images.
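PhotoDNA itself is proprietary, but the general idea of fingerprint-based matching is simple to illustrate. The Python sketch below uses a basic "average hash" purely for illustration (it is not Microsoft's actual algorithm), and the KNOWN_HASHES database and function names are hypothetical.

```python
# A minimal sketch of fingerprint-based image matching, in the spirit
# of PhotoDNA-style systems. The "average hash" here is a stand-in for
# illustration only, and KNOWN_HASHES is a hypothetical database.

from PIL import Image  # Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 grayscale grid and build a 64-bit
    fingerprint: each bit is 1 if that pixel is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")


# Hypothetical database of fingerprints of previously identified images.
KNOWN_HASHES = set()


def matches_known_image(path: str, threshold: int = 5) -> bool:
    """Flag an upload whose fingerprint sits within a few bits of a
    known one, so minor re-encoding or resizing doesn't defeat a match."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

Real systems use far more robust perceptual hashes and match against hash databases maintained by child-safety organizations, but the principle is the same: compare signatures, not the images themselves.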

During the panel, Horvath said only: “We are utilizing some technologies to help screen for child sexual abuse material.”

It’s not clear when Apple started scanning images in this way. However, last year Apple updated its privacy policy to say it may scan images for child abuse material, and its website now carries a disclaimer stating:

“Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”

Apple scours iCloud images for possible child abuse

Apple’s challenge is balancing law enforcement with privacy. Any decent person supports efforts to crack down on child abuse. But whether it’s acceptable to scan massive amounts of user data to find wrongdoers is an immensely complex question.

Apple has clashed with the FBI over privacy before, most notably in 2016, when it refused to help unlock the San Bernardino shooter’s iPhone. In that instance, Apple came down on the side of keeping users’ data private.

Apple already uses machine learning to identify people and objects in photos, an area it has invested in heavily in recent years. When it comes to spotting child abuse material, however, its technology appears focused on matching uploads against previously reported images rather than on classifying new content.

In 2018, Apple removed Tumblr from the App Store, reportedly because the app contained child pornography that had slipped past Tumblr’s filters.

Source: Telegraph
