90+ organizations urge Tim Cook to drop Apple’s photo scanning plan

Groups oppose Apple photo scanning
The largest campaign so far against Apple's new child safety features.
Photo: Benjamin Balázs

An international coalition of more than 90 policy and rights groups is urging Apple to drop plans to scan user photos for child sexual abuse material (CSAM).

In an open letter addressed to Apple CEO Tim Cook, published on Thursday, the coalition said it is concerned the feature “will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for children.”

Apple has faced an overwhelming backlash against its controversial new child safety features, which will warn children when they attempt to view nude imagery in the Messages app, and scan for CSAM in iCloud Photos.

A growing number of users and privacy advocates have voiced their concerns about the features, with some threatening to ditch Apple devices entirely. Apple’s own employees have also joined the backlash.

Now, Apple faces the largest campaign against the move so far.

Rights groups join fight against photo scanning

The letter not only calls on Cook and Apple to scrap the new child safety features, scheduled to roll out later this year, but also explains why the features put children and other users at risk, “both now and in the future.”

Like others, the group warns of potential censorship and surveillance pitfalls. It also highlights a number of child safety risks it believes Apple may have overlooked by assuming that all children are protected by their parents.

“The undersigned organisations committed to civil rights, human rights, and digital rights around the world are writing to urge Apple to abandon the plans it announced on 5 August 2021 to build surveillance capabilities into iPhones, iPads, and other Apple products,” the letter begins.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

A threat to child safety?

“Algorithms designed to detect sexually explicit material are notoriously unreliable,” the coalition explains. “They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child.”

“Moreover, the system Apple has developed assumes that the ‘parent’ and ‘child’ accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing.”

The features mean iMessage will no longer provide confidentiality and privacy to users who need it, the letter says. It also warns that once the “backdoor feature” is built in, governments could compel Apple to “detect images that are objectionable for reasons other than being sexually explicit.”

A ‘foundation for censorship, surveillance, and persecution’

On scanning user photos uploaded to iCloud, the group says it stands firmly against the proliferation of CSAM, but warns that Apple’s plan lays “the foundation for censorship, surveillance, and persecution on a global basis.”

Apple “will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable,” it says.

“Those images may be of human rights abuses, political protests, images companies have tagged as ‘terrorist’ or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them.”

The letter ends with a plea to Apple to scrap its new child safety features, and reaffirm its commitment to protecting user privacy. It also urges the company to consult with civil society groups, and with vulnerable communities “who may be disproportionately impacted” by such moves.

You can read the full letter, along with the list of signatories, on the Center for Democracy & Technology website.

Via: Reuters
