Cult of Mac

Apple drops controversial plan to scan iCloud Photos for CSAM

Images in iCloud Photos will not be scanned for child sexual abuse material.
Image: Apple

Apple has completely abandoned its previously announced plan to scan iCloud Photos libraries for child sexual abuse material. The company will not comb through users’ pictures on its cloud-storage servers looking for CSAM images.

Instead, Apple is moving in the opposite direction by enabling users to encrypt pictures stored in iCloud Photos.
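To make that concrete: end-to-end encryption means a photo is sealed on the user’s device with a key Apple never holds, so iCloud stores only ciphertext it cannot read. Here is a minimal Swift sketch of that idea using Apple’s CryptoKit framework. The function names and key handling are illustrative assumptions only, not Apple’s actual iCloud Photos implementation (Advanced Data Protection works at the service level, with no public API like this).

```swift
import CryptoKit
import Foundation

// Minimal sketch of client-side encryption: the photo is sealed with a key
// that only the user's devices hold, so the server stores ciphertext it
// cannot read. Illustrative only -- not Apple's actual iCloud implementation.
func encryptPhotoForUpload(_ photoData: Data, using key: SymmetricKey) throws -> Data {
    // AES-GCM provides confidentiality and integrity in a single pass.
    let sealedBox = try AES.GCM.seal(photoData, using: key)
    return sealedBox.combined! // nonce + ciphertext + authentication tag
}

func decryptDownloadedPhoto(_ blob: Data, using key: SymmetricKey) throws -> Data {
    let sealedBox = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(sealedBox, using: key)
}

// A fresh 256-bit key. In practice such a key would be derived and synced
// securely between the user's own devices, never sent to the server.
let deviceKey = SymmetricKey(size: .bits256)
```

The design point that matters here is where the key lives: because it stays on the user’s devices, the server has nothing it could scan.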

Windows 11 Photos app can now access images in iCloud

More Apple services coming to Windows 11
Windows users get a taste of sweet iCloud Photos goodness.
Screenshot: Microsoft

Thanks to cooperation between Apple and Microsoft, the Windows 11 Photos app is now able to access photos and videos from iCloud.

This is one of several recent cross-platform collaboration moves by the two tech titans.

5 iOS 16.1 features to try right away

iOS 16.1 new features
iOS 16.1 packs plenty of useful features.
Image: Cult of Mac

Although Apple announced iOS 16 at WWDC22 and released it on September 12, not all promised features made their way into the first public build.

With iOS 16.1, which became available Monday, Apple delivered many of those promised features. Once you install it on your iPhone, check out these five iOS 16.1 features you should try right away.

Apple delays plan to scan user photos for child abuse material

Apple will take time to "collect input and make improvements."
Photo: Kevin Dooley/Flickr CC

Apple on Friday confirmed it has delayed controversial plans to start scanning user photos for child sexual abuse material, aka CSAM.

The feature was originally scheduled to roll out later this year. Apple now says it will take time to “collect input and make improvements” before deploying the changes. However, the feature is far from canceled.
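For background on how the proposed feature was designed to work: Apple planned to compute a perceptual “NeuralHash” of each photo on-device and compare it against a database of known CSAM hashes before upload. The Swift sketch below shows only the basic matching flow, with an exact SHA-256 digest standing in for the perceptual hash and a plain placeholder set standing in for Apple’s blinded hash database. Unlike NeuralHash, an exact digest would flag only byte-identical files.

```swift
import CryptoKit
import Foundation

// Toy stand-in for the proposed on-device matching. Apple's design used a
// perceptual "NeuralHash" compared against an encrypted database of known
// CSAM hashes via private set intersection; this sketch substitutes an
// exact SHA-256 digest and a plain set, purely to show the matching flow.
let knownImageHashes: Set<String> = [
    // Placeholder entry (the SHA-256 of empty data). Real hashes would
    // ship with the OS in blinded form, not as a readable list.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
]

func matchesKnownImage(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}

// Example: an empty Data blob matches the placeholder hash above.
print(matchesKnownImage(Data())) // true
```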

Apple looks to ease CSAM photo scanning concerns with new FAQ

Apple CSAM photo scanning
Clearing up the confusion.
Photo: Apple

Apple defends its plan to scan user photos for child sexual abuse imagery in a newly published FAQ that aims to quell growing concerns from privacy advocates.

The document provides “more clarity and transparency,” Apple said, after noting that “many stakeholders including privacy organizations and child safety organizations have expressed their support” for the move.

The FAQ explains the differences between child sexual abuse imagery scanning in iCloud and the new child-protection features coming to Apple’s Messages app. It also reassures users that Apple will not entertain government requests to expand the features.

Edward Snowden, privacy advocates speak out against Apple’s photo scanning plan

Apple photo scanning
A "slippery slope" that could lead to mass surveillance.
Photo: @Privacyfan2021

Whistleblower Edward Snowden and other privacy advocates are speaking out against Apple’s plan to scan user photos for child abuse imagery.

The move will turn everybody’s iPhones into “iNarcs,” Snowden said on Twitter. “If they can scan for kiddie porn today, they can scan for anything tomorrow.” The Electronic Frontier Foundation (EFF) also opposes the plan.