Today in Apple history: iCloud takes our files and photos to the sky

Steve Jobs shows iCloud to the world.
Steve Jobs called iCloud Apple's hard disk in the sky.
Photo: Apple

October 12, 2011: Apple launches iCloud, a service that lets users automatically and wirelessly store content and push it to their various devices.

iCloud’s arrival marks the end of Apple’s Mac-centric “digital hub” strategy — and ushers in an age of inter-device communication and non-localized files.

iPadOS 15 review: Nice improvements, but where’s the ambition?

iPadOS 15 review
Improvements are nice, but just not enough.
Image: Killian Bell/Cult of Mac

Apple somehow created the world’s greatest and most disappointing tablet operating system. iPadOS is by far the best you’ll find for larger touchscreens, and yet it leaves us wanting so much more.

This year’s iPadOS 15 release is an incremental upgrade over its predecessor. It improves upon the split-screen multitasking system, adds some new features like Focus mode, and finally allows us to put widgets anywhere.

But it’s still iPadOS as we know it, and it’s still holding back iPad Pro. We could be doing so much more with the hardware, especially now that the newest models pack even-speedier M1 chips. But Apple won’t let us.

Here’s our full iPadOS 15 review. It lays out what’s good about the new operating system — and explains why we think it’s time for a little more ambition.

EFF urges Apple to completely abandon delayed child safety features

Apple urged to abandon child safety features
'Delays aren't good enough.'
Photo: Wiyre Media CC

The Electronic Frontier Foundation (EFF) has called on Apple to completely abandon its child safety features after their rollout was delayed.

The group says it is “pleased” Apple’s move is on hold for now. But it calls the plans, which include scanning user images for child sexual abuse material (CSAM), “a decrease in privacy for all iCloud Photos users.”

The EFF’s petition against Apple’s original announcement now contains more than 25,000 signatures. Another, started by groups like Fight for the Future and OpenMedia, contains more than 50,000.

Apple is already scanning your emails for child abuse material

iCloud Mail accounts are banned for sharing CSAM.
Photo: Apple

Many Apple fans are upset about the company’s plan to start scanning for child sexual abuse material (CSAM) in iCloud Photos uploads later this year. But did you know that Cupertino has already been scanning for CSAM in your emails?

Apple has confirmed that it started detecting CSAM in iCloud Mail using image-matching technology back in 2019. It says accounts with CSAM content violate its terms and conditions and will be disabled.

Apple employees reportedly join backlash over CSAM photo scanning

Apple logo at an Apple Store in Paris
Some inside Apple aren't happy with the move.
Photo: Cult of Mac

Apple employees have begun voicing their concerns over the company’s plan to scan user photos for child sexual abuse material (CSAM), according to a new report. Many are said to have taken to internal Slack channels to express worries over how the feature could be exploited by governments.

“More than 800 messages” have been shared in one channel during a “days-long” discussion about the move. The backlash comes after a number of privacy advocates spoke out this week against Apple’s announcement, calling the plan mass surveillance and warning that it could set a dangerous precedent.

Apple looks to ease CSAM photo scanning concerns with new FAQ

Apple CSAM photo scanning
Clearing up the confusion.
Photo: Apple

Apple defends its plan to scan user photos for child sexual abuse imagery in a newly published FAQ that aims to quell growing concerns from privacy advocates.

The document provides “more clarity and transparency,” Apple said, after noting that “many stakeholders including privacy organizations and child safety organizations have expressed their support” for the move.

The FAQ explains the differences between child sexual abuse imagery scanning in iCloud and the new child-protection features coming to Apple’s Messages app. It also reassures users that Apple will not entertain government requests to expand the features.

Apple plans to scan iPhones and iCloud for child abuse imagery [Updated]

Apple reportedly will scan images in iPhones and iCloud for hints of child abuse.
Photo: Kevin Dooley/Flickr CC

Apple plans to scan photos stored on people’s iPhones and in their iCloud accounts for imagery suggesting child abuse, according to news reports Thursday. The effort might aid in law-enforcement investigations, but it also could invite controversial access to user data by government agencies.

Apple’s update to its web page “Expanded Protections for Children” — see under the “CSAM Detection” subheading — appears to make the scanning plan official. CSAM stands for “child sexual abuse material.”