Apple somehow created the world’s greatest and most disappointing tablet operating system. iPadOS is by far the best you’ll find for larger touchscreens, and yet, it leaves us wanting so much more.
This year’s iPadOS 15 release is an incremental upgrade over its predecessor. It improves upon the split-screen multitasking system, adds some new features like Focus mode, and finally allows us to put widgets anywhere.
But it’s still iPadOS as we know it, and it’s still holding back iPad Pro. We could be doing so much more with the hardware, especially now that the newest models pack even speedier M1 chips. But Apple won’t let us.
Here’s our full iPadOS 15 review. It lays out what’s good about the new operating system — and explains why we think it’s time for a little more ambition.
The Electronic Frontier Foundation (EFF) has called on Apple to completely abandon its child safety features after their rollout was delayed.
The group says it is “pleased” Apple’s move is on hold for now. But it calls the plans, which include scanning user images for child sexual abuse material (CSAM), “a decrease in privacy for all iCloud Photos users.”
The EFF’s petition against Apple’s original announcement now contains more than 25,000 signatures. Another, started by groups like Fight for the Future and OpenMedia, contains more than 50,000.
iCloud+ subscribers can now start using custom domain names with iCloud Mail. The new feature, announced at WWDC 2021 alongside iOS and iPadOS 15, just rolled out in beta with support for up to five custom domains.
Apple has confirmed that it started detecting CSAM using image-matching technology in iCloud Mail back in 2019. It says that accounts with CSAM content violate its terms and conditions and will be disabled.
Apple has rolled out a big update to its iCloud for Windows app, finally bringing an iCloud Passwords app to Microsoft’s operating system. The version 12.5 update includes extensions that let you access your passwords in a browser.
Apple employees have begun voicing their concerns over the company’s plan to scan user photos for child sexual abuse material (CSAM), according to a new report. Many are said to have taken to internal Slack channels to express worries over how the feature could be exploited by governments.
“More than 800 messages” have been shared on one channel during a “days-long” discussion about the move. It comes after a number of privacy advocates this week spoke out against Apple’s announcement, calling it mass surveillance and warning that it could set a dangerous precedent.
Apple is defending its plan to scan user photos for child sexual abuse imagery in a newly published FAQ that aims to quell growing concerns from privacy advocates.
The document provides “more clarity and transparency,” Apple said, after noting that “many stakeholders including privacy organizations and child safety organizations have expressed their support” for the move.
The FAQ explains the differences between child sexual abuse imagery scanning in iCloud and the new child-protection features coming to Apple’s Messages app. It also reassures users that Apple will not entertain government requests to expand the features.
Apple plans to scan photos stored on people’s iPhones and in their iCloud accounts for imagery suggesting child abuse, according to news reports Thursday. The effort might aid in law-enforcement investigations, but it also could invite controversial access to user data by government agencies.
Apple’s update to its web page “Expanded Protections for Children” — see under the “CSAM Detection” subheading — appears to make the scanning plan official. CSAM stands for “child sexual abuse material.”