One of many hidden new features in iOS 14 is Back Tap, which lets you trigger shortcuts by tapping the back of your iPhone. It’s an accessibility option that can be used for things like returning to the Home screen, snapping a screenshot, muting your device, and more. Here’s how it works.
iOS 14 and iPadOS 14 have an impressive accessibility feature that listens for sounds like running water, a person knocking on the door, smoke alarms, babies crying, and more — and then warns users with an on-screen notification.
It’s an incredibly smart feature, based on machine learning technology, that could range from useful to life-saving. Who says that always-listening tech has to be limited to “Hey, Siri”?
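Apple doesn’t say exactly how Sound Recognition is built, but its public SoundAnalysis framework gives a feel for how this kind of on-device listening works. Here’s a minimal sketch, assuming a hypothetical Core ML sound classifier called SirenClassifier and that microphone permission has already been granted:

```swift
import AVFoundation
import CoreML
import SoundAnalysis

// Receives classification results as the analyzer processes audio.
final class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        // Only surface confident detections, e.g. to post a notification.
        if top.confidence > 0.8 {
            print("Heard \(top.identifier) (\(Int(top.confidence * 100))%)")
        }
    }

    func request(_ request: SNRequest, didFailWithError error: Error) {
        print("Analysis failed: \(error)")
    }
}

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Stream analyzer matched to the microphone's audio format.
let analyzer = SNAudioStreamAnalyzer(format: format)
let observer = SoundObserver()

// SirenClassifier is a stand-in for any Core ML sound-classification model.
let model = try SirenClassifier(configuration: MLModelConfiguration()).model
let request = try SNClassifySoundRequest(mlModel: model)
try analyzer.add(request, withObserver: observer)

// Feed microphone buffers into the analyzer as they arrive.
input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
    analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
}
try engine.start()
```

Everything in that sketch runs on-device, which is what makes an always-listening feature like this palatable.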
Thanks to unprecedented early leaks, some of the biggest new features planned for iOS 14 have already been spoiled. Apple is reportedly making some huge changes to the Home screen, Messages, HomeKit, Apple Pencil and much more in its next-gen mobile operating system.
The recent wave of leaks proved so overwhelming that we rounded them all up in one place. We will keep updating the list as we inch closer to this summer’s Worldwide Developers Conference, where Apple traditionally previews all of its upcoming platform updates.
Did you know you can control your iPad using just a keyboard? You can use the arrow keys to move between icons on the Home screen, use those same keys to scroll through lists, and even activate and toggle buttons using the space bar. Apple added this capability via iOS 13.4’s new Full Keyboard Access feature, and it’s wild.
How wild? How about offering system-wide, custom keyboard shortcuts for running actual Shortcuts? And that’s just the beginning.
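Full Keyboard Access itself is a system setting, not something developers switch on. But the building block iPad apps use to offer their own keyboard shortcuts is UIKit’s UIKeyCommand, and a small sketch shows the idea; the view controller and its refresh action here are hypothetical names:

```swift
import UIKit

class EditorViewController: UIViewController {
    // Allow this controller to receive key presses while on screen.
    override var canBecomeFirstResponder: Bool { true }

    // Declare the shortcuts this screen responds to. The system lists
    // them in the iPad's discoverability overlay (hold down ⌘).
    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(title: "Refresh",
                      action: #selector(refresh),
                      input: "r",
                      modifierFlags: .command)]
    }

    @objc func refresh() {
        // Hypothetical handler: reload whatever this screen displays.
        print("Refreshing…")
    }
}
```

Full Keyboard Access layers on top of this, letting you drive even apps that never declared a single key command of their own.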
On the Mac, hot corners are essential — and amazingly useful. You can put your display to sleep, trigger Mission Control and more, just by flicking the mouse to a screen corner. If you’re one of those people who likes to use a mouse with your iPad, you can use these same flick-to-activate gestures on the tablet. And there’s a bonus: Hot corners on the iPad are way, way more powerful than on the Mac.
I prefer the Mac’s trackpad to a mouse in every way but one. It’s more comfortable, it relieves RSI, it can be used equally easily by the left or right hand, and it does scrolling and multitouch. But the one thing it’s terrible at is actually clicking. Specifically, clicking and dragging to move a window, or to make a selection. And I’m still using the original Magic Trackpad, the one that runs on AA batteries. It has physical switches in its feet, so clicking is a lot harder at its top edge.
Enter the three-finger drag. This Mac accessibility setting lets you drag with three fingers to simulate clicking and dragging, no physical click required. And it does a lot more than just make it easier to move windows around the screen.
For most people, tap and swipe gestures are the perfect way to navigate an iPad. That’s not true for everyone, however, which is why the makers of Skyle, a new eye-tracking system, developed this innovative iPad Pro accessory.
Built with the accessibility audience in mind, the system lets users insert their 12.9-inch iPad Pro into a smart protective case, plug in an eye tracker, and then use a special Skyle app to navigate their iPad with nothing more than well-placed glances.
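Skyle relies on its own dedicated camera hardware and app, so the code below is not its method. But if you’re curious what raw gaze data looks like to a developer, iPhones and iPads with a TrueDepth camera expose an estimate through ARKit’s ARFaceAnchor. A minimal sketch:

```swift
import ARKit

final class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the eyes converge,
            // in the face anchor's coordinate space (meters).
            let gaze = face.lookAtPoint
            print("Gaze offset: x=\(gaze.x), y=\(gaze.y)")
        }
    }
}
```

Turning a stream of estimates like that into reliable, fatigue-free pointer control is the hard part, which is where dedicated trackers like Skyle earn their keep.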
By now, you already know how to customize the regular stuff on your AirPods and AirPods Pro. You just find them in the list of connected Bluetooth gadgets, and tap the i button to see a list of handy settings. But what about deeper-level customization? Like most things in iOS, there’s an extra set of advanced AirPods Pro settings inside the Accessibility settings. You can even change the double-squeeze speed of the AirPods Pro stems if you want to slow things down.