Apple’s newly announced Personal Voice technology enables an iPhone to read text in the user’s own voice. The same tech could be used to read incoming text messages in the sender’s own voice, making them feel more personal.
This isn’t just a theory: Apple filed a patent application for exactly this idea in early 2023.
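There is a developer-facing side to this, too. In iOS 17, Apple exposes Personal Voice through AVFoundation, so an app could, with the user’s permission, speak an incoming message in a stored personal voice. Here’s a minimal sketch of how that might look (the helper name `speakInPersonalVoice` is mine; the AVFoundation calls are from Apple’s public API):

```swift
import AVFoundation

// Keep a strong reference: a synthesizer that deallocates
// mid-utterance stops speaking.
let synthesizer = AVSpeechSynthesizer()

// Ask the user for Personal Voice access (iOS 17+ / macOS 14+),
// then speak a message in their own voice.
func speakInPersonalVoice(_ message: String) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }
        // Personal voices carry the .isPersonalVoice trait.
        let personal = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }
        guard let voice = personal else { return }
        let utterance = AVSpeechUtterance(string: message)
        utterance.voice = voice
        synthesizer.speak(utterance)
    }
}
```

The patent’s message-reading idea would layer on top of this: the sender’s device would share a voice model, and the receiver’s Messages app would pick that voice instead of the local one.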
Just can’t wait for WWDC23? Apparently, Apple can’t either, because it’s already started announcing new features that will almost certainly be in iOS 17, iPadOS 17 and macOS 14.
They are aimed at users with disabilities and include Live Speech and Personal Voice, which will let people with speech disabilities participate in conversations using a synthesized voice that sounds like their own.
You can use a feature called Guided Access to lock down your iPhone to a single app before you hand it to a kid or someone else. You might want to let your offspring play a game, or pass your phone around for controlling music, or hand it off to show someone a video … but you probably don’t want them going rogue and reading your texts or calling your mom.
In Accessibility settings, you can enable Guided Access to limit your iPhone to a single app before you hand it off. Think of it as a quick-and-dirty “guest mode.”
This will help you keep your phone — and your privacy — safe. You can even disable features like the volume buttons and set up time limits.
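Guided Access has a programmatic cousin, too: apps on supervised (MDM-managed) devices can ask the system to lock into Single App Mode through UIKit. A rough sketch, with a helper name of my own choosing (on a normal, unsupervised iPhone the request simply fails):

```swift
import UIKit

// Ask iOS to lock the device into the current app.
// Only succeeds on supervised devices; otherwise the
// completion handler reports failure.
func lockToThisApp() {
    UIAccessibility.requestGuidedAccessSession(enabled: true) { success in
        print(success ? "Locked to this app" : "Could not start session")
    }
}
```

For everyday use, though, the triple-click shortcut described above is all you need.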
It’s easy to zoom in on your Mac display and get a closer look at your screen. If the text is just too small to read, or perhaps you’re making some graphics and you need pixel-perfect alignment, a simple tweak to your Mac settings is all you need.
Using your Mac’s Zoom feature, you can hit a keyboard shortcut or use a multitouch gesture to zoom in on your screen. I’ll show you how to use this handy feature. Plus, I’ll cover Hover Text and Display Scaling, two more features that help you embiggen the words on your Mac screen.
Working in an office or in the city, you’re probably inundated with noise from people chattering, cars running and nearby music. Your iPhone has a built-in feature called Background Sounds for playing rain noises or white noise to tune it all out.
You don’t need to download any apps or pay a cent; it comes for free on your Mac and iPhone. Let me show you how it works.
Live Captions, introduced in iOS 16, generates subtitles for any audio playing in any app on your iPhone. Powered by the Neural Engine in Apple’s custom silicon, the ability to turn speech from music and videos into real-time text is a boon to many users in many different situations.
If you’re hard of hearing, for instance, the ability to see instant captions on the screen is a game changer. And if you’re sitting in bed late at night without headphones while your partner sleeps, or in any other situation where you don’t want to make noise, like on the bus or in an office, you can turn on Live Captions to get subtitles.
The applications are endless and exciting. Here’s how to use Live Captions in iOS 16.
Apple’s next big thing might not be a car or an AR headset. Thanks to a rule change announced this week by the Food and Drug Administration, Cupertino could soon add hearing aids to its product lineup. The potential market is huge, and Apple stands uniquely positioned to disrupt the status quo.
The new rules allow companies like Apple to sell hearing aids over the counter and online, so buyers can set them up in the comfort of their own homes. Previously, if you wanted to buy hearing aids, your only option was to make an appointment for a hearing test and fitting at a specialist store.
Apple’s Door Detection uses advancements in hardware, software and machine learning to help people who are blind or have low vision use their iPhone and iPad to navigate the last few feet to their destination.
Door Detection is one of several innovative software features unveiled Tuesday that offer new ways for users with disabilities to get things done. These include Live Captions, Apple Watch Mirroring and more.
The other day I was walking with music blasting through my AirPods when I almost stepped in front of a speeding ambulance.
Luckily, the Sound Recognition feature on my iPhone was turned on, and it recognized the wailing siren. My AirPods silenced the music and piped the siren into my ears instead, saving my bacon. It felt like magic.
Your iPhone can also listen for crying babies, running water, knocks on the door, barking dogs and more, and alert you when it hears them.
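Under the hood, this is on-device sound classification. Developers can tap a similar capability through Apple’s SoundAnalysis framework, which ships a built-in classifier trained on hundreds of sounds, sirens and barking dogs included. A rough sketch of wiring it to the microphone (the observer class and `startListening` helper are my own names; the framework calls are Apple’s):

```swift
import AVFoundation
import SoundAnalysis

// Reports any sound the built-in classifier is confident about.
final class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier)")
    }
}

// Hold references for the lifetime of listening; the analyzer
// does not retain its observer for you.
let observer = SoundObserver()
var analyzer: SNAudioStreamAnalyzer?

// Attach Apple's built-in classifier to a live audio stream.
func startListening(engine: AVAudioEngine) throws {
    let format = engine.inputNode.outputFormat(forBus: 0)
    let streamAnalyzer = SNAudioStreamAnalyzer(format: format)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    try streamAnalyzer.add(request, withObserver: observer)
    analyzer = streamAnalyzer
    engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
        streamAnalyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }
    try engine.start()
}
```

The consumer-facing Sound Recognition setting does all of this for you, system-wide, with no code required.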
The iPhone is renowned for its many accessibility features. Accessibility settings can make text on the screen bigger, buttons easier to identify, animations less jarring and sound easier to hear.
An accessibility feature that is useful for everyone is Spoken Content, which lets your phone read aloud anything on your screen. The feature was designed for people who have trouble reading small text, but it comes in handy in lots of situations even if you don’t.
You can have recipes read to you while your hands are busy cooking, quickly hear how to pronounce a word you don’t know — that’s what I use it for most of all — and more. You can even hear what you’re typing as you write.
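The same text-to-speech engine behind Spoken Content is available to any app through AVFoundation, which gives a sense of what the feature is doing under the hood. A minimal sketch (the `speak` helper is mine), good for exactly the pronunciation trick mentioned above:

```swift
import AVFoundation

// Keep the synthesizer alive so speech isn't cut off.
let speaker = AVSpeechSynthesizer()

// Read any string aloud, roughly what Spoken Content does with
// the text it pulls from the screen.
func speak(_ text: String, language: String = "en-US") {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: language)
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    speaker.speak(utterance)
}

// Hear how a tricky word is pronounced.
// speak("quinoa")
```

Spoken Content wraps all of this in a system-wide gesture: swipe down with two fingers from the top of the screen and the phone reads whatever is showing.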