Apple showcases amazing new accessibility features like Eye Tracking


Apple accessibility features
With Eye Tracking, a user can navigate iPhone or iPad using just their eyes.
Photo: Apple

Apple showcased some remarkable new accessibility features for people with disabilities Wednesday, including Eye Tracking, Music Haptics and Vocal Shortcuts.

The new features, coming later this year to Apple devices, harness Apple silicon, artificial intelligence and machine learning. They will come mainly to iPhone and iPad, though some new ones will appear in Vision Pro’s visionOS, too.

“We believe deeply in the transformative power of innovation to enrich lives,” said Apple CEO Tim Cook in a press release. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

New Apple accessibility features include Eye Tracking, Music Haptics and Vocal Shortcuts

Apple highlighted the upcoming accessibility features, including Eye Tracking, Music Haptics and Vocal Shortcuts, ahead of Thursday’s Global Accessibility Awareness Day.

Eye Tracking, currently associated with the Vision Pro headset, is a way for users with physical disabilities to control iPad or iPhone with their eyes. Music Haptics offers a new way for deaf and hearing-impaired users to experience music using the Taptic Engine in iPhone. And Vocal Shortcuts lets users perform tasks by making a custom sound.

In addition, Vehicle Motion Cues can help reduce motion sickness when using iPhone or iPad in a moving vehicle, Apple said. And more accessibility features will come to visionOS.

“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

iPad and iPhone will get Eye Tracking feature

Apple accessibility features include Eye Tracking
The new Eye Tracking feature should be pretty amazing.
Photo: Apple

Imagine navigating your iPad or iPhone with just your eyes. That’s what Eye Tracking enables. Powered by AI, it uses the front-facing camera for easy setup and calibration. And because it relies on on-device machine learning, any information used in setup isn’t shared with Apple.

Eye Tracking works across iPadOS and iOS apps and needs no other hardware or accessories. It lets users “navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes and other gestures solely with their eyes,” Apple said.

Music Haptics makes songs more accessible


Music Haptics brings a new way for users who are deaf or hard of hearing to experience music on iPhone. Here’s Apple’s description:

With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations to the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog, and will be available as an API for developers to make music more accessible in their apps.

New speech features including Vocal Shortcuts

Apple accessibility features - Vocal Shortcuts
Vocal Shortcuts help you launch tasks by making custom sounds.
Photo: Apple

Vocal Shortcuts lets users assign “custom utterances” on iPhone and iPad that Siri can understand to launch shortcuts and complete tasks.

It’s enhanced by another new feature, Listen for Atypical Speech:

Listen for Atypical Speech … gives users an option for enhancing speech recognition for a wider range of speech. Listen for Atypical Speech uses on-device machine learning to recognize user speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.

“Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers,” said Mark Hasegawa-Johnson, principal investigator for the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign.

“The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who made the Speech Accessibility Project possible,” he added.

Vehicle Motion Cues may reduce motion sickness


If you get carsick, you know what a drag it can be. Apple means to curb passengers’ motion sickness with Vehicle Motion Cues, a new experience for iPhone and iPad.

Here’s Apple’s description of how it works:

Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using iPhone or iPad while riding in a moving vehicle. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content. Using sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to show automatically on iPhone, or can be turned on and off in Control Center.

CarPlay Voice Control and other updates

Apple accessibility features - CarPlay
CarPlay adds Sounds Recognition, which alerts drivers to sounds like sirens.
Photo: Apple

Accessibility features are also coming to CarPlay:

  • Voice Control. It lets users navigate CarPlay and control apps with just their voice.
  • Sound Recognition. Drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens.
  • Color Filters. They make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text and Large Text.

Accessibility features coming to visionOS

Live Captions in visionOS
visionOS features Live Captions for deaf and hard-of-hearing folks.
Photo: Apple

Apple also pointed out new accessibility features in the Vision Pro headset’s visionOS.

“Apple Vision Pro is without a doubt the most accessible technology I’ve ever used,” said Ryan Hudson-Peralta, a Detroit-based product designer, accessibility consultant and cofounder of Equal Accessibility LLC. “As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it’s been incredible to see that visionOS just works. It’s a testament to the power and importance of accessible and inclusive design.”

Here are new accessibility features in visionOS:

Live Captions in visionOS “help everyone — including users who are deaf or hard of hearing — follow along with spoken dialogue in live conversations and in audio from apps,” Apple said.

With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona.

Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors.

Updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.

The Live Captions experience in visionOS is shown from an Apple Vision Pro user’s point of view.

Read more about additional accessibility updates and find out how you can help Apple celebrate Global Accessibility Awareness Day.

Source: Apple
