AssistiveTouch lets users control Apple Watch by clenching a fist

AssistiveTouch lets users control Apple Watch by clenching their fists.
Photo: Apple

Apple plans to release software updates this year that will make its devices far easier to use for people with mobility, vision, hearing and cognitive disabilities.

The features include AssistiveTouch for Apple Watch, which offers astonishing new ways for people with limited mobility to control the smartwatch without tapping its screen. The new feature uses Apple Watch’s array of sensors to translate the wearer’s movements into interactions.

Cupertino showcased AssistiveTouch for Apple Watch — which lets users maneuver a cursor on the wearable’s screen simply by clenching their fist and pinching their fingers together, among other things — in a remarkable video. (We embedded the video below — definitely watch it.)

But AssistiveTouch for Apple Watch is just the beginning of Apple’s latest big push into accessibility.

“At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make,” said Sarah Herrlinger, Apple’s senior director of global accessibility policy and initiatives, in a press release Wednesday. “With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people — and we can’t wait to share them with our users.”

The new features unveiled Wednesday join AssistiveTouch (previously available on iPhone and iPad), VoiceOver and other innovations designed to help people with disabilities get the most out of their Apple devices — and their lives. Apple showcases these features on the recently revamped Accessibility section of its website.

AssistiveTouch for Apple Watch

Apple described the wild new AssistiveTouch features coming to Apple Watch like this:

To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls. Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.

And here’s that astonishing video that showcases the new Apple Watch AssistiveTouch interactions:

The video provides a vivid reminder of Apple’s enduring knack for coming up with novel user interfaces.
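Apple hasn’t published how the gesture recognition actually works beyond the sensor list above, so here’s a purely hypothetical sketch of the general idea: classifying a short window of wrist-sensor readings as a pinch or a clench. The thresholds and sensor interpretations below are invented for illustration; a real implementation would rely on a trained on-device machine-learning model, not hand-tuned cutoffs.

```python
# Hypothetical illustration only -- Apple has not documented how
# AssistiveTouch classifies gestures. All names and thresholds are invented.

def _mean(xs):
    return sum(xs) / len(xs)

def classify_gesture(accel_window, optical_window):
    """Classify one window of readings as 'clench', 'pinch', or None.

    accel_window: accelerometer magnitudes (in g) over ~0.5 seconds
    optical_window: normalized optical heart-rate sensor readings, whose
        variance here loosely stands in for tendon/muscle activity
    """
    peak = max(abs(a) for a in accel_window)
    m = _mean(optical_window)
    variance = _mean([(x - m) ** 2 for x in optical_window])
    if peak > 1.5 and variance > 0.02:  # strong motion plus tendon activity
        return "clench"
    if peak > 0.6:                      # lighter, quicker movement
        return "pinch"
    return None
```

The point of the sketch is simply that combining motion data with a second signal (here, optical-sensor variance) lets the classifier separate gestures that look similar on the accelerometer alone.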

Other new assistive tech from Apple

SignTime is just one of the new features for deaf and hard of hearing people.
Screenshots: Apple

SignTime and bi-directional hearing aid support

For the deaf and hard of hearing, Apple’s Made for iPhone (MFi) hearing device program will add support for bi-directional hearing aids.

“The microphones in these new hearing aids enable those who are deaf or hard of hearing to have hands-free phone and FaceTime conversations,” Apple said.

And sign language users can take advantage of a new service launching Thursday called SignTime.

“This enables customers to communicate with AppleCare and Retail Customer Care by using American Sign Language (ASL) in the US, British Sign Language (BSL) in the UK, or French Sign Language (LSF) in France, right in their web browsers,” the company said. “Customers visiting Apple Store locations can also use SignTime to remotely access a sign language interpreter without booking ahead of time. SignTime will initially launch in the US, UK, and France, with plans to expand to additional countries in the future.”

Apple also will add support for recognizing audiograms, the charts generated by hearing tests, to its accessibility features for AirPods and Beats headphones.

“Users can quickly customize their audio with their latest hearing test results imported from a paper or PDF audiogram,” Apple said. “Headphone Accommodations amplify soft sounds and adjust certain frequencies to suit a user’s hearing.”
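Apple hasn’t said exactly how Headphone Accommodations converts an audiogram into audio adjustments, but one classic audiology heuristic, the “half-gain rule,” boosts each frequency band by roughly half the measured hearing loss. The sketch below is an assumption-laden illustration of that idea, not Apple’s method; the function name, cap, and data shape are all invented.

```python
# Hypothetical sketch of mapping an audiogram to per-band gains using the
# half-gain rule. Not Apple's actual Headphone Accommodations algorithm.

def band_gains(audiogram_db, max_gain_db=20.0):
    """Map {frequency_hz: hearing_loss_db} to {frequency_hz: gain_db}.

    Boost each band by half the measured loss, capped at max_gain_db
    to keep output levels safe.
    """
    return {f: min(loss / 2.0, max_gain_db) for f, loss in audiogram_db.items()}
```

For example, a 20 dB loss at 500 Hz would get a 10 dB boost, while a severe 50 dB loss at 2 kHz would be capped at the 20 dB maximum.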

iPad eye-tracking support

Other new accessibility features in the pipeline include iPad support for third-party hardware that allows control of the tablet by tracking the user’s eyes.

“Later this year, compatible MFi devices will track where a person is looking onscreen and the pointer will move to follow the person’s gaze, while extended eye contact performs an action, like a tap,” Apple said.
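The “extended eye contact performs an action” behavior Apple describes is a dwell-to-click pattern common in eye-tracking interfaces. Here’s a minimal, generic sketch of that pattern, assuming a stream of timestamped gaze coordinates; the class name, radius, and dwell time are invented, and this isn’t Apple’s or any MFi vendor’s implementation.

```python
# Generic dwell-to-tap sketch for a gaze pointer. Invented for illustration.
import math

class DwellDetector:
    """Fire a 'tap' when gaze stays within `radius` points for `dwell_s` seconds."""

    def __init__(self, radius=30.0, dwell_s=1.0):
        self.radius = radius
        self.dwell_s = dwell_s
        self.anchor = None   # (time, x, y) where the current fixation began
        self.fired = False   # fire at most once per fixation

    def update(self, t, x, y):
        """Feed one gaze sample; return ('tap', x, y) when a dwell completes."""
        if self.anchor is None or math.hypot(
            x - self.anchor[1], y - self.anchor[2]
        ) > self.radius:
            self.anchor = (t, x, y)  # gaze moved: restart the dwell timer
            self.fired = False
            return None
        if not self.fired and t - self.anchor[0] >= self.dwell_s:
            self.fired = True
            return ("tap", self.anchor[1], self.anchor[2])
        return None
```

The pointer simply follows each sample; the tap fires only after the gaze holds steady inside a small circle for the full dwell interval, then arms again once the gaze moves away.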

VoiceOver upgrade

Apple’s VoiceOver screen reader for blind and low vision users “will get even smarter using on-device intelligence to explore objects within images,” the company said.

“Building on recent updates that brought Image Descriptions to VoiceOver, users can now explore even more details about the people, text, table data, and other objects within images,” Apple said. “Users can navigate a photo of a receipt like a table: by row and column, complete with table headers. VoiceOver can also describe a person’s position along with other objects within images — so people can relive memories in detail, and with Markup, users can add their own image descriptions to personalize family photos.”

And “in support of neurodiversity,” Apple will add new background sounds to iPhone and other devices “to help minimize distractions.” This could help users with cognitive disabilities “focus, stay calm, or rest.”

“Balanced, bright, or dark noise, as well as ocean, rain, or stream sounds continuously play in the background to mask unwanted environmental or external noise, and the sounds mix into or duck under other audio and system sounds,” Apple said.

A new Accessibility Assistant Shortcut will help people “discover Apple’s built-in features and resources for personalizing them.”

New Memojis will depict people with cochlear implants, supplemental oxygen and protective helmets.
Images: Apple

More for Global Accessibility Awareness Day

Apple is also launching “new features, sessions, curated collections, and more” to celebrate Global Accessibility Awareness Day this Thursday.

The rollout includes highlighting content in the App Store, Apple Fitness+, Apple Books and the Apple TV app.

Plus, the Today at Apple program will offer “live, virtual sessions in ASL and BSL throughout the day on May 20 that teach the basics of iPhone and iPad for people with disabilities. In some regions, Today at Apple will offer increased availability of Accessibility sessions in stores, through May 30,” Apple said.

Source: Apple