Apple develops Door Detection for blind or low vision iPhone users | Cult of Mac

Apple develops Door Detection for blind or low vision iPhone users

Door Detection can find and verbally describe a door to an iPhone or iPad user.
Screenshot: Apple

Apple’s Door Detection uses advancements in hardware, software, and machine learning to help people who are blind or low vision use their iPhone and iPad to navigate the last few feet to their destination.

This is one of several innovative software features unveiled Tuesday that offer new capabilities for users with disabilities. These include Live Captions, Apple Watch Mirroring and more.

Apple is committed to accessibility

“Apple embeds accessibility into every aspect of our work, and we are committed to designing the best products and services for everyone,” said Sarah Herrlinger, Apple’s senior director of Accessibility Policy and Initiatives. “We’re excited to introduce these new features, which combine innovation and creativity from teams across Apple to give users more options to use our products in ways that best suit their needs and lives.”

The announcements are part of Apple’s observation of Global Accessibility Awareness Day on May 19.

Find your destination with Door Detection

Finding the way into a retail store is easier for blind or low vision iPhone users with Door Detection.

Apple says its Door Detection feature can help users locate a door, tell them how far away it is, and even describe the door.

That includes whether it is open or closed and, if closed, whether it opens by pushing, turning a knob or pulling a handle.

Door Detection can also read signs and symbols around the door, like the room number at an office.

The upcoming feature uses LiDAR and the mobile device’s camera, and will be available on iPhone and iPad models with a built-in LiDAR Scanner.

Door Detection will be part of a new Detection Mode within the Magnifier app, along with People Detection and Image Descriptions.

Live Captions come to iPhone, iPad and Mac

Apple is introducing Live Captions to a range of its devices. The goal is for Deaf and hard of hearing users to follow any audio content more easily, whether it’s a phone or FaceTime call, streaming media content or a conversation with someone next to them.

Captions will be generated on the device, so they stay private.

Apple Watch Mirroring

With Apple Watch Mirroring, users can control an Apple Watch remotely from their paired iPhone. They can operate the wearable with the iPhone's assistive features, such as voice commands, sound actions or head tracking, as alternatives to tapping the Apple Watch display.

“Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to help ensure users who rely on these mobility features can benefit from unique Apple Watch apps like Blood Oxygen, Heart Rate, Mindfulness, and more,” according to Apple.

More languages for VoiceOver

VoiceOver, Apple’s screen reader, is adding support for more than 20 additional locales and languages, including Bengali, Bulgarian, Catalan, Ukrainian and Vietnamese. Users can also select from dozens of new voices.

These new languages, locales, and voices will also be available for Speak Selection and Speak Screen.

Source: Apple