Apple previews new accessibility features: Door Detection, Apple Watch Mirroring, and Live Captions
Apple's latest innovative software features introduce new ways for users with disabilities to navigate, connect, and get the most out of Apple products. These powerful updates combine the company’s latest technologies to deliver unique and customisable tools for users, and build on Apple’s long-standing commitment to making products that work for everyone.
Door Detection for Users Who Are Blind or Low Vision
Door Detection is made for users who are blind or low vision and can help them locate a door upon arriving at a new destination, understand how far they are from it, and describe door attributes — including if it is open or closed, and when it’s closed, whether it can be opened by pushing, turning a knob, or pulling a handle. Door Detection can also read signs and symbols around the door, like the room number at an office, or the presence of an accessible entrance symbol. This new feature combines the power of LiDAR, camera, and on-device machine learning, and will be available on iPhone and iPad models with the LiDAR Scanner.
Door Detection will be available in a new Detection Mode within Magnifier, Apple’s built-in app supporting blind and low vision users. Door Detection, along with People Detection and Image Descriptions, can each be used alone or simultaneously in Detection Mode, offering users with vision disabilities a go-to place with customisable tools to help navigate and access rich descriptions of their surroundings. In addition to navigation tools within Magnifier, Apple Maps will offer sound and haptics feedback for VoiceOver users to identify the starting point for walking directions.
Advancing Physical and Motor Accessibility for Apple Watch
Apple Watch Mirroring helps people with physical and motor disabilities control Apple Watch remotely from their paired iPhone. With Apple Watch Mirroring, users can control Apple Watch using iPhone’s assistive features like Voice Control and Switch Control, and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display. Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to help ensure users who rely on these mobility features can benefit from unique Apple Watch apps like Blood Oxygen, Heart Rate, Mindfulness, and more.
In addition, users can control Apple Watch with simple hand gestures. With new Quick Actions on Apple Watch, a double-pinch gesture can answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout. This builds on the technology used in AssistiveTouch on Apple Watch, which gives users with upper body limb differences the option to control Apple Watch with gestures like a pinch or a clench without having to tap the display.
Live Captions Come to iPhone, iPad, and Mac for Deaf and Hard of Hearing Users
For the Deaf and hard of hearing community, Apple is introducing Live Captions on iPhone, iPad, and Mac. Users can follow along more easily with any audio content — whether they are on a phone or FaceTime call, using a video conferencing or social media app, streaming media content, or having a conversation with someone next to them.
Users can also adjust font size for ease of reading. Live Captions in FaceTime attribute auto-transcribed dialogue to call participants, so group video calls become even more convenient for users with hearing disabilities. When Live Captions are used for calls on Mac, users have the option to type a response and have it spoken aloud in real time to others who are part of the conversation. And because Live Captions are generated on device, user information stays private and secure.