Apple's new accessibility features are set to make technology more inclusive for users with disabilities.
The Cupertino-based company says that these accessibility updates draw on advances in hardware and software, include on-device machine learning to ensure user privacy, and expand on Apple’s long-standing commitment to making products for everyone. Let’s take a look at what these features are.
The new accessibility features are headlined by Assistive Access. “It distills apps and experiences to their essential features in order to lighten cognitive load,” says Apple. It includes a customised experience for Phone and FaceTime, which have been combined into a single Calls app, as well as for Messages, Camera, Photos, and Music. The feature offers a distinct interface with high-contrast buttons and large text labels, along with tools that help trusted supporters tailor the experience for the individual they support. Users can choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for those who prefer text.
Live Speech is another significant new feature. With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken out loud during phone and FaceTime calls as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during conversations with family, friends, and colleagues. Live Speech has been designed to support the millions of people globally who are unable to speak or who have lost their speech over time. Users can also create a Personal Voice by reading along with a randomised set of text prompts to record 15 minutes of audio on iPhone or iPad.
Detection Mode in Magnifier introduces Point and Speak for users who are blind or have low vision, allowing them to interact with physical objects that have multiple text labels. For example, while using a household appliance such as a microwave, Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad. Point and Speak is built into the Magnifier app on iPhone and iPad and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.