Apple released iOS 16, the newest public version of its flagship mobile operating system, on September 12, 2022. The update introduces a number of enhancements for iPhone owners, with a strong emphasis on accessibility.
For Apple, this isn’t a new approach: The company has focused on building its brand presence as an accessibility leader, and we’ve written other articles about the value of Apple’s inclusive messaging.
However, the additions in iOS 16 are still notable, as they hint at where assistive technology (AT) is headed. Here’s an overview of three major improvements.
1. iOS 16’s Door Detection makes navigation more intuitive
According to Apple, Door Detection uses a combination of LiDAR (Light Detection and Ranging) and the user’s camera to identify and describe doors, including whether the door is open or closed. If the door is closed, Door Detection indicates whether it can be opened by turning a knob or pushing.
“Door Detection can help users locate a door upon arriving at a new destination, understand how far they are from it, and describe door attributes,” Apple explained in a press release. “[...] Door Detection, along with People Detection and Image Descriptions, can each be used alone or simultaneously in Detection Mode, offering users with vision disabilities a go-to place with customizable tools to help navigate and access rich descriptions of their surroundings.”
The feature is available in Detection Mode, a new addition to Magnifier, a built-in app intended to support users with vision-related disabilities. Because it relies on LiDAR, Door Detection only works on iPhone and iPad models equipped with a LiDAR scanner.
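Apple doesn’t expose Door Detection itself to third-party developers, but apps that build similar LiDAR-based guidance face the same hardware constraint. Here’s a minimal Swift sketch of how an app might gate such a feature, using ARKit’s scene-reconstruction support as a stand-in for “this device has a LiDAR scanner” (the function and messaging are our illustration, not Apple’s implementation):

```swift
import ARKit

// LiDAR-equipped iPhones and iPads are the only devices that support
// ARKit scene reconstruction, so this check is a common proxy for
// "does this device have a LiDAR scanner?"
var deviceHasLiDAR: Bool {
    ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh)
}

// Hypothetical entry point for a LiDAR-dependent detection feature.
func startDetection(session: ARSession) {
    guard deviceHasLiDAR else {
        // Fall back gracefully, much as Magnifier hides Detection Mode
        // on unsupported hardware.
        print("This feature requires an iPhone or iPad with a LiDAR scanner.")
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh // build a 3D mesh of nearby surfaces
    session.run(configuration)
}
```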
Related: Why It's Important That Apple's Software Updates Were 'Designed for People with Disabilities'
2. Apple Live Captions convert any audio content to text
iOS 16 finally matches the live caption capabilities of Google’s Android operating system. Live Captions convert any audio content to text, including audio from phone calls, streaming media, FaceTime, and more.
“Live Captions in FaceTime attribute auto-transcribed dialogue to call participants, so group video calls become even more convenient for users with hearing disabilities,” the company wrote.
“When Live Captions are used for calls on Mac, users have the option to type a response and have it spoken aloud in real time to others who are part of the conversation. And because Live Captions are generated on device, user information stays private and secure.”
Automatic captions are useful, but no substitute for pre-written captions
Of course, automated captions and transcriptions aren’t perfect: One study found that YouTube’s automatic captions are only about 60-70% accurate, and background noise can reduce that accuracy significantly.
We tried Apple’s Live Captions and found that they worked well, but no automated tool can transcribe natural speech with 100% accuracy. For pre-recorded media, creators still have an obligation to provide accurate captions and transcripts. By doing so, businesses can fulfill an important requirement of the Web Content Accessibility Guidelines (WCAG), the international standards for digital accessibility.
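For context, Apple platforms treat authored captions as a “legible” media selection that apps can surface at playback time; the caption track still has to exist in the asset. Here’s a minimal sketch, assuming the media ships with an authored caption track (the function name is ours for illustration):

```swift
import AVFoundation
import UIKit

// Sketch: if the user has enabled closed captions system-wide, select
// the first authored caption track bundled with the asset.
func applyAuthoredCaptions(to playerItem: AVPlayerItem) {
    // Authored subtitle and caption tracks live in the "legible" group.
    guard let legibleGroup = playerItem.asset.mediaSelectionGroup(
        forMediaCharacteristic: .legible
    ) else { return }

    if UIAccessibility.isClosedCaptioningEnabled,
       let captionOption = legibleGroup.options.first {
        playerItem.select(captionOption, in: legibleGroup)
    }
}
```

In practice, AVPlayer’s automatic media selection honors the system caption preference by default; the point is that an accurate, human-authored track has to be there for any of this to help.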
With that said, live captions are still an important feature for many Deaf and hard-of-hearing people. And like many accessibility improvements, automatic captions can benefit everyone: if you’ve ever needed to take a call on a crowded subway without your AirPods, you can understand the appeal.
Related: Apple’s Siri Changed Accessibility, But No Voice Assistant is Perfect
3. iOS 16 improves accessibility features for the Apple Watch
Earlier this year, Apple improved the capabilities of the Apple Watch by introducing new customizable gesture controls. iOS 16 expands on these features with Apple Watch Mirroring, which allows users to control their watch remotely from a paired iPhone.
“With Apple Watch Mirroring, users can control Apple Watch using iPhone’s assistive features like Voice Control and Switch Control, and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display,” the company wrote.
When enabled, Watch Mirroring presents a high-resolution AirPlay view of the Apple Watch display, which may be more intuitive for people with mobility or vision-related disabilities.
“Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to help ensure users who rely on these mobility features can benefit from unique Apple Watch apps like Blood Oxygen, Heart Rate, Mindfulness, and more.”
Related: Don't Overlook iOS and Android Testing for Accessibility
As assistive tech improves, every organization should make a commitment to accessibility
Apple’s approach demonstrates the enormous benefits of an accessibility-first mindset: When accessibility is a core principle, brands showcase their values — and users pay attention.
But while the iOS 16 accessibility features are excellent improvements, assistive technologies still rely on well-designed content to work predictably. Put simply, if your website has low-contrast text, missing text alternatives, or other accessibility issues, you’re introducing barriers for people with disabilities (even if they’re using the latest iPhone).
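The same principle applies inside native apps: assistive features can only describe what creators have labeled, whether that’s an img element’s alt text on the web or an accessibility label in an app. A minimal UIKit sketch (the asset name and label are hypothetical):

```swift
import UIKit

// Without an accessibility label, VoiceOver has nothing meaningful to
// announce for this image; with one, the text alternative travels with
// the view.
let storefrontImage = UIImageView(image: UIImage(named: "storefront")) // hypothetical asset
storefrontImage.isAccessibilityElement = true
storefrontImage.accessibilityLabel = "Shop entrance with the front door propped open"
```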
By incorporating the principles of inclusive design early in the product development lifecycle, you can reach a wider audience and provide all users with a more consistent experience.
To learn more, download our free eBook, Developing the Accessibility Mindset.