For over a decade, Apple’s Siri has provided people with new ways to interact with their digital devices. Siri is a voice assistant — by saying, “Hey, Siri,” users can issue commands, ask questions, and control some desktop and mobile applications.
Many people who live with disabilities say that Siri is a vital accessibility feature, and other brands have introduced competing voice assistants (notably, Google Assistant, Amazon Alexa, and Microsoft Cortana). These tools can have enormous benefits for people with a wide range of disabilities. For example:
- People with certain neurocognitive differences can ask questions without judgment. Asking other people questions like “what day is today?” or “what time is it?” can be difficult, and human responses aren’t always helpful; voice assistants provide accurate answers immediately.
- People can perform basic tasks quickly. If a person has trouble typing, Siri can save time by sending text messages, scheduling appointments, or ordering food.
- People can set automatic reminders. Voice assistants can tell users when to take medications, schedule physical therapy, or call their friends — these automatic reminders can be extremely useful for people with memory impairments.
But while Apple has consistently prioritized accessibility, Siri — and other voice assistants — aren’t perfect. Siri struggles to recognize some types of voices. For example, if a person speaks slowly due to a condition like Parkinson’s, the voice assistant might respond with an error message before the user finishes their sentence. Siri also speaks at a slow cadence, and people who regularly use screen readers may want to speed up the conversation.
While Siri improved experiences for many people, it was limited as an accessibility tool
Since 2011, Apple has introduced regular updates to improve Siri’s voice recognition and to expand its capabilities. Initially, the software had trouble understanding certain accents and voice commands — and while the tool improved accessibility for some users, it created new frustrations for others.
In an article for Mic, writer Jamison Hill, who has physical disabilities, notes that Siri changed the way he used his smartphone. He used the voice assistant to play music, call family members, and even to research his medical condition.
However, Hill has chronic fatigue syndrome, a multi-system disease that eventually affected his ability to speak. When his speech became limited, Siri stopped working for him.
“The problem: I can't say ‘Hey, Siri,’” Hill writes. “I can't, for that matter, say anything loudly enough for Siri, or any other virtual assistants, to register my voice. Every time I try to say something, Siri replies with ‘Sorry, I'm not sure what you said.’ And as sincere as I believe that sentiment to be, it doesn't solve my problem.”
In iOS 11, Apple updated Siri to support typed input. Since the update, users can activate Siri and type their requests instead of speaking them, so users who cannot speak have the same access to the voice assistant.
Voice assistants continue to improve, but no technology can address the needs of every user
While Siri continues to improve, some disability advocates say that the tool’s newest features are aimed primarily at users outside the disability community. And although Apple is a leader in digital accessibility, Siri lacks some of the functionality of competing voice assistants, including certain customization options.
“The main purpose of a virtual assistant, as I see it, is to make the user's life easier,” Hill writes. “For me, that would mean knowing I can't speak and doing its best to accommodate my disability so I can still perform the tasks I need to complete each day. The perfect virtual assistant would know my patterns, the tasks I do every day, when I do them and that I can't vocalize my instructions.”
Ultimately, Siri is a useful tool — but like any assistive technology, there’s room for improvement. The limitations of voice assistants showcase one of the fundamental ideas of digital accessibility: No technology works perfectly for everyone.
To reach the widest possible audience, developers need to consider the real-life experiences of users at every stage. That might mean adding new features (such as Siri’s typed input) to give people more options, or it might mean avoiding design decisions that rely on a single type of sensory perception.
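As a simple illustration (not drawn from Siri itself), the hypothetical SwiftUI view below sketches what that second idea can look like in practice: a sync indicator that conveys its state through text and a screen-reader label as well as color, so the information never depends on one sensory channel. The view and property names are invented for this example.

```swift
import SwiftUI

// Hypothetical status indicator, for illustration only.
// The state is conveyed by text and an accessibility label,
// not by color alone, so it remains usable with VoiceOver and
// for people who can't perceive the color difference.
struct SyncStatusView: View {
    let isSynced: Bool

    var body: some View {
        HStack(spacing: 8) {
            Circle()
                .fill(isSynced ? Color.green : Color.red)
                .frame(width: 12, height: 12)
            // The text carries the same information as the color.
            Text(isSynced ? "Synced" : "Not synced")
        }
        // Combine the children into one element so a screen reader
        // announces a single, clear label instead of a decorative shape.
        .accessibilityElement(children: .ignore)
        .accessibilityLabel(isSynced ? "Synced" : "Not synced")
    }
}
```

The specifics matter less than the principle: every piece of information the design communicates visually is also available to users who interact with the app in a different way.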
Apple's Siri helped to highlight the importance of digital accessibility
For people outside of the assistive technology community, it’s important to remember that everyone shares responsibility for making digital products accessible. Even when assistive technologies work perfectly, users may encounter barriers in mobile apps and websites that aren’t designed with digital accessibility best practices in mind.
And if you’re not sure whether digital accessibility is an important priority, just ask Siri. Or, rather, ask Susan Bennett, who provided the voice of Siri and has become an advocate for digital accessibility.
“Accessibility is one of those things that we all need to care a little more about,” Bennett said in 2021. “There are millions of people in the world struggling to make it through the day because of their physical impairments. It’s up to all of us to make sure that doesn’t happen anymore.”