Let’s say that you’re preparing to launch a website for a Fortune 500 company. You’ve performed a single test of the website using your desktop computer and Google Chrome. Are you ready to launch?
Of course not — if you haven’t tested your code with a variety of user agents and viewport sizes, you have no idea whether the content will be displayed properly. Testing is arguably the most important phase of product development.
Why, then, would you trust a single assistive technology (AT) test to verify whether your content is accessible?
Screen readers (software that outputs text as audio or braille) are essential AT for many web users who have vision disabilities. While digital accessibility isn’t solely focused on the experiences of screen reader users, screen reader testing is important for earning, maintaining, and proving digital compliance.
But just as web browsers have different features and capabilities, major screen readers like JAWS (Job Access With Speech) and NVDA (NonVisual Desktop Access) provide fundamentally different experiences for their users.
Practically speaking, screen readers are not user agents. They're software that works on top of the web browser, interpreting the data that the browser exposes (typically through platform accessibility APIs) to present the website to the user in an understandable way.
Popular screen readers like JAWS and NVDA use different techniques to interpret HTML, CSS (Cascading Style Sheets), JavaScript, and WAI-ARIA (Web Accessibility Initiative - Accessible Rich Internet Applications). Since web browsers also interpret markup differently, a screen reader user may hear something different when accessing the same website with different browsers.
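For example, consider a toggle button that exposes its state through ARIA. The snippet below is a minimal sketch: the aria-pressed attribute is standard, but the exact spoken output ("toggle button, pressed," "Mute notifications, pressed," and so on) depends on the screen reader and browser pairing.

```html
<!-- A standard ARIA state attribute on a native button.
     Every major screen reader supports aria-pressed, but each
     one phrases the role, name, and state differently, and the
     same screen reader may phrase it differently across browsers. -->
<button type="button" aria-pressed="true">Mute notifications</button>
```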
Those differences can be important. In an analysis from website mapping tool PowerMapper, NVDA interpreted ARIA markup reliably in about 93% of tests when used with Mozilla Firefox.
However, NVDA passed only 83% of those tests when used with Google Chrome. Other screen readers such as JAWS and Apple VoiceOver also showed varying levels of reliability when used with different web browsers.
Output may also change depending on the user’s preferences and settings. For example, some users may prefer to hear less punctuation. Some may choose to hear detailed structural information about the elements on a page or inside a document, while others may prefer to hear as little of this information as possible.
Related: 5 Myths About Screen Readers That Can Hurt Accessibility
All of this means that if you access your website with a screen reader, you might encounter a bug that doesn't exist with a different screen reader or another web browser. That bug may need to be fixed, but maybe not: you'll need to review your markup to determine whether you've made a mistake.
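As an illustration, one pattern that is known to behave inconsistently is aria-label on a generic element. This is a sketch of the kind of review you might do, assuming the element is meant to act as a button:

```html
<!-- Unreliable: aria-label on a <div> with no role. ARIA does
     not require support for labels on generic elements, so some
     screen reader/browser pairs announce "Close" and others
     ignore the label entirely. -->
<div aria-label="Close">X</div>

<!-- More reliable: a native button exposes the button role, and
     the accessible name is consistently announced. -->
<button type="button" aria-label="Close">X</button>
```

In the first case, the "bug" isn't in any one screen reader; the markup itself makes the output unpredictable.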
This can be frustrating for web developers with limited screen reader experience (and it’s frequently frustrating for screen reader users, too).
You can avoid serious screen reader accessibility issues by following the Web Content Accessibility Guidelines (WCAG) and the WAI-ARIA Authoring Practices Guide (APG), both of which are published by the World Wide Web Consortium (W3C). By following these established standards, you can make decisions that improve the experiences of screen reader users and maintain compliance with laws like the Americans with Disabilities Act (ADA).
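For instance, WCAG Success Criterion 4.1.2 (Name, Role, Value) requires form controls to have programmatic names. A minimal sketch of what that looks like for an email field:

```html
<!-- A programmatically associated <label> gives the input an
     accessible name, so screen readers announce "Email address"
     when the field receives focus. -->
<label for="email">Email address</label>
<input id="email" name="email" type="email" autocomplete="email">
```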
Even so, regular screen reader testing is essential. If you don’t test your website thoroughly, you can’t find and fix the issues that affect your users.
Related: How Screen Readers Are Used in Accessibility Testing
A comprehensive accessibility testing strategy should include screen reader testing performed by people who have experience with tools like NVDA and JAWS. While web developers can (and should) build their familiarity with these tools, regular screen reader users may have a deeper contextual understanding of how AT works. They can also recommend the ideal fixes for each accessibility issue.
At the Bureau of Internet Accessibility, we use a four-point hybrid testing methodology that combines powerful artificial intelligence with manual tests performed by people with vision-related disabilities. Our testers hold certifications in JAWS and NVDA, and each expert has at least five years of experience in accessibility testing.
To learn more about our testing process, read about our WCAG 2.1 Level AA website audits or contact us to discuss your goals.