If you’re building a new web accessibility testing policy, you’ll need to decide how you’ll test your content.
The standards for digital accessibility are well established: The World Wide Web Consortium (W3C) publishes the Web Content Accessibility Guidelines (WCAG), which consist of simple, testable statements called success criteria for auditing websites, mobile apps, and other digital products. In principle, you can read each criterion, then check whether your content meets it.
But if you’re auditing a large website, manually testing your content against each success criterion may be impractical. Automated accessibility testing provides a convenient solution: By using artificial intelligence (A.I.), automated tools can instantly find many of the barriers that affect users with disabilities.
Of course, all automated tools have their limitations. Below, we’ll discuss those limitations — and provide tips for establishing a sustainable, accurate accessibility testing policy.
Automated accessibility tests work well for accessibility issues with simple pass-or-fail rulesets
If an accessibility issue can be defined as “true” or “false,” it’s a good candidate for automated testing.
For example, WCAG requires that text and images of text maintain a color contrast ratio of at least 4.5:1 against the background (for standard-size text; large text has a lower 3:1 requirement). This ensures that people with color vision deficiencies (also called color blindness) and other vision disabilities can read text against its background.
An automated tool can easily determine the color of the text and the color of the background, then compare the contrast ratio against WCAG standards. It’s a simple pass-or-fail ruleset — either the text conforms with WCAG, or it doesn’t.
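The contrast check described above can be expressed directly in code. Here's a minimal sketch in Python that implements the WCAG relative-luminance and contrast-ratio formulas for two hex colors — a real tool would also need to resolve the actual rendered colors from CSS, which is the harder part:

```python
def _channel(c: int) -> float:
    # Linearize an 8-bit sRGB channel, per the WCAG relative-luminance formula.
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    # Parse "#rrggbb" and compute relative luminance (0.0 = black, 1.0 = white).
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _channel(r) + 0.7152 * _channel(g) + 0.0722 * _channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    # WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05), from 1:1 to 21:1.
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 2))  # 21.0 — maximum possible contrast
print(contrast_ratio("#767676", "#ffffff") >= 4.5)     # True — this gray passes on white
```

Because the formula is fully deterministic, the tool's verdict is exactly the pass-or-fail judgment WCAG asks for — no human interpretation required.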
Other examples of accessibility barriers with simple pass-or-fail rulesets include:
- Page structure issues, which can impact site navigability.
- Issues with responsive design (or “reflow”), which may make a site unusable with certain assistive technologies.
- Missing alternative text (also called alt text) for images and other non-text content, which impacts users with vision disabilities.
- Missing captions for videos, which impact users with hearing disabilities.
- Extremely small pointer targets (such as buttons), which may be difficult for users to activate.
This isn’t a comprehensive list. AudioEye estimates that 70% of common accessibility issues can be detected with automation — and about two-thirds of those issues can be resolved automatically.
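Many of the checks above reduce to inspecting the page's markup. As an illustration, here's a minimal sketch (using Python's standard-library HTML parser, not a production audit tool) that flags images with no `alt` attribute at all:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute entirely.

    Note: alt="" is valid for decorative images, so only a missing
    attribute — not an empty one — is flagged as a potential barrier.
    """
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "(no src)"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="banner.jpg">')
print(checker.missing)  # only the image with no alt attribute is flagged
```

This is the easy half of the problem: detecting that alt text is *missing* is mechanical, while judging whether existing alt text is *accurate* is not — which is exactly the limitation the next section covers.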
But remember, the goal is to address all barriers that could impact users. To conform with WCAG (and comply with non-discrimination laws), you shouldn’t rely on automated accessibility testing alone.
When accessibility criteria require human judgment, manual testing is essential
You’re building a website for human users — not for machines. Many accessibility issues require subjective decisions, and at this point, A.I. is not capable of subjective judgment.
Automated testing can (and should) be employed as part of a broader accessibility audit. But currently, automation has limits:
- WCAG Success Criterion (SC) 2.4.2 requires that web pages have an accurate title. Automated tools can instantly determine whether a page has a title tag, but they may not be able to determine whether that title describes the topic or purpose of the page.
- WCAG SC 1.1.1 requires that alternative text “serves [an] equivalent purpose.” Automation can determine whether alternative text exists, but not whether it’s an accurate description of an image’s purpose on the page.
- WCAG SC 2.4.4 requires that the purpose of each hyperlink can be determined from its link text alone. Automated tools can determine whether link text exists and whether it’s a duplicate of another instance of link text, but not whether the link text is accurate.
Once again, this isn’t a comprehensive list. Automated testing is a fantastic resource, and the technology continues to improve — but it’s not perfect. To use it effectively, you’ll need to carefully review the results of automated reports (and in most cases, you’ll need to perform additional manual tests).
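The link-text example shows where the automated/manual boundary falls. A tool can mechanically spot repeated link text (a hint that "click here"-style links may be ambiguous), as in this minimal sketch using Python's standard library — but deciding whether any given link text actually describes its destination still takes a human reviewer:

```python
from collections import Counter
from html.parser import HTMLParser

class LinkTextCollector(HTMLParser):
    """Collects the visible text of each <a> element on a page."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.current = []
        self.texts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.current = []

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
            self.texts.append("".join(self.current).strip().lower())

    def handle_data(self, data):
        if self.in_link:
            self.current.append(data)

collector = LinkTextCollector()
collector.feed('<a href="/a">Read more</a> <a href="/b">Read more</a> <a href="/c">Pricing</a>')
# Link text that appears more than once is worth a manual review.
duplicates = [text for text, count in Counter(collector.texts).items() if count > 1]
print(duplicates)
```

The output flags "read more" as a candidate problem, but only a person can judge whether each instance is clear in context — which is why automated reports should feed into a manual review rather than replace one.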
Related: Web Accessibility Remediation: A Quick Guide for Getting Started
Automated accessibility testing plays an essential role in digital compliance
Manual testing is a more precise approach for identifying accessibility issues. However, it’s not always practical.
Modern websites often feature dynamic or frequently updated content. If you’re running a large eCommerce website, you might add dozens of pages a day or update product descriptions automatically. If you have a complex web application, testing each page may be impossible.
In those cases, automated testing is crucial. And even if you have a relatively simple website, automation can save you a lot of time — provided that you’ve got a process for manual remediations.
When building your testing strategy, keep these tips in mind:
- Choose your accessibility testing tools carefully. Look for tools that test content against the latest version of WCAG and read about their methodology.
- Don’t wait to test your content. Testing during development can reduce the long-term cost of remediations. Read: How Accessibility in the Web Development Process Saves Time.
- Before fixing any accessibility issue, make sure you understand how it affects people with disabilities. An accessibility-first mindset will help you make remediations that improve the experiences of all users.
- Consider working with a qualified accessibility partner to build a manual testing strategy. Manual tests should be performed by experts who have experience with screen readers and other assistive technologies.
If you’re ready to build a digital compliance strategy, we’re here to help. Get started with a free automated website accessibility analysis or send us a message to connect with an expert.