The infuriating inefficiency of accessibility audits
and what to do about it
Accessibility audits are the bread and butter of every accessibility consultancy. It’s an easy-to-package product that clients have learned to ask for and buy, and they have clear expectations about the deliverables and the form an audit takes. Audits are usually also thorough, following established international guidelines (usually the Web Content Accessibility Guidelines).
Note: I think this article applies to all kinds of accessibility audits, whether they are called “review” or “check” or “audit”.
What’s this all about?
Audit reports can be very long, documenting dozens of issues and how to solve them. They include simple failures like missing alternative texts or single unreachable buttons, as well as more fundamental issues like a color scheme that produces low-contrast text or UI elements that are difficult to make accessible.
An example of the latter is what I call “overloading UI elements”: Modern search fields that you see at the top of many websites, especially in e-commerce, are often overloaded. They look like a simple search field, but in reality they are a button that opens a dialog with promotions, and once you start typing, they behave like a suggestion combo box. Three UI elements disguised as one different UI element.
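By contrast, a plain, single-purpose search form keeps the roles honest: one labeled field, one button, no hidden dialog. A minimal sketch (the action URL, IDs, and wording are illustrative, not from a specific site):

```html
<!-- A simple search form: one labeled field, one submit button.
     No dialog, no combo box behavior to re-implement accessibly. -->
<form role="search" action="/search" method="get">
  <label for="site-search">Search products</label>
  <input type="search" id="site-search" name="q">
  <button type="submit">Search</button>
</form>
```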
Support Eric’s independent work
I'm a web accessibility professional who cares deeply about inclusion and an open web for everyone. I work with Axess Lab as an accessibility specialist. Previously, I worked with Knowbility, the World Wide Web Consortium, and Aktion Mensch. In this blog I publish my own thoughts and research about the web industry.
The cost of fixing (accessibility) bugs
In 2002, the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) published a report called “The Economic Impacts of Inadequate Infrastructure for Software Testing”.1 This NIST report looked into where in the software life cycle testing is most efficient at finding and correcting bugs.
The findings and methodology of the report are interesting, and I would recommend taking a look at it. Table 5-1 is very illustrative of the problems of correcting errors late in development:
| Requirements Gathering and Analysis/ Architectural Design | Coding/Unit Test | Integration and Component/RAISE System Test | Early Customer Feedback/Beta Test Programs | Post-product Release |
|---|---|---|---|---|
| 1X | 5X | 10X | 15X | 30X |
Accessibility audits almost always happen after launch, as an afterthought. That means that errors that could have been found during requirements analysis are 30 times more costly to fix.2 This makes accessibility audits seem very inefficient.
To address this, accessibility people recommend shifting testing to the left. Every accessibility issue that is discovered early pays off many, many times. Even more so if you avoid the issue outright. This is what table 5-2 from the report demonstrates:
| Where Errors are Introduced | Requirements Gathering and Analysis/ Architectural Design | Coding/Unit Test | Integration and Component/ RAISE System Test | Early Customer Feedback/ Beta Test Programs | Post-product Release |
|---|---|---|---|---|---|
| Requirements Gathering and Analysis/ Architectural Design | 1.0 | 5.0 | 10.0 | 15.0 | 30.0 |
| Coding/Unit Test | | 1.0 | 10.0 | 20.0 | 30.0 |
| Integration and Component/ RAISE System Test | | | 1.0 | 10.0 | 20.0 |
Now, of course, some teams are quicker at remediating issues later in the life cycle, but other teams are not. Every team has to find that balance for themselves. My observation is that most teams go from “let’s fix the errors in this report” to “this is tedious, how can we avoid making mistakes in the first place?” really quickly.
Avoiding accessibility bugs in the first place
One of the frequent questions clients ask me is: “How would you address this issue?” In many instances, my response is that I would not have created this UI component/interaction/design in such an inaccessible way in the first place.
With accessibility guidelines in mind, a lot of issues can be avoided by deciding to implement simpler designs that are easier to code and test. That does not mean making designs and interactions less appealing, but taking the route of least resistance.
Some examples:
- Before the `<dialog>` element was widely available, instead of using custom-made and therefore complex and error-prone dialogs, I often opted for simpler solutions, like expandable sections on the page.
- Instead of hand-coding a custom select box for styling, I would be content with changing the appearance of a native one.
- Instead of dynamic error messages for forms, I’d let people submit the form and design the error messaging really well – with an overview list, the number of errors in the title, and links to the specific erroneous fields.
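The last approach, a well-designed error overview after submitting, might look something like this sketch (the IDs, copy, and field names are made up for illustration):

```html
<!-- Shown at the top of the page after a failed submit.
     The page <title> would also state the error count. -->
<div role="alert">
  <h2>There are 2 errors in this form</h2>
  <ol>
    <li><a href="#email">Enter your email address</a></li>
    <li><a href="#zip">Enter a valid ZIP code</a></li>
  </ol>
</div>

<!-- Each link target is the erroneous field itself -->
<label for="email">Email address</label>
<input id="email" name="email" type="email" aria-invalid="true">
```

Because the overview uses ordinary links to the fields, it works for keyboard and screen reader users without any live-region timing concerns.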
I understand that not everyone is in the position to make these decisions, to trade fidelity for speed. But it feels like, in a glossy Figma-driven world, many are not aware of the easier options.
It’s essential that web designers and developers learn how to create simple, straightforward experiences that are not always fancy. The goal is not to make websites boring – a common misconception about accessible websites – but to free up time to get the important, complicated interactions right.
Audit reports are often long and complicated
One of the most challenging tasks as an auditor is to communicate errors and fixes efficiently. I have made reviews that fit on a couple of pages and also in-depth audits of many web pages that spanned hundreds of report pages.
The reality is that the usefulness of an audit report diminishes quickly after maybe the first dozen tested pages. What’s wrong there is wrong everywhere. These days, I test fewer and fewer pages if I can help it. This keeps audit reports short and also makes remediation easier for clients.
I generally include:
- Essential and typical workflows: This could be “buying a thing in the shop and checking out” or “creating an account and interacting in a chat interface”, for example.
- Important pages: Pages people land on from external sources, including the home page, legal information, accessibility information, and contact forms.
I make clear that all reported issues are only examples, and that I trust the developers, who know their code best, to find other instances. I usually also include consulting hours to work through open questions over time.
Keeping error descriptions brief and clear is also important. Most clients are uninterested in why something is broken; their goal is to fix it. Nevertheless, this is a great place to add some education to a report. In one or two sentences, describe how the error disrupts the interactions of disabled people and why fixing it matters.
Then give the most comprehensive solution that you can. It’s OK if that is “Change this implementation to follow this blog post”.3 Or just a snippet of code. Make fixing errors feel approachable, not intimidating.
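Such a snippet can be tiny and still remove all guesswork. For example, a fix for an icon-only button without an accessible name might be reported like this (the class name is made up):

```html
<!-- Before: the button has no accessible name,
     so a screen reader announces only "button" -->
<button class="icon-cart"></button>

<!-- After: aria-label gives the button an announced purpose -->
<button class="icon-cart" aria-label="Open shopping cart"></button>
```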
Remember that most designers and developers getting these reports are new to accessibility. That’s also the reason you have to explain the same concepts over and over again. Most web developers were not even in their teens when you taught alternative text in 1999.
Of course, these indirections heighten the effort accessibility takes: Auditors have to get access, understand the site, structure and conduct their testing, put the results into an easy-to-understand document, and hand it over with explanations to designers and engineers. Those people then have to translate it back into their own context.
Why not fix it for them?
Remediation as a service can certainly work in projects where no development team exists that regularly deploys updates. Smaller online shops come to mind. Embedding an accessibility person into larger organizations can also help translate accessibility needs into internal fixes more quickly.
But for the small project, a theme update or a new plugin can make the site inaccessible again. With larger projects, the risk is that people rely on the accessibility person’s expertise and not enough knowledge transfer happens to make accessibility stick.
In addition, the people working on the code already know it best. If they are not burdened by processes, they can often make fixes remarkably quickly themselves.
There is no one-size-fits-all solution here. The approach that seems to work best for the teams I work with is a thorough, detailed review followed by regular consulting check-ins to help them along the way. I have seen many teams become self-sufficient really fast, learning as they go.
What you can do to expedite audits
Attempt to fix the basic accessibility issues first. While automation will not fix all your accessibility issues,4 a good number of technical errors fall into very easily detectable categories. Getting these top issues fixed will make a real difference to people. Think about hiring an accessibility person for a few hours to talk you through the results of your automated testing and suggest fixes. This can also help to guide you onto the right path when your ideas are wrong or overly complicated.
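The easily detectable categories really are this mechanical to fix. Three common examples of the kind automated checkers flag (illustrative markup, not output from any specific tool):

```html
<!-- Missing document language: add a lang attribute -->
<html lang="en">

<!-- Missing alternative text: describe the image, or use alt=""
     if it is purely decorative -->
<img src="logo.png" alt="ACME Inc.">

<!-- Form control without a label: associate one via for/id -->
<label for="email">Email address</label>
<input id="email" name="email" type="email">
```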
Going through this process also helps to define what an audit might entail and makes good sample decisions easier.
It’s also important not to be minimalist when fixing issues. Don’t discuss what the minimum is that you need to do to just pass WCAG. When a best-practice solution is recommended, implement it as recommended. It’s easy to split hairs over whether a Success Criterion is really failed or not, but that energy is often better spent elsewhere. The same goes for auditors. Try not to be overly pedantic, and be patient and compassionate when reporting issues. It helps nobody to be antagonistic. Resolving high-impact issues is more valuable than the long tail of little technical infractions.
- Find a PDF version of the report linked on the NIST website (probably not an “accessible” PDF). ↩
- Note that this is a “counterfactual” or hypothetical example. The report goes into detail to show that different companies in different circumstances have different factors here. The main point is that fixing any bug post-launch will be more costly than avoiding or fixing it beforehand. ↩
- Who are we kidding, it’s 99% a link to Adrian Roselli’s excellent blog! ↩
- and neither will “AI” ↩