Why Accessibility Overlays Fall Short & What We Did Instead

Building an accessible product is an important goal for many software teams. Automated tools such as browser plug-ins can help, but they can only identify roughly 30-40% of accessibility issues. The remaining 60-70% must be found through manual evaluation.

To address this gap, some companies offer “overlay accessibility solutions.” Promises vary, but some claim ADA compliance with just a few lines of code. These solutions often include widgets allowing users to make “accessibility adjustments” to what they’re viewing in the browser. Some of these companies even promise they’ll use AI to detect and fix issues automatically.

But as the saying goes: if it sounds too good to be true, it probably is. On a recent project, my team put that question to the test. With limited time, resources, and expertise, we needed to determine whether an overlay solution was the right choice. (Spoiler alert: it wasn’t.) Here’s what we did instead.

We evaluated the accessibility of our software.

Our first step was to assess our software’s current level of accessibility. From the start, our goal was to design and build to meet WCAG 2.0 AA standards, the benchmark for web accessibility. Even though we build with accessibility in mind, its complexity means some aspects can be overlooked in fast-moving projects. An evaluation helped us gauge compliance and identify the right accessibility remediations.

We started with WAVE, a free browser plug-in that detects the automatically identifiable 30-40% of accessibility issues. It surfaced only a handful of issues and no major red flags, so we were pleased with the results.
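For context, our team used WAVE’s browser extension rather than any custom code, but it may help to see the kind of rule such automated scanners check. A well-known example is WCAG 2.0’s contrast-ratio requirement (4.5:1 for normal text at level AA), which can be computed directly from the color values; this standalone TypeScript sketch is purely illustrative and is not from our project:

```typescript
// Illustrative sketch of the WCAG 2.0 contrast-ratio check, one of the
// rules automated scanners can verify. Formulas follow the WCAG 2.0
// definitions of relative luminance and contrast ratio.

// Convert an 8-bit sRGB channel (0-255) to linear light.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] color.
function relativeLuminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black text on a white background is the maximum possible contrast.
const ratio = contrastRatio([0, 0, 0], [255, 255, 255]);
console.log(ratio.toFixed(1)); // "21.0"
const passesAA = ratio >= 4.5; // WCAG 2.0 AA threshold for normal text
```

Checks like this are exactly what automated tools do well; the 60-70% they miss (meaningful alt text, logical focus order, sensible screen reader output) is what the manual evaluation below had to cover.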

Next, we manually assessed our site using the WCAG Compliance Checklist from the A11y Project. I copied the items into a shared Google Spreadsheet and evaluated each one to determine how well our software addressed it. Around half of the checklist items didn’t apply to our software because we don’t use things like videos, audio, or images. The other half required small tweaks along with some bigger lifts. The items demanding the most effort involved keyboard focus states and screen reader support.

Although time-consuming, the evaluation was the only way for our team to fully understand the accessibility of our software. It revealed some issues but no major concerns, and no need for large-scale changes to meet WCAG AA standards.

We researched accessibility overlays.

With relative confidence in our software’s accessibility, we turned to online research to learn more about overlays and AI accessibility tools. While I’m not an accessibility expert, many in the field have thoroughly evaluated these tools, so we relied on their insights.

We learned that overlay tools are widely criticized as ineffective band-aids that often mask deeper accessibility issues – and sometimes make them worse. Overlay tools frequently fail to comply with WCAG 2.0 AA themselves, creating a false sense of compliance. This has even led to class action lawsuits against overlay providers like accessiBe.

Furthermore, these solutions often aren’t helpful to users with disabilities. The tools can interfere with assistive technology like screen readers. They also require users to adjust settings on every individual site, whereas many users already have favorite tools and browser settings that work universally across sites. Our research confirmed that companies promising all-in-one accessibility solutions were overpromising and underdelivering.

We developed an approach that improved accessibility at the core.

Unsurprisingly, we decided against using an overlay solution. Instead, we set out to improve accessibility at its core by addressing the issues found in the evaluation. We created stories for all issues flagged by the WAVE plugin and added them to our backlog. For the issues found manually, I worked with the development team to estimate their effort. We added smaller tasks as stories to the backlog and kept a list of larger efforts requiring further planning.

Lastly, we leveraged our small user base to learn about their needs. One power user shared that she uses a screen reader. I set up a usability session where she navigated key functions of the software with her screen reader while we observed. Watching her encounter accessibility issues firsthand allowed us to document them as stories and prioritize them in the backlog. This session also provided team members with a deeper understanding of accessibility from a user’s perspective, equipping us with first-hand knowledge that will inform our approach in future projects.

Accessibility Overlays Fall Short

Overlay and AI-driven accessibility solutions promising quick fixes and guaranteeing WCAG compliance are tempting. While they may offer quick wins, they are not a substitute for inclusive design and development practices. Our research confirmed that truly accessible software requires native fixes, proper testing, and real user feedback. While we’re excited about the promise of AI, it was not mature enough to ensure the level of accessibility required to make our software usable for everyone.
