Cost of False Positives
The disruption caused by non-actionable information (false positives) during accessibility testing can wreak havoc on your IT organization, timeline, and budget. Use the calculator below to estimate your potential false positive loss. Follow the directions below if you need guidance populating the variables.
- Wasted Full Time Hours
- Wasted Dollars Per Page
- Total Wasted Dollars
How to use the Calculator
1. Average Developer Cost
Enter the average hourly cost of your Developer. Tip: if you don’t know your average hourly rate, divide the average yearly salary by the 2,080 work hours in a year. Example: if your average Developer salary is $104K/year, enter “50.”
2. Average Accessibility Defects per page
If you don’t already have this information, try running a few single page tests using the free axe browser extension (Chrome, Firefox). Unlike many other tools, axe will NOT include false positives, but this can still help get you in the ballpark for this exercise. Example: 5 common pages with 100 total accessibility issues averages 20 defects per page.
3. Team members in triage
This is typically the handful of individuals involved in troubleshooting a defect. Example: 2 from QA, 2 Developers, plus 1 Manager is 5 members.
4. Average False Positive Rate
Industry accessibility experts agree that the average false positive rate of free tools is about 20%.
5. Pages in Scope
Enter the number of pages you intend to test for accessibility. This often falls somewhere in the range between the number of pages that cover core site functions (like cart-checkout flows) and all pages.
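As a rough sketch, the arithmetic behind a calculator like this might look as follows. Note that the triage time per false positive (one hour per team member here) is purely an assumption for illustration; substitute a figure that matches your own process.

```python
# Hypothetical sketch of a false-positive cost calculator.
# TRIAGE_HOURS_PER_DEFECT is an assumption, not a published figure:
# the hours each triage member spends per false positive.
TRIAGE_HOURS_PER_DEFECT = 1.0

def false_positive_cost(hourly_cost, defects_per_page, team_size,
                        false_positive_rate, pages_in_scope):
    # How many reported defects per page are actually false positives.
    false_positives_per_page = defects_per_page * false_positive_rate
    # Hours the whole triage team loses on each page.
    wasted_hours_per_page = (false_positives_per_page * team_size
                             * TRIAGE_HOURS_PER_DEFECT)
    wasted_dollars_per_page = wasted_hours_per_page * hourly_cost
    total_wasted_hours = wasted_hours_per_page * pages_in_scope
    total_wasted_dollars = wasted_dollars_per_page * pages_in_scope
    return total_wasted_hours, wasted_dollars_per_page, total_wasted_dollars

# Example using the figures above: $50/hr, 20 defects per page,
# 5 triage members, a 20% false positive rate, and 100 pages in scope.
hours, per_page, total = false_positive_cost(50, 20, 5, 0.20, 100)
```

With these inputs the sketch yields 2,000 wasted full-time hours, $1,000 wasted per page, and $100,000 wasted in total, which you can then weigh against the cost of your testing tools.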
Evaluate the wasted time and resources against the cost of your testing tools.
Where do accessibility false positives come from?
You may wonder why so many tools struggle to provide clear and consistent accessibility reports. A violation is a violation, right? Well, yes… and no. False positives and other noise in accessibility reporting occur for several reasons, including:
- Automation can only partially detect certain accessibility issues, and many accessibility tools will include these partially detected issues (that may or may not actually be accessibility violations) just to be on the safe side.
- The accessibility tool may conflate accessibility best practices with violations and present them without making any distinction between the two.
- The rules written into the tool for detecting accessibility violations may be faulty or based on an unconventional interpretation of how certain accessibility violations are defined.
- The tool doesn’t provide any way to limit the scope of testing to avoid duplicating accessibility issues from components that are reused across a page or site.