To support the efforts of #a11ychat, we've asked each of our regular (or soon-to-be regular, as we move forward with monthly chats) hosts/moderators to write a blog post after each chat. The post can be a direct response to the conversation, a wrap-up, or a post about the chat topic. Our first chat was held on May 14th, hosted by Glenda Sims (@goodwitch) and Karl Groves (@karlgroves). The subject was accessibility testing tools. In this post, Elle shares her thoughts about accessibility testing and her favorite tools. This blog was originally posted at ElleWaters.com.
When people discuss accessibility testing, a singular image comes to mind for me. I always see this hermit-like solitary figure, hunched over and straining in the dim light of a computer screen, working with a complex series of buttons, levers, and keystrokes to reveal the inner clockworks of a web page, peeling back layers of code and examining it meticulously for deficiencies. A large open book sits by the keyboard, tiny letters scrawled across yellowed pages. After careful attention, turning over each nut and bolt of the page underneath a giant magnifying glass, the tester assigns a value: Pass or Fail.
I confess that some of this arcane imagery formed years ago when I was new to web accessibility, and the learning curve for assessments seemed insurmountable to someone just getting familiar with the terminology. I was often frustrated by the seemingly impenetrable, cryptic nature of web accessibility. I didn't understand how someone could really comprehend each and every success criterion, especially how they worked in combination with one another, distilling it all down to such a binary assessment. As with most people who begin this journey, both the standards and the testing methodology used to assess those standards were new and very confusing to me. I would lose myself for a week trying to decode general flash and red flash thresholds. Web accessibility, at least as defined by the WCAG standards, may be one of the only professions that demands simultaneous mastery of both the canon and the apparatus used to measure it in order to succeed.
Slowly, after learning from many experts in this industry, my confidence increased. I began to use popular browser accessibility toolbars to test individual web pages, and I formed a systematic way to move through the standards. Three favorites were the AIS toolbar, the WAVE toolbar, and the FireEyes plug-in, all for very different reasons. When I needed to zero in on individual elements, I could press a button with the AIS toolbar and see the discrete results on screen. When I needed to present on the overall accessibility of a web page to business stakeholders, the vibrant red, yellow, and green imagery of the WAVE results captured everyone's attention. And, later, when I wanted to plumb the depths of more complex scenarios like focus path and dynamic on-page content changes, FireEyes shone on the horizon like a beacon, like a promise of a tidier, more exacting and more codified world. These tools quickly became my lifeline, and I was comforted in being able to stamp out a templated testing process. I could get a list of use cases and URLs from a team and, using a simple formula, I could provide a reasonably accurate estimate of how long it would take me to deliver audit results using my tools. I finally felt like I had a handle on the whole accessibility thing, and I slept peacefully at night... for about three months. Then, I met accessibility experts who focused on design and usability. My begrudging gratitude goes to those who ruined my peaceful sleep: they enlightened me on the concept of "It Depends" and forced me to see the inevitable alchemy in all of this: the user.
The truth is, real accessibility testing occurs when we're not looking. These testers don't have any special browser toolbars or add-ons. They don't have a conformance checklist to grade their experiences. Their acceptance criteria are anything but fair and unbiased, and they aren't documenting their results in a spreadsheet. Their audit results get published in the form of usability tests, brand loyalty, profit-and-loss statements, and sometimes even legal actions.
As web accessibility consultants and web developers, we need to remember that we're building our ships on land. After constructing the timber frame, we hammer away at the bulwark and we secure the riggings. Let's be confident in that which can be codified and that which can be automated for accessibility - these are important in ensuring the integrity of the framework that we build. However, let's never forget the limited role that any of these tools play in accessibility assessment. We are really only addressing the obvious. We're removing obstacles and working in tandem with implementation teams to ensure that we've followed the blueprint. We'll never know if something is seaworthy until it sets sail. We hold our breath as we watch our designs get tossed around by the storms and choppy waters of real people in real situations. Will the ship stay afloat? Well, it depends.
You may find Elle at her blog ElleWaters.com or on Twitter, as @nethermind.