Design and Development knowledge sharing for better accessibility

How Serious Is Your Design Practice About Learning?

Today I’m sharing a strategic approach that enables your design practice to learn more from the accessibility-related data you already have.

I will again skip the “how to” article formula and dry laundry lists of tactical “do this/don’t do that” mechanics. Instead, I’ll focus on helping you think about WHY you should do things differently so that your program can make impactful changes that scale.

The Problem

Teams make mistakes. It’s a fact. Full stop.

Digital accessibility is a shared responsibility between design, development and testing. Accessible solutions are first designed to BE accessible, developed to ACT accessible, and then tested to ensure that they ARE accessible. No one person is solely responsible for accessibility—it’s truly a team effort. So, it stands to reason that, collectively, all team members can learn from each other to improve quality.

Understanding where and when mistakes occur means we can target the how and why, ultimately making fewer of them. The further left in the creation process we can effect change, such as during the design phase, the greater impact we can have on quality, cost, and speed.

Learning cannot be solely about finding defects or process failures and fixing them. That’s a reactive approach that will keep your teams in permanent catch-up mode, because ‘learning’ is limited to that specific failure. Strong education-design methodologies unlock the ability to learn: to deepen and truly internalize understanding, you first construct the objectives and activities that allow you to learn systemically. This enables a more scalable and sustainable approach: be proactive wherever you can, and anticipate the moments when you can learn. From a design practice perspective, this means looking holistically across your design and digital accessibility programs to determine what you can learn and from where.

Let’s look at a hypothetical, robust digital practice to examine some important—but potentially overlooked—learning opportunities.

Co-Design Sessions

Designers and developers should be designing and planning the user experience together, and having healthy debates, throughout the design cycle. Without those conversations, how will the designer know whether development can actually build the user experience they’re imagining? The absolute last thing you want to do is design in a vacuum and then toss the design over the wall to the developer. In twenty-plus years, I have never seen that model succeed, no matter how many annotations have been painstakingly crafted within your wireframes or design comps.

Accessibility can be tackled as an embedded part of that dialog. The designer has an opportunity to educate the developer and share accessibility knowledge early. If all end users are kept front of mind, there will be less re-work for developers and designers alike.

If your accessibility program is just getting started, consider investing heavily in accessibility subject matter experts (A11Y SMEs) embedded within your design or user experience team(s). My personal experience with this approach saw a multiplier effect in reducing defects.

Design Peer Reviews

Peer-to-peer design reviews are the right place for the designer to defend their work and/or demonstrate that they have incorporated standards (brand, content, corporate, design system, accessibility, etc.) into their designs. It’s also a good place for a senior designer with extensive digital accessibility experience and/or a digital accessibility subject matter expert (SME) to not only check for standards, but to also educate junior designers on incorporating accessibility best practices into their designs. An effective design has the potential to reduce accessibility defects introduced in the development cycle by as much as 67%.

For those design teams that use Figma, axe for Designers can speed up design reviews. By running this tool, some accessibility checks can be completed on the design through automation before the design review session. Head into review with your ‘design slug’ showing which design items (like color contrast and touch target size) have already been verified prior to the meeting. This allows the conversation to focus, more efficiently, on other aspects of the design.
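To make the color contrast check concrete, here is a minimal sketch of the WCAG 2.x contrast-ratio math that tools like axe for Designers automate. The function names are my own illustrative choices, not Deque’s implementation.

```python
# Sketch of the WCAG 2.x contrast-ratio calculation (SC 1.4.3).
# Helper names are hypothetical; tools like axe for Designers
# automate checks equivalent to this during design review.

def _channel(c: int) -> float:
    """Linearize one sRGB channel (0-255) per the WCAG definition."""
    c = c / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(fg, bg, large_text: bool = False) -> bool:
    """WCAG 2.1 AA: 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

For example, black on white yields the maximum ratio of 21:1, while a mid-gray like `(119, 119, 119)` on white falls just under the 4.5:1 AA threshold for normal text; catching that in the comp is far cheaper than catching it in the build.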

Usability Testing

As I stated in the last design blog, let’s agree that accessibility is rooted in usability. There are tons of insightful articles about this across the web so I won’t rehash that here. If you need to jump over to do additional research right now just remember, “Y’all come on back now!”

Design teams should be working with their UX research teams on usability testing. Some of that testing should be completed with People with Disabilities (PwDs). As the saying goes, “Nothing about us without us.” Your UX research team will assist you in documenting your users’ problems for re-design activities. In my design practice, I encourage the research team to ask the participants, whenever possible, to recommend a fix that would make the experience more usable for them. I encourage you to ask the experts so you don’t have to guess.

Best practice is to invite the entire digital product team to observe (live or recorded) usability sessions. That way, everyone can see for themselves how PwDs can (or cannot) use the digital product. First-hand experience drives empathy, which in turn fuels digital accessibility prioritization.

Acceptance Methodologies — aka Check The Build To The Design

Design, development, and quality teams often skip this phase of the development process. (There are many reasons why, but none pertinent enough to today’s blog to list here.) Someone in your design operation should be tasked with confirming that the build matches the design, right down to pixel-level placement, prior to investing time testing it during the quality phase. Does the prototype or sandbox version match what was documented within the wireframe or design comp? If not, is your team taking the time to understand why, and learning from it for future iterations? This needs to be done, period. The insights this analysis delivers are too critical to skip.
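The pixel-level check above can be partially automated. Below is a minimal, illustrative sketch of comparing a design export against a build screenshot; images are modeled as rows of RGB tuples so the idea stays self-contained, but in practice you would load real screenshots with an imaging library. The function name and tolerance parameter are my own assumptions, not a specific tool’s API.

```python
# Illustrative "build matches design" pixel comparison.
# An image here is a list of rows, each row a list of (r, g, b) tuples.
# In a real pipeline you would decode a design export and a build
# screenshot into this form with an imaging library.

def diff_report(design, build, tolerance: int = 0):
    """Compare two same-sized images. Returns the fraction of
    mismatched pixels and the (x, y) of the first mismatch (or None)."""
    if len(design) != len(build) or len(design[0]) != len(build[0]):
        raise ValueError("design and build screenshots differ in size")
    mismatched, first = 0, None
    for y, (design_row, build_row) in enumerate(zip(design, build)):
        for x, (d_px, b_px) in enumerate(zip(design_row, build_row)):
            # A pixel "matches" if every channel is within tolerance.
            if any(abs(d - b) > tolerance for d, b in zip(d_px, b_px)):
                mismatched += 1
                if first is None:
                    first = (x, y)
    total_pixels = len(design) * len(design[0])
    return mismatched / total_pixels, first
```

A small tolerance absorbs anti-aliasing noise; a non-trivial mismatch ratio, or a first-mismatch location inside a key component, is the cue to investigate why the build drifted from the design.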

Agile Retrospectives

In my career, retrospectives have been the hidden fountain of opportunity. While agile retrospectives come at the end of the sprint, well past the design phase, I want to stress that your designers should be attending retrospectives to co-learn about what can be improved in the next agile sprint.

When time and energy are invested in learning the root-cause ‘why’ of a problem, teams can quickly make systemic changes to ensure that the defect does not recur. When those learnings are shared across all the agile teams, your shop is proactively preventing future defects at scale.

We have already identified four learning opportunities that could be shared during the retrospective:

  1. Share learnings from co-design sessions so that other team members can learn from them.  
    • What knowledge was shared between the designers and developers?  
    • How can that be replicated so that everyone has that same knowledge and does not make those same mistakes in the future?
    • Does this have down-stream or up-stream implications that can be used to fix things at a larger scale?
  2. Share what was caught through peer reviews and axe for Designers testing results.  
    • What was caught early and how did the redesign effort solve the problem?  
    • Is there a savings metric that could be published to show the potential coding savings from finding the issue prior to any coding investment?  
    • What can other designers learn and incorporate into their designs?
  3. Share what was learned from accessibility user testing.
    • What was the difference between how the user actually used it and how we assumed they would use it?
    • How much did the re-work (and re-testing to confirm the problem was fixed) set the team back?
    • What assumptions and/or biases could be counteracted (and documented) so that future efforts are more successful?
  4. Share any gaps between the design and the build.
    • Can the root cause of the gap be determined?
    • What changes can be made to processes to ensure that there is less “throw away” work?

The teams should also incorporate some of these ideas into their retrospective ceremony:

  1. What issues in the developers’ work were found through automated accessibility testing?
    • As you trace through those defects, was there anything that could have been done within design to prevent them?
    • Was the defect caused by lack of training or education? 
    • Is there training or micro-training that would help the teams avoid making the mistake in the future?
  2. What defects were found through manual accessibility testing with assistive technology?
    • What was the root cause of the defect?
    • Is the defect a known issue that should be added to automated testing tools so that manual testing isn’t necessary? Perhaps it was something that should be incorporated into the design standards.
    • Is the defect a WCAG best practice that should be standardized across your enterprise?
  3. Share trend data and/or build in gamification. (A little competition never hurt anyone!)
    • Showcase accessibility defect counts and other data such as sprint-to-sprint trends.
    • Show data in the context of each team’s results on a ‘leaderboard.’
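The trend data and leaderboard described above can be produced from nothing more than per-sprint defect counts. This is an illustrative sketch with made-up team names and numbers, not a prescription for any particular tracking tool.

```python
# Illustrative sprint-over-sprint trend/leaderboard from accessibility
# defect counts. Team names, sprint numbers, and counts are made up.

from collections import defaultdict

def leaderboard(defects):
    """defects: list of (team, sprint, count) tuples.
    Returns (team, latest_count, delta_vs_prev_sprint) rows,
    ranked with the fewest latest-sprint defects on top."""
    by_team = defaultdict(dict)
    for team, sprint, count in defects:
        by_team[team][sprint] = count
    rows = []
    for team, sprints in by_team.items():
        ordered = [sprints[s] for s in sorted(sprints)]
        latest = ordered[-1]
        delta = latest - ordered[-2] if len(ordered) > 1 else 0
        rows.append((team, latest, delta))
    return sorted(rows, key=lambda row: row[1])

data = [
    ("Checkout", 1, 12), ("Checkout", 2, 7),
    ("Search",   1,  5), ("Search",   2, 9),
]
```

Here `leaderboard(data)` ranks Checkout first (7 defects, down 5 sprint-over-sprint) ahead of Search (9 defects, up 4): the delta column is what makes the trend, and the friendly competition, visible in the retrospective.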

Why I Love This Approach For Solving Design Problems At Scale

As you explore these design practice learning opportunities, you will start to see that there are moments where you can anticipate what you can learn, which tool or data set can identify the learning, and where (or who) you can learn it from. With this strategic approach, your teams can build efficient methods and processes that help them gather better data sooner, speeding them to the root-cause analysis phase. Spreading the discovery and sharing activity across many teams enables more to happen with less effort over time. Most importantly, with a shared goal of not making the same mistakes in the future, teams will work to understand the root cause–and find better ways to ensure their success building accessible (and therefore, better) user experiences.

Design Strategy Blog Series

As Deque continues to publish more about accessible design and design strategy, please submit questions or topics you would like to see us tackle. Comment below to send us ideas and content for consideration.


About Matthew Luken

Matthew Luken is a Vice President & Principal Strategy Consultant at Deque. Prior to Deque, Matthew built and ran U.S. Bank’s enterprise-level digital accessibility program. He grew the program from two contractor positions to a team of 75 consultants and leaders providing accessibility design reviews, compliance testing services, defect remediation consulting, and creating/documenting accessibility best practices across the company. The program leveraged 1,500+ implementations of Axe Auditor and almost 4,000 implementations of axe DevTools and Deque University. Matthew was also Head of UXDesign’s Accessibility Center of Practice, where he was responsible for creating seamless procedures and processes that supported the digital accessibility team’s mission & objectives while dovetailing with the company’s other Centers of Practice like DEI, employee-facing services, and Risk & Compliance. His and his team’s work has been recognized by American Banker, Forrester Research, Business Journal, and The Banker. In his user experience and service design backgrounds, Matthew worked with over 275 brands around the world, covering every vertical and category. He continues to teach User Experience, Service Design, and Digital Accessibility at the college level, as well as mentor new digital designers through several different mentorship programs around the USA.