Using our skills as technologists to protect the equal rights of users
As developers of web applications and websites, we create interfaces for people to use (for the most part). In doing so, we have a responsibility to make our applications not only usable by those people, but also to avoid infringing on their rights. In the tech industry we’re very often in positions to make an impact, so let’s talk about what this can mean for us and our users.
Civil rights are the rights of individuals to receive equal treatment, and to be free from unequal treatment or discrimination based on a protected characteristic like race, disability, gender or sexuality.
This means citizens should receive equal protection under the law, regardless of their abilities, what they look like, or who they love. People have fought for our civil rights, from housing to employment and education, and those rights will be eroded if we don’t continue to fight for them. Improvements in these areas are for the common good. As they say, equality is not like slices of pie: more rights for people of protected classes do not mean legal rights are taken away from someone else.
In the United States, there are a number of federal, state and local civil rights laws on the books that impact people’s lives every day. You might even recognize some as relevant to discussions we have repeatedly in tech (equal pay, age discrimination, family leave, etc.). These laws can come up in a number of settings, including but not limited to developing software for the federal government, education, and employment. Throughout the public and private sectors, the risk of legal action can be a good motivator to protect users from discrimination through our code, and rightly so.
Learn more about accessibility compliance: https://www.deque.com/accessibility-compliance/
The Right to Be Free from Barriers to Access
As designers and developers of software, we act as gatekeepers in people’s lives more than we realize. When done right, our software can be the difference between someone with a disability living an independent, productive life and needing outside help to complete a task, including at their job. From banking to grocery shopping to collaborating online, accessible software can range from somewhat convenient to a total game changer that helps someone excel in life. This applies to all digital experiences, for contributors and authors as well as consumers.
No one automatically knows about digital accessibility, so don’t be discouraged if you have zero knowledge or experience with the subject. It’s something we all have to learn and work at, and we have to start somewhere! Over the past few years I’ve talked to many people who, after hearing about accessibility for the first time, have enthusiastically taken up the cause in their own organizations. Some have even become accessibility champions throughout the web development industry, and our collective reach has undoubtedly made an impact on the lives of users with disabilities. (Join us in the web-a11y Slack!)
“I’m disabled and I can’t use this site, why don’t they want my money?!”
It’s also true that some teams need more incentive to actually improve accessibility in their own organizations. So here’s some incentive: without attention paid to accessibility, your application and content may discriminate against people with disabilities. Not only does that carry a higher legal risk, but you’re also likely missing out on customer spending no matter what country you’re in. In the US alone, people with disabilities control over $645 billion in disposable income. That’s a pretty large demographic to leave behind.
Here are some of the skills and techniques that come up again and again when making interactive web apps accessible:
- Focus management of layers and modals. You’ll need to be familiar with tools like element.focus(), and listening for key events.
- Fully disabling inactive layers and components with CSS visibility: hidden, the HTML attribute aria-hidden="true", or the inert attribute with a polyfill.
- Announcing view changes & asynchronous updates to assistive technology using ARIA Live Regions.
- Gracefully handling keyboard focus on deletion or removal of DOM nodes.
- Correct use of ARIA states, roles and properties.
- Automated software testing for accessibility.
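As a concrete illustration of the first item, the Tab-wrapping logic at the heart of a modal focus trap can be sketched as a pure function plus a little browser wiring. The names here are illustrative, not from any particular library:

```javascript
// Hypothetical helper: given the index of the currently focused element
// among a modal's focusable elements, compute where Tab or Shift+Tab
// should move focus so it wraps within the dialog instead of escaping
// to the inactive page behind it.
function trapTabIndex(currentIndex, count, shiftKey) {
  if (count === 0) return -1; // nothing focusable in the modal
  if (shiftKey) {
    // Shift+Tab from the first element wraps to the last
    return currentIndex <= 0 ? count - 1 : currentIndex - 1;
  }
  // Tab from the last element wraps back to the first
  return currentIndex >= count - 1 ? 0 : currentIndex + 1;
}

// Wiring it up in a browser (guarded so the sketch also runs elsewhere):
if (typeof document !== 'undefined') {
  const modal = document.querySelector('[role="dialog"]');
  if (modal) {
    modal.addEventListener('keydown', (event) => {
      if (event.key !== 'Tab') return;
      const focusable = Array.from(
        modal.querySelectorAll('button, [href], input, select, textarea')
      );
      const next = trapTabIndex(
        focusable.indexOf(document.activeElement),
        focusable.length,
        event.shiftKey
      );
      if (next >= 0) {
        event.preventDefault();
        focusable[next].focus();
      }
    });
  }
}
```

A complete implementation would also restore focus to the triggering element when the modal closes, which is part of the same focus-management skill set.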
Fortunately, these days it’s easier to achieve a baseline of accessibility with modern testing tools and procedures. Deque has a number of resources and tools to help with this, including Deque University, axe (powering the accessibility portion of tools like Lighthouse and webhint.io), and the entire WorldSpace product suite, amongst other great tools throughout the industry. However, at the end of the day, what matters most is that we improve things for users with accessibility needs, rather than what tools we use to get there.
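For example, axe-core reports its findings as a results object with a violations array, where each violation carries an impact of "minor", "moderate", "serious", or "critical". A team might gate CI on only the worst of these; the threshold below is a team choice, not something axe prescribes, and the sample data is illustrative:

```javascript
// Filter axe-core-style results down to the violations a build should
// fail on. The result shape (violations, impact, id) follows axe-core's
// documented output; the sample data itself is made up.
function blockingViolations(results) {
  return results.violations.filter(
    (v) => v.impact === 'serious' || v.impact === 'critical'
  );
}

const results = {
  violations: [
    { id: 'color-contrast', impact: 'serious' },
    { id: 'region', impact: 'moderate' },
  ],
};

console.log(blockingViolations(results).map((v) => v.id)); // → [ 'color-contrast' ]
```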
“Power user” requirements with obscured controls, subtle design treatments, and forced “discoverability” don’t make your app cool and elite if they also make it much harder to use. Requiring memorization of excessive keyboard shortcuts when the equivalent mouse flow is effortless is a display of power imbalance: it exposes a bias toward the designers and developers of the software over actual users, who can’t know it as intimately as the team that created it. Obscure icon buttons with no visible text labels also come to mind.
Organizations can combat bias by testing applications early and often with users with disabilities and taking their feedback seriously. Maybe you can’t redesign your entire application at once, but you can identify ways to simplify and streamline the experience in core user flows, replacing some of it one sprint or dev iteration at a time.
The Right to Safety
Technology can cause actual harm to people. Like that time someone tweeted a strobing GIF at a reporter with epilepsy. Or that time an accessibility plugin was hacked to mine cryptocurrency, impacting thousands of web users across the world. Or those times app developers added location sharing features that could be used for stalking or bullying.
We can’t police what every user does or contributes online, but there are several areas in the software development life cycle where we can take care to prevent unsafe situations:
- Consider up front how your designs could be used by bad actors to cause harm and abuse. Not an expert? Hire people from marginalized communities to consult with you.
- Combat prejudice in machine learning by being transparent about training data and looking for hidden biases.
- Establish reasonable terms and moderation processes for user conduct, publish those policies, and hold users accountable when they break them.
- Follow security best practices and perform regular updates to avoid compromising the integrity of your deployed software.
Thoughtful AV Design
For users with seizure risk, vestibular disorder, traumatic brain injury or motion sickness, navigating sites and apps full of auto-playing videos and flashing animations can be a dangerous task. As developers of these sites and apps, it is possible to minimize harm and still implement beautiful and innovative designs. Warn users of sensitive or flashing material on loading screens, and avoid auto-playing media without user consent (this is also relevant to saving everyone’s data plans).
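A small sketch of that consent-first approach (the function and option names are illustrative): autoplay only when the user has opted in and has not asked their operating system to reduce motion.

```javascript
// Decide whether media may autoplay, honoring both an explicit consent
// setting and the user's OS-level "reduce motion" preference.
function shouldAutoplay({ userConsented, prefersReducedMotion }) {
  return Boolean(userConsented) && !prefersReducedMotion;
}

// In a browser, the motion preference is exposed via a media query
// (guarded here so the sketch also runs outside a browser):
const prefersReducedMotion =
  typeof window !== 'undefined' && typeof window.matchMedia === 'function'
    ? window.matchMedia('(prefers-reduced-motion: reduce)').matches
    : false;

shouldAutoplay({ userConsented: true, prefersReducedMotion });
```

The same prefers-reduced-motion media query can also be used in CSS to swap flashing animations for static alternatives.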
Protecting the vulnerable from malicious third-party code
There are still more ways that we need to keep our users safe. Vulnerabilities in the third-party dependencies and libraries we use can put our users at risk by allowing unauthorized code to run on their systems, so those libraries must be kept up to date with security patches or removed entirely. There’s a reason so many developers use ad-blocking software: the modern web is full of tracking scripts making requests to questionable domains, as well as malicious or unwanted cookies.
We can do well by our users across the board by deploying secure sites over HTTPS, checking the integrity of scripts fetched over the internet, and following other security best practices. But there is also a business argument to be had: Do we take ad revenue from a sketchy third-party platform or is there something safer for our users, like a subscription-based model? A theme has emerged in this article: the strongest defense of user safety is to plan for it earlier in the product development life cycle.
The Right to Privacy
How we design and develop user contact forms in HTML can also have an impact on privacy. Consider whether you really need to include a gender field for basic functionality. Can you mark gender as optional, or add a non-binary option to it? We shouldn’t ask for gender data simply for tradition’s sake if the data isn’t actually necessary.
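One way this could look in markup (field names and options here are illustrative): the field is genuinely optional, there is an explicit way to decline, and the choices aren’t limited to a binary.

```html
<!-- Gender is optional: no required attribute, an explicit way to decline,
     and options beyond a binary. Only ask at all if the data is needed. -->
<label for="gender">Gender (optional)</label>
<select id="gender" name="gender">
  <option value="">Prefer not to say</option>
  <option value="woman">Woman</option>
  <option value="man">Man</option>
  <option value="non-binary">Non-binary</option>
</select>
```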
User privacy and the Accessibility Object Model (AOM)
AOM has been compared to HTML5 geolocation, in that for privacy reasons, users would likely have to approve something in the browser to enable the new behavior. While the proposed design for AOM has evolved to use synthetic events, a permissions screen may still be a requirement in some cases. (There was also talk of making it HTTPS-only, but that’s pretty undecided at the moment.) When implemented in browsers, this powerful new tool must be used wisely, likely with redundant UI affordances to accomplish accessible tasks.
For example: a text input allows a user to search without agreeing to share their location through the geolocation API. Similarly, to use AOM features like Assistive Technology events, a user may need to approve the API. As developers, we need to provide UI affordances to complete the same task without requiring our users to sacrifice their privacy for access.
As you can probably gather by now, there are many ways we can impact our users’ civil rights, for better or worse. It can be overwhelming to try to hold all of these practices in mind when each one competes to be the most important concern. When we say “accessibility first”, “mobile first”, “security first”, etc., we’re bound to miss the mark in whichever areas weren’t prioritized. Some of the high-level items in this article may also be out of your control as a developer.
Therefore, it could help to start with something like this federal digital services checklist and adapt it to your team’s needs. Perhaps you can focus your attention on the relevant laws in your particular sector of technology. Regularly evaluate how users’ rights will be impacted by each new design or feature you’re working on, and start asking questions. In this line of thinking, you might ask things like:
- How would this feature impact the employment of people with disabilities?
- Are we leading all students to success through inclusive online education platforms and course materials?
- What privacy trade-offs are required in our APIs and user interfaces, and are we encouraging adequate alternatives?
- Are we introducing or tolerating bias in machine learning and algorithms, disproportionately impacting underrepresented groups?
- What processes do we have in place for reporting and escalating issues of user rights to privacy, safety, and access in our technology?