An Introduction to Native Mobile Accessibility – Featuring Deque University for iOS/Android

One of the most challenging aspects of mobile accessibility is that most developers, managers, user interface designers, software engineers, etc. don’t have a disability that makes using websites and mobile apps difficult. That is, they aren’t able to view the process from that user’s perspective. It’s understandable that this would happen, but it nevertheless makes their job more difficult when it comes to accessibility.

It’s easy for someone without a visual impairment to look at an application, see the red “X,” and close out of a modal. But what about those who can’t see? Similarly, you probably pick up your phone and swipe across the screen to delete a text message without thinking twice about it. But what about those who can’t perform this action? Perhaps some empathetic users are conscious that these disabilities exist, but it’s still all too common for even the most empathetic of users to miss some vital piece of information that may not be available to those who have a visual impairment such as colorblindness.

In this post I’m going to cover how the flood of new apps into the market is often at odds with the goal of making native mobile apps accessible. Then I’ll illustrate how Deque’s approach to accessibility (a11y) instruction can help train developers to design apps with accessibility in mind.

The Problem: Reconciling Rapid “App Evolution” with Accessibility

Native mobile development is still relatively young compared to other forms of application development. As native applications and APIs evolve, the problem of accessibility becomes even more difficult. It’s the nature of native mobile operating systems to evolve quickly, but the needs of users with disabilities are at odds with this rapid app evolution.

Every year new APIs are released, providing developers with the latest UI candy, but these new APIs are not fully vetted for accessibility by OS designers. All too often developers aren’t even fully familiar with the best ways to use these new mechanisms for standard use, let alone the design practices that would make them accessible. Native accessibility APIs tend to evolve much more slowly than the rest of the operating system, omitting mechanisms that could make these new OS features accessible, which makes native mobile accessibility exceptionally difficult.

The fact is, there aren’t enough native mobile accessibility experts in the world to keep up with the growing demand for newer, cooler-looking applications. It’s inevitable that most applications will be released without ever being touched by someone who fully realizes the importance of making an app accessible to users with disabilities.

At Deque we confront that problem daily. It’s a big part of the reason why we’ve created the ever-expanding Deque University: to help developers understand the difficulties faced by those with disabilities, and to provide them with the tools and training they need to make accessibility a reality.

Above all, the purpose of this post is in line with our mission of inclusivity: to give you a greater understanding of the best way to make your application “accessibility friendly” so that everyone can use it.

Getting Started

Here I’m going to outline step-by-step instructions for the Deque University app that will enable you to gain a greater understanding of the difficulties that users with disabilities face.

1) Set Up the Application

The very first step is to get set up with the application.

For the purposes of this post I’m going to use the Accessibility Inspector; VoiceOver presents the same information to you audibly.

2) Activate the Blind Simulation Overlay

Now that VoiceOver (or the Accessibility Inspector) is running, bring up the application and open the main navigation menu. This should be a basic navigation drawer menu, with headings Introduction, Labels, Hints, etc. Let’s start with the Labels example.

  • When you click on the Labels example, it should bring up a view explaining the purpose of accessibility labels.

Before moving on to the next views, take a look at the top right corner.  

  • There’s a little eye with a check mark. Activate this button. You should now have a grey “VoiceOver Simulation” image overlaying your entire screen. The view behind it hasn’t changed; the overlay is just displayed to force you to experience the application the way a blind user would. (A rough sketch of how such an overlay could be built is below.)
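For the curious, here is a hypothetical sketch of how an overlay like this could work on iOS: a plain view that covers the screen visually but stays out of the accessibility tree, so VoiceOver keeps reading the content beneath it. The class name and styling are my own assumptions, not the Deque University app’s actual implementation.

```swift
import UIKit

// Hypothetical sketch of a "VoiceOver simulation" overlay (names and styling
// are assumptions, not the Deque University app's actual code).
final class SimulationOverlayView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = UIColor(white: 0.2, alpha: 0.95) // opaque grey cover
        isUserInteractionEnabled = false                    // touches pass through to the app
        isAccessibilityElement = false                      // the cover is invisible to VoiceOver,
                                                            // so the views beneath it are still read
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

// Usage sketch: cover the window visually while leaving the accessibility
// tree untouched.
// let overlay = SimulationOverlayView(frame: window.bounds)
// window.addSubview(overlay)
```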

3) Try the Accessibility Labels Example

  • Swipe around the application until you find the button labeled “Broken” and activate it.
  • Now that you’re on the “Broken” tab, attempt to navigate the application and figure out what it’s doing.
  • Can you figure it out? Don’t cheat by listening to the paragraph at the bottom! It’s very difficult to understand what this view is doing, isn’t it? Next, navigate to the button labeled “Fixed” and activate it. Is the purpose of the view more apparent now? (For what that fix typically looks like in code, see the sketch after this list.)
  • When you think you’ve got it figured out, turn off the Blind Simulation Overlay and have a peek to check.
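Under the hood, the difference between a “Broken” and a “Fixed” control usually comes down to a single property. Here is a minimal sketch of the idea; the image name and label text are assumptions of mine, not the app’s actual code:

```swift
import UIKit

// "Broken": an icon-only button with no accessibility label. VoiceOver can
// only announce something generic like "button", so its purpose is a mystery.
let brokenDelete = UIButton(type: .custom)
brokenDelete.setImage(UIImage(named: "trash-icon"), for: .normal)

// "Fixed": the same button with a label that VoiceOver can actually speak.
let fixedDelete = UIButton(type: .custom)
fixedDelete.setImage(UIImage(named: "trash-icon"), for: .normal)
fixedDelete.accessibilityLabel = "Delete message"
```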

4) Explore More Advanced Examples

Labeling accessibility elements is one of the most basic accessibility mechanisms. Without basic labeling, elements like images, “Go” buttons, custom controls, and links can be very confusing. This is just a very basic example. As the examples get more advanced, the Android and iOS versions of the application diverge a bit. This is necessary because native controls and accessibility APIs have different strengths and weaknesses.
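To make that concrete on the iOS side, here is a small sketch of labeling non-button elements; the identifiers and strings are illustrative assumptions, not taken from the app:

```swift
import UIKit

// An informative image: without a label, VoiceOver users get nothing useful from it.
let statusImage = UIImageView(image: UIImage(named: "upload-complete"))
statusImage.isAccessibilityElement = true
statusImage.accessibilityLabel = "Upload complete"

// A custom control: expose a label plus a trait so VoiceOver announces it
// as an actionable element, e.g. "Go, button".
final class GoControl: UIControl {}
let goControl = GoControl()
goControl.isAccessibilityElement = true
goControl.accessibilityLabel = "Go"
goControl.accessibilityTraits = .button
```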

If you have any ideas for examples, or are curious about a particular control, please feel free to comment! You can also contribute to our open source application or post a discussion on GitHub.

The Deque University applications have multiple examples of inaccessible use of native controls to help you understand how improper use of the accessibility APIs can confuse users with disabilities.
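As one illustration of the kind of misuse those examples cover (my own example, not one from the app): packing the role and usage instructions into the label itself, instead of letting traits and hints do their jobs, makes VoiceOver announcements redundant and confusing.

```swift
import UIKit

let submit = UIButton(type: .system)
submit.setTitle("Submit", for: .normal)

// Confusing: VoiceOver ends up announcing something like
// "Submit button. Double tap to submit the form. Button."
// submit.accessibilityLabel = "Submit button. Double tap to submit the form."

// Clearer: a short label, the role from the trait, and the guidance as a hint.
submit.accessibilityLabel = "Submit"
submit.accessibilityTraits = .button
submit.accessibilityHint = "Submits the form."
```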

Please feel free to explore the application further and be sure to make use of the Blind Simulation Overlay. Practice using your own apps in the same way and see if you catch yourself presenting information in a way that only sighted users can perceive.


About Chris McMeeking

Chris McMeeking is a software engineer and architect at Deque Systems, leading development efforts on Deque’s native mobile accessibility analysis products. His journey in accessibility began through a project at the University of Michigan, The ASK Scanning Keyboard. This application won multiple awards, including the $100,000 Intel Innovator’s Award, runner-up at the Mobile World Congress, and the Student of Da Vinci award from the Multiple Sclerosis foundation. Chris is the lead developer behind the Android Analyzer and an active member of the task force developing new mobile accessibility standards.

Comments

  1. Good intro.
    I’d like to see a reference covering UI semantics for native mobile, for example, which of the ARIA roles are mapped in the iOS or Android accessibility APIs (e.g. toggle buttons, tab panel UIs), and (crucially) which ones are not.

    I’d also very much welcome some best practice suggestions for how to go about implementing the UI idioms that are not yet mapped into those native mobile APIs.
