[Video] Android Accessibility APIs – How a Google+ Post is Implemented


Let's talk about native Android accessibility. If you watched my video from last week, I chose Google Plus as our sample application and reflected on the good and the bad of the Google Plus single post layout as it affects accessibility. This week, let's talk a bit more about how they accomplished this result. Next week, we'll wrap up with a look at how you can imitate and implement some of these best practices yourself.

Follow along here in my recorded walk-through if you like:

Modifying UX for Accessibility Users

Let's start by looking at the Google Plus single post format again. Specifically, starting with the post content, you'll notice that as a Google Plus user you're presented with navigation arrows that are not normally there. They appear to be detecting that I have an accessibility service running.

These arrows are going to be useful for certain types of accessibility users, and they're going to be annoying for other types of accessibility users. I think there's a better way to accomplish this user experience pattern.

But basically, what they're doing is they are detecting that there's an accessibility service active, and they're adding these arrow buttons on top of this view.
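I can't confirm exactly how Google Plus detects this, but the platform does expose the information an app would need. Here's a minimal sketch in Java of one way it could be done using AccessibilityManager; the navArrows view and the helper class name are hypothetical, purely for illustration:

```java
import android.content.Context;
import android.view.View;
import android.view.accessibility.AccessibilityManager;

public class AccessibilityAwareArrows {

    // Shows the extra navigation arrows only when a touch-exploration
    // service such as TalkBack is running. The navArrows view is a
    // hypothetical stand-in for the arrow buttons described above.
    public static void showArrowsIfNeeded(Context context, View navArrows) {
        AccessibilityManager manager = (AccessibilityManager)
                context.getSystemService(Context.ACCESSIBILITY_SERVICE);

        boolean touchExplorationOn = manager != null
                && manager.isEnabled()
                && manager.isTouchExplorationEnabled();

        navArrows.setVisibility(touchExplorationOn ? View.VISIBLE : View.GONE);
    }
}
```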

And that's an important thing. The user experience for AT (assistive technology) users is going to be different from the user experience of normal users, which, again, is sometimes good and sometimes bad. I'm here to talk about the implementation today, not about the accessibility ramifications.

That's one point: they are clearly detecting the fact that there's an accessibility service there, responding to it, and adding things to the UI. The second thing they've done is some user interface wrapping and hiding of things, right? In particular, they have hidden these controls down here from TalkBack focus. TalkBack doesn't know that there are active controls down here, so they've hidden those from the assistive technologies. They're not rendered to the assistive technology.
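Again, I can't confirm how Google Plus hides these controls, but here is a minimal sketch of one way to hide a visible row of controls from TalkBack while leaving it on screen; the actionRow parameter is a hypothetical reference to that row of share and plus-one buttons:

```java
import android.view.View;

public class HiddenControlsHelper {

    // Hides a visible container of active controls from accessibility
    // services without changing how it is drawn on screen.
    // NO_HIDE_DESCENDANTS hides the view and everything inside it, so
    // TalkBack will not focus or announce any of the buttons it contains.
    public static void hideFromAssistiveTech(View actionRow) {
        actionRow.setImportantForAccessibility(
                View.IMPORTANT_FOR_ACCESSIBILITY_NO_HIDE_DESCENDANTS);
    }
}
```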

What they've done instead is, I imagine, one of two things. And again, I can't really confirm this, but they have either set important for accessibility equals yes on the entire layout and made these things inactive, right? What setting important for accessibility equals yes on a layout does is collect all of the informative content within that layout: the content descriptions of non-active images, the text of text views, the little 17s and plus ones, and all of these little bits of information for the inactive elements. That control then collects them together and reports them back as one focusable view.

Making a Layout Accessibility Focusable

And that's cool. It collects and associates all the data together into this one big focus rectangle, although it's also bad from other points of view. Again, go back and watch my prior video to see all of those things.

The other thing that they may have done is assign a content description to the layout instead, right?

There are two ways to make a layout that wouldn't normally be focusable ... accessibility focusable. Note: not focusable, but accessibility focusable. One is to set important for accessibility equals yes, and the other is to assign it a content description. They may have gone through, manually done that collection of information, and assigned it to the content description of that layout. I don't know which; I can't tell based on the information that I have available. But those are the two ways that you can achieve this type of data association.
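To make the two options concrete, here is a minimal sketch in Java; the class and method names are hypothetical, and the hand-built summary string in option two is purely illustrative:

```java
import android.view.View;
import android.view.ViewGroup;

public class PostCardFocus {

    // Option one: mark the layout important for accessibility and let the
    // platform gather the text and content descriptions of its inactive
    // children into one focusable unit.
    public static void viaImportantForAccessibility(ViewGroup postLayout) {
        postLayout.setImportantForAccessibility(
                View.IMPORTANT_FOR_ACCESSIBILITY_YES);
    }

    // Option two: build the summary yourself and assign it as the layout's
    // content description. The string here is purely illustrative; in a
    // real app you would have to construct and maintain it by hand.
    public static void viaContentDescription(ViewGroup postLayout) {
        postLayout.setContentDescription(
                "Post by Chris, 17 comments, 1 plus one");
    }
}
```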

Obviously, I would recommend that important for accessibility equals yes solution if you want to do something like this yourself. The reason I would recommend that is maintenance, right?

Why go through and collect all of that information when the operating system can do that for you? And that's also going to end up being more accessible in the long run, right? Because then, different assistive technologies can gather that information differently.

Different operating system versions can gather that information differently. A Samsung user may get used to a different representation of important for accessibility equals yes than a Nexus user does. And so allowing the operating system to do the thing that it does is both more maintainable for you, and also more accessible. Because then, the users of those different devices get used to the way those things happen.

And if you just go in and collect all that information for your application yourself, your user experience across the multitude of devices will be the same, but that's not the thing you want to go for. You want a consistent experience across different applications on the user's own device. And that is best achieved just by setting important for accessibility equals yes on this layout, rather than forcing users to view that content the way you think they should view it.

Android Accessibility API: Accessibility Actions Menu

The last thing I want to point out about this is a cool feature of the Android accessibility APIs that makes all of this work functionally. Remember, there are active controls down here that are hidden, right? Since we've hidden those things from the assistive technology, we still need a way to perform those actions. And what they've done is add this menu here with accessibility actions, as you can see.

Now, I have activated it ... I double-tapped on the layout, and I have these actions that I can do now. I can comment. I can see the full post. I can get the full, detailed post. I can plus one the post. I can reshare. I can open the link.

All of those things that those active controls allowed me to do, I can access through this special, contextual accessibility actions menu.
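I can't confirm how Google Plus wires these up, but the Android framework supports custom accessibility actions (on API 21 and later) through an accessibility delegate. Here's a minimal sketch that exposes one hidden control, the plus-one button, as an action in TalkBack's menu; PostActionsDelegate, ACTION_PLUS_ONE, and the plus-one handling are hypothetical names used only for illustration:

```java
import android.os.Bundle;
import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;

public class PostActionsDelegate extends View.AccessibilityDelegate {

    // Hypothetical app-defined action id; it just needs to avoid colliding
    // with the framework's standard action ids.
    private static final int ACTION_PLUS_ONE = 0x10000001;

    @Override
    public void onInitializeAccessibilityNodeInfo(View host,
                                                  AccessibilityNodeInfo info) {
        super.onInitializeAccessibilityNodeInfo(host, info);
        // The label is what TalkBack reads in its accessibility actions menu.
        info.addAction(new AccessibilityNodeInfo.AccessibilityAction(
                ACTION_PLUS_ONE, "Plus one this post"));
    }

    @Override
    public boolean performAccessibilityAction(View host, int action, Bundle args) {
        if (action == ACTION_PLUS_ONE) {
            // Invoke the same code path the hidden plus-one button would,
            // e.g. a hypothetical plusOnePost() method in your app.
            return true;
        }
        return super.performAccessibilityAction(host, action, args);
    }
}
```

You would attach it with postLayout.setAccessibilityDelegate(new PostActionsDelegate()) and repeat the same pattern for comment, reshare, and the other hidden controls.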

And so, like I said, next week what I want to do is create my own version of a social network share and show you how I did it. Like I said in my video last week, I think this does some good things and some bad things, and I think there is a combined approach that would ultimately be the most accessible.

About 

Chris McMeeking is a software engineer and architect at Deque Systems, leading development efforts on Deque’s native mobile accessibility analysis products. His journey in accessibility began through a project at the University of Michigan, The ASK Scanning Keyboard. This application won multiple awards including the $100,000 Intel Innovator’s Award, runner up at the Mobile World Congress, and the Student of Da Vinci award from the Multiple Sclerosis foundation. Chris is the lead developer behind the Android Analyzer, and an active member of the task force developing these new accessibility mobile standards.