
WWDC 2023 Accessibility Goodies for Developers

This year, Apple’s annual Worldwide Developers Conference (WWDC) was a big deal! Among the many announcements, Apple introduced the new Apple Vision Pro headset, which arguably and understandably stole the show. Although Vision Pro may be the big talk of the town post-WWDC, there was also a myriad of exciting new reveals across Apple hardware and software.

The accessibility updates are one of my favorite parts of WWDC every year, and this year there were quite a few of them across the Apple ecosystem that give developers new tools to help make their apps accessible. Get the scoop on this year’s accessibility updates below!

Apple Vision Pro

The brand-new Apple Vision Pro headset was by far one of the show’s biggest stars this year. According to Apple, the headset will be compatible with the platform’s pre-existing assistive technologies like VoiceOver, and many of the accessibility features available in other products will be available on Vision Pro, reimagined specifically for spatial computing. Details on Vision Pro accessibility are still extremely limited, but we hope that when the device is released it will indeed prove to be accessible.

Assistive Access

Assistive Access is a brand-new assistive technology for users with cognitive disabilities. It allows an iPhone or iPad to display only the features the user wants, which greatly reduces the cognitive load on the user. For example, if a user only wants to use Phone, Photos, and Messages, Assistive Access can be set up so that only those apps are available on their device.

A tablet and iPhone with Assistive Access table layout activated

Assistive Access allows customization not only of which apps are available, but also of specific features within apps; for example, if a user only wants to communicate with words, the emoji keyboard can be removed from the Messages app, leaving only the text keyboard.

iPhone with assistive access keyboard with emoji reactions

When Assistive Access is enabled, the device layout is replaced with one of two layouts: a grid-based layout where apps are arranged in a grid and each app’s icon is more prominent than its text, or a row-based layout that presents apps more like a list, with icon and text at equal prominence. Assistive Access can be enabled and customized in Settings.

Accessibility Audits

Developers can now perform accessibility audits of their app on every build right in Xcode. Xcode’s Accessibility Inspector has long offered an audit feature, but it’s a manual process, and the results are displayed in the inspector itself, which means a lot of going back and forth between the inspector and Xcode when remediating issues…the process definitely had room for improvement.

Now that Xcode has the ability to perform accessibility audits, the process is streamlined, automated, and efficient. A developer can run an accessibility audit right from a UI test, and no assertions are needed: if the audit finds any issues, the test fails. The results of an audit are easy to understand, with failures displayed inline along with a description, and additional details provided in the report navigator.

Below is a code sample demonstrating how to perform an accessibility audit in a UI test.

import XCTest

final class MyAppUITests: XCTestCase {
    func testAccessibility() throws {
        let app = XCUIApplication()
        app.launch()

        // The test fails automatically if the audit finds any accessibility issues.
        try app.performAccessibilityAudit()
    }
}

isToggle Accessibility Trait

Meet .isToggle, the new accessibility trait for controls that have toggle functionality. Imagine an app with an element that lets you turn a specific mode on and off; currently, when that kind of toggle element is made accessible, VoiceOver reads it as ‘Element name, value, button’.

Let’s unpack that…sure, calling a control with on/off functionality a button isn’t incorrect, but it’s also not as accurate or descriptive as it could be. Applying the .isToggle accessibility trait to that element makes VoiceOver read it as ‘Element name, switch button, double-tap to toggle setting’; that’s a much more informative and accurate description of the element. But wait, it gets better…the .isToggle accessibility trait is available in both SwiftUI and UIKit!
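Here’s a minimal SwiftUI sketch of how the trait could be applied to a custom control; the dark-mode button and its state are just illustrative:

import SwiftUI

struct ModeToggleView: View {
    @State private var isDarkModeOn = false

    var body: some View {
        // A custom control with on/off behavior; the .isToggle trait tells
        // VoiceOver to describe it as a switch instead of a plain button.
        Button(isDarkModeOn ? "Dark Mode: On" : "Dark Mode: Off") {
            isDarkModeOn.toggle()
        }
        .accessibilityAddTraits(.isToggle)
    }
}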

Accessibility Notifications

The new Accessibility Notifications API allows developers to create multi-platform announcements that convey important information to users of assistive technologies. When this API is implemented, your app can send announcements about things like layout changes and screen changes so the user is kept up to date on the current state of the app.

Consider an app that has a button that opens the device’s camera. With the Accessibility Notifications API, we can send an announcement to inform the user that the camera is loading, and once the camera has loaded and is active, we can send another notification to let the user know the camera is now active.

This API can even assign a priority to notifications to handle alerts and updates that users need to hear; this ensures that VoiceOver always speaks critical announcements, while non-critical announcements may be skipped if VoiceOver does not get to them in time.
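Here’s a rough sketch of what that camera flow could look like in SwiftUI. The camera button is hypothetical, and the attributed-string priority attribute reflects my understanding of the new API:

import Accessibility
import SwiftUI

struct CameraLauncherView: View {
    var body: some View {
        Button("Open Camera") {
            // Let assistive technology users know something is happening.
            AccessibilityNotification.Announcement("Camera is loading").post()

            // ...kick off the app's real camera setup here...

            // Once the camera is ready, mark the announcement as high priority
            // so VoiceOver always speaks it, even if other speech is queued.
            var cameraReady = AttributedString("Camera is active")
            cameraReady.accessibilitySpeechAnnouncementPriority = .high
            AccessibilityNotification.Announcement(cameraReady).post()
        }
    }
}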

.accessibilityZoomAction modifier

Apple gave us a very cool new view modifier this year: .accessibilityZoomAction. When an assistive technology is enabled, zooming content in and out can be difficult; that’s where the .accessibilityZoomAction modifier can help. When this modifier is used, elements can be zoomed in or out while an assistive technology is enabled, and an accessibility announcement can even be posted when the zoom changes.
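Here’s a minimal sketch of the modifier on a zoomable image; the asset name, zoom increments, and announcement wording are all illustrative:

import Accessibility
import SwiftUI

struct ZoomableMapView: View {
    @State private var zoomLevel = 1.0

    var body: some View {
        Image("map")  // hypothetical image asset
            .scaleEffect(zoomLevel)
            .accessibilityLabel("Map")
            .accessibilityZoomAction { action in
                // Adjust the zoom in response to the assistive technology's gesture.
                if action.direction == .zoomIn {
                    zoomLevel += 0.5
                } else {
                    zoomLevel -= 0.5
                }
                // Keep VoiceOver users informed about the new zoom level.
                AccessibilityNotification.Announcement("Zoom level \(zoomLevel)").post()
            }
    }
}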

Direct Touch Trait in SwiftUI

The new .allowsDirectInteraction trait allows you to specify a region of the screen that represents an object users interact with directly, like a musical instrument. Consider an app that features a virtual piano where a user hears the corresponding note when a key is activated; the standard VoiceOver behavior when navigating the piano keys would be to announce the name of the key and then play the sound that accompanies its activation.

This behavior makes it difficult to play the keys in quick succession. This is a great use case for the .allowsDirectInteraction trait because it will let you specify a region of the app where VoiceOver gestures pass through directly to the app as if an element was interacted with directly by the user.

In addition to the .allowsDirectInteraction trait, there are two new direct touch options supported: silentOnTouch and requiresActivation. Let’s go back to our example of an app that features a virtual piano; it would be a much better experience if a user could touch a piano key and just hear the corresponding note without any announcements from VoiceOver.

If we use the silentOnTouch option, VoiceOver will be silent when a piano key is touched, so the user will be able to hear the sound from the key without any additional audio from VoiceOver announcements.
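Here’s a minimal SwiftUI sketch of a single piano key using the silentOnTouch option; the playNote helper is a stand-in for the app’s real audio playback:

import SwiftUI

struct PianoKeyView: View {
    let note: String

    var body: some View {
        Button {
            playNote(note)
        } label: {
            Rectangle()
                .fill(.white)
                .frame(width: 44, height: 160)
        }
        .accessibilityLabel("\(note) key")
        // Pass touches straight through to the key and keep VoiceOver quiet,
        // so the user hears only the note itself.
        .accessibilityDirectTouch(options: .silentOnTouch)
    }

    // Hypothetical helper standing in for real audio playback.
    private func playNote(_ note: String) {
        print("Playing \(note)")
    }
}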

The requiresActivation direct touch option requires VoiceOver to activate the element before touches are passed through to the app; when this option is specified, a direct touch element will not receive the user’s touch events until it has been activated with VoiceOver. This trait is new in SwiftUI this year but is already available in UIKit.

accessibilityContentShape Kind

The new accessibility content shape kind in SwiftUI allows developers to set an element’s accessibility path to an existing SwiftUI shape provided by the developer. Consider a screen with circular buttons; if we specify a circle with the accessibility content shape kind, we can avoid having the VoiceOver cursor obscure elements on the screen due to the mismatch between the shape of the cursor and the shape of the element. What’s great about the accessibility content shape kind is that it doesn’t impact the hit-testing shape; it only changes the shape of the accessibility content.
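Here’s a small sketch of a circular button whose VoiceOver cursor matches its shape; the button content and label are illustrative:

import SwiftUI

struct RoundAddButton: View {
    var body: some View {
        Button {
            // hypothetical add action
        } label: {
            Image(systemName: "plus")
                .foregroundStyle(.white)
                .padding(24)
                .background(Circle().fill(.blue))
        }
        .accessibilityLabel("Add item")
        // Give the VoiceOver cursor the button's circular outline without
        // changing the hit-testing shape.
        .contentShape(.accessibility, Circle())
    }
}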

Block-based attribute setters

Block-based attribute setters let a developer provide a closure that is evaluated when the attribute is needed, instead of storing a value directly. Each time a view is referenced or accessed by an assistive technology, the closure is re-evaluated, which makes it much easier to keep accessibility attributes accurate and up to date. Block-based attribute setters will be available in UIKit.
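Here’s a small UIKit sketch using one of the block-based setters, accessibilityValueBlock, as I understand the new API; the zoom view and value formatting are illustrative:

import UIKit

final class ZoomViewController: UIViewController {
    private let zoomView = UIView()
    private var zoomLevel = 1.0  // updated elsewhere as the user zooms

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(zoomView)

        zoomView.isAccessibilityElement = true
        zoomView.accessibilityLabel = "Zoom preview"
        // The closure is re-evaluated every time an assistive technology asks
        // for the value, so it always reflects the current zoom level.
        zoomView.accessibilityValueBlock = { [weak self] in
            guard let self else { return nil }
            return "\(self.zoomLevel)x zoom"
        }
    }
}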

In Summary

This year’s WWDC brought lots of exciting updates to the Apple ecosystem, from the brand-new Apple Vision Pro headset to accessibility additions in UIKit and SwiftUI that will help developers make their apps more accessible. Even with all of the new accessibility goodies introduced this year, it is important to remember that there is no substitute for testing your app first-hand with assistive technologies! I hope you all enjoyed WWDC and all of the new accessibility goodies for developers as much as I did! Thanks for reading, and happy coding!


About Kate Owens

Kate Owens has been an iOS Product Developer at Deque Systems since 2019.
