We strive to make our visual interfaces intuitive, so that our users find our apps easy to understand and use. However, for users who rely on VoiceOver, the visual interface is often secondary or even invisible within their experience of the same app. Highly recommended by Apple, but rarely done well, it's possible to communicate the visual interface via accessibility traits, labels, values and hints. Done well, a blind person can complete the same tasks in the same app as a sighted person.
Sounds easy, right? But making sure every element of the interface and every step of every flow makes sense for many different use cases is difficult and time-consuming. As an independent team, we chose to spend the time to create a fast, inclusive experience.
Most of the time goes into the details, and the details can make or break the experience. Like the adaptive tab bar, which first shrinks labels then removes them entirely after a month... adaptive accessibility could remove certain hints and replace labels with shorter versions, letting experienced users move faster through familiar flows.
Hopefully, more apps will take the time to become more inclusive. To encourage this, we have broken down the main aspects of iOS accessibility, with code showing how to include them in your apps, and a few tricks we learnt.
Traits let VoiceOver know the type of an element and how it can be interacted with, so when a user selects it, VoiceOver can offer the most relevant information at the right time.
All standard UIKit components provide correct accessibility traits; however, when building custom UI elements you have to maintain the traits manually.
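As a sketch of what maintaining traits manually looks like, here is a hypothetical custom control (the `RatingControl` name and behaviour are our own illustration, not from UIKit) that describes itself to VoiceOver:

```swift
import UIKit

// A hypothetical custom control drawn from scratch rather than composed
// from standard UIKit components, so VoiceOver knows nothing about it
// unless we describe it ourselves.
final class RatingControl: UIControl {

    var rating: Int = 0 {
        didSet {
            // Keep the spoken value in sync with the visual state.
            accessibilityValue = "\(rating) out of 5"
        }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Rating"
        // Tell VoiceOver this element behaves like a button
        // and that its value can be adjusted.
        accessibilityTraits = [.button, .adjustable]
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```

With the `.adjustable` trait set, you would also override `accessibilityIncrement()` and `accessibilityDecrement()` so swipe-up and swipe-down gestures change the value.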
There is one private trait used for the back button in UINavigationController that we found particularly useful when implementing onboarding in our apps, where we use a custom chevron button with a customised accessibility label. Adding the following accessibility mask makes VoiceOver announce an otherwise regular UIBarButtonItem as a back button:
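The shape of the patch is roughly the following. Note this is a sketch only: the raw value of the private back-button trait is not part of the public SDK, so it appears here as a placeholder constant, and relying on private traits may break between iOS releases.

```swift
import UIKit

// Placeholder: substitute the private back-button trait's raw value here.
// It is intentionally not reproduced, as it is not public API.
private let backButtonRawTrait: UInt64 = 0

extension UIBarButtonItem {
    func markAsBackButton() {
        // OR the private mask into the existing traits so VoiceOver
        // announces the item as a back button.
        let combined = accessibilityTraits.rawValue | backButtonRawTrait
        accessibilityTraits = UIAccessibilityTraits(rawValue: combined)
    }
}
```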
The accessibility frame defines the focus area for an accessibility element.
When used within scroll views, it affects the way VoiceOver scrolls content while navigating between elements. We use extended boundaries for buttons and text fields to make sure they don't stick awkwardly to the scroll view's bounds, keeping enough space around them and providing better visual feedback.
Accessibility frames are calculated in screen coordinates; UIKit has a helper function for this. However, views with custom accessibility frames contained in scroll views have to recalculate their frames on scroll, which we do with the help of -scrollViewDidScroll:.
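A minimal sketch of that recalculation, using UIKit's `UIAccessibility.convertToScreenCoordinates(_:in:)` helper (the controller and button names are illustrative):

```swift
import UIKit

final class FormViewController: UIViewController, UIScrollViewDelegate {

    let scrollView = UIScrollView()
    let continueButton = UIButton(type: .system) // element with a custom accessibility frame

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        // The accessibility frame is in screen coordinates, so it goes
        // stale as soon as the content offset changes.
        updateAccessibilityFrames()
    }

    private func updateAccessibilityFrames() {
        // Convert the button's bounds to screen coordinates, then pad the
        // result so the focus rectangle keeps some breathing room around
        // the element instead of hugging the scroll view's edges.
        let screenFrame = UIAccessibility.convertToScreenCoordinates(
            continueButton.bounds,
            in: continueButton
        )
        continueButton.accessibilityFrame = screenFrame.insetBy(dx: -8, dy: -8)
    }
}
```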
Trapping VoiceOver within a modal view
We use custom modal transitions in Money In and Out, which don't cover the screen entirely, leaving a small transparent overlay to give context to the view beneath. The same goes for action sheets.
We would expect VoiceOver to be trapped within our modal view; however, it allowed the user to navigate back into the view below... the accessibilityViewIsModal property should remedy this, but it didn't work for us.
After some research, we discovered that the system's UITransitionView is used as a container for the modal transition and always returns NO from accessibilityViewIsModal, so we made a patch to manipulate it:
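One way to sketch such a patch (an assumption on our part, not necessarily the original implementation) is to walk up the superview chain from the presented view after it appears, find the privately-named container by its class name, and set the modal flag on it directly:

```swift
import UIKit

final class ModalCardViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Walk up to the container the system inserted for the custom
        // transition (a private UITransitionView) and mark it as modal,
        // so VoiceOver cannot escape into the view beneath.
        var ancestor = view.superview
        while let current = ancestor {
            if String(describing: type(of: current)) == "UITransitionView" {
                current.accessibilityViewIsModal = true
                break
            }
            ancestor = current.superview
        }
    }
}
```

Matching on the class name string avoids referencing private API symbols directly, though it remains fragile if the class is renamed in a future iOS release.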
Annoyingly, a UISearchController without a dedicated controller for displaying search results traps focus within the search bar. We use the same accessibilityViewIsModal trick to let VoiceOver navigate the interface normally:
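A hedged sketch of applying the trick in reverse, clearing the modal flag once the search UI is presented (the delegate-based hook here is our assumption about where to apply it):

```swift
import UIKit

final class ContactsViewController: UIViewController, UISearchControllerDelegate {

    // No results controller: the search UI is presented over this view.
    let searchController = UISearchController(searchResultsController: nil)

    override func viewDidLoad() {
        super.viewDidLoad()
        searchController.delegate = self
        navigationItem.searchController = searchController
    }

    func didPresentSearchController(_ searchController: UISearchController) {
        // Clear the modal flag on the presented search view and its
        // container so VoiceOver can reach the rest of the interface
        // instead of being trapped in the search bar.
        searchController.view.accessibilityViewIsModal = false
        searchController.view.superview?.accessibilityViewIsModal = false
    }
}
```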
Navigation controller & back button
When the layout changes or a new view is shown, the standard behaviour of UINavigationController frustratingly tends to set focus on the back button. So, instead of announcing the name of the new view when it opens, VoiceOver would announce the back button. Unfortunately, the navigation controller does not expose its title label, so we had to traverse the navigation bar hierarchy, like this:
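A sketch of that traversal, assuming the goal is to find the UILabel whose text matches the current title and move VoiceOver focus to it (the extension and method name are our own):

```swift
import UIKit

extension UIViewController {
    // Finds the UILabel inside the (private) navigation bar hierarchy whose
    // text matches the current title, and moves VoiceOver focus to it
    // instead of the back button.
    func focusVoiceOverOnTitle() {
        guard let navigationBar = navigationController?.navigationBar,
              let title = navigationItem.title else { return }

        // Breadth-first search of the navigation bar's subview tree.
        var queue: [UIView] = [navigationBar]
        while !queue.isEmpty {
            let view = queue.removeFirst()
            if let label = view as? UILabel, label.text == title {
                UIAccessibility.post(notification: .screenChanged, argument: label)
                return
            }
            queue.append(contentsOf: view.subviews)
        }
    }
}
```

Calling this from `viewDidAppear(_:)` posts a `.screenChanged` notification with the title label as the argument, which tells VoiceOver to announce it and move focus there.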