I’ll admit, adding UI testing to an app that doesn’t currently have any is probably stretching the definition of a quick win, but the aim here isn’t 100% coverage - not right away, anyway. Start small and add to your test suite as you gain confidence. Even a small suite of crucial happy-path UI tests will help to ensure and preserve accessibility in your app. And the more comfortable you get with UI tests, the more accessible your apps will become, because an app that is easy to test is also great for accessibility.
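As a flavour of how small such a test can be, here’s a hedged sketch - the screen, identifiers, and credentials are all hypothetical:

```swift
import XCTest

// A minimal happy-path UI test. The accessibility identifiers
// ("email", "password", "logIn", "Welcome") are placeholders -
// substitute your own screen's elements.
final class LoginHappyPathTests: XCTestCase {

    func test_logIn_showsWelcomeScreen() {
        let app = XCUIApplication()
        app.launch()

        app.textFields["email"].tap()
        app.textFields["email"].typeText("test@example.com")

        app.secureTextFields["password"].tap()
        app.secureTextFields["password"].typeText("password123")

        app.buttons["logIn"].tap()

        // If this element can be queried by identifier, the screen
        // is navigable by assistive technologies too.
        XCTAssertTrue(app.staticTexts["Welcome"].waitForExistence(timeout: 5))
    }
}
```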
If you have a regulatory requirement to provide accessibility in your app (spoiler: you do), the chances are it will require you to reach WCAG AA. While this is likely meaningless to anyone other than accessibility professionals, in short it means you are providing the minimum level of accessibility features required to make your app usable by the majority of people.
This post is about one such requirement, the jazzily titled Success Criterion 1.3.4, Orientation. 1.3.4 is often overlooked: I’ve seen it forgotten by accessibility auditors, overlooked by testers, removed by engineers, and ignored by designers. Yet this feature is one that is already enabled by default in your app - one that you have to choose to disable. On both Android and iOS, a newly created app supports landscape mode out of the box, and all you need to do to keep supporting it is build robust interfaces and not disable it. Yet often one of the first things developers do when creating a new app is to disable landscape mode.
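On iOS, supporting orientation is the default; locking it is the opt-out. A hedged UIKit sketch - you’d only need this override if something upstream had already locked orientation:

```swift
import UIKit

final class OrientationFriendlyViewController: UIViewController {

    // UIKit already allows every orientation enabled in the target's
    // "Supported interface orientations" setting. Returning .all keeps
    // WCAG 2.1 Success Criterion 1.3.4 (Orientation) intact; returning
    // .portrait here is the opt-out that breaks it.
    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        .all
    }
}
```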
Many people don’t realise dark mode is an accessibility feature. It’s often considered just a nice-to-have, a cool extra that power users will love. But dark mode is also a valuable accessibility feature. Some types of visual impairment can make it painful to look at bright colours, or large blocks of white can wash out the black text. Some people with dyslexia or Irlen Syndrome can struggle to read black text on a white background. Like almost any accessibility feature, dark mode is about providing flexibility to your users so they can use your app in the way that is most comfortable for them.
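Supporting it doesn’t have to mean redesigning every screen; on iOS a colour can adapt itself. A minimal UIKit sketch, with placeholder colours:

```swift
import UIKit

// A colour that adapts to dark mode without an asset catalogue.
// The specific values are placeholders.
extension UIColor {
    static let cardBackground = UIColor { traits in
        traits.userInterfaceStyle == .dark
            ? UIColor(white: 0.1, alpha: 1.0) // near-black in dark mode
            : .white                          // white in light mode
    }
}
```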
How many shades of grey do you use in your app? OK, maybe that’s a bit cruel towards designers - grey is a great colour - but the problem with grey is that it can be deceptively difficult to distinguish from a background. And this problem is not limited to greys: lighter colours too can blend into the background. The effect can be heightened for people who have blurred or obscured vision, or one of the many forms of colour blindness.
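You don’t have to eyeball it: WCAG defines a contrast ratio you can compute. A minimal Swift sketch of the WCAG 2.x formula:

```swift
import Foundation

// Relative luminance per WCAG 2.x; channels are sRGB values in 0...1.
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func linearise(_ c: Double) -> Double {
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05).
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    let (lighter, darker) = (max(l1, l2), min(l1, l2))
    return (lighter + 0.05) / (darker + 0.05)
}

// Mid-grey text (#999999) on a white background:
let grey = relativeLuminance(r: 0.6, g: 0.6, b: 0.6)
let white = relativeLuminance(r: 1, g: 1, b: 1)
print(contrastRatio(grey, white)) // ≈ 2.85 - fails WCAG AA's 4.5:1 for body text
```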
Images are a major part of our apps. They add meaning and interest, they give your app character and context. The adage is that a picture is worth a thousand words. But if you can’t see the image clearly, how do you know what those words are?
If you aren’t providing image descriptions in your app, many of your users will be missing out on the experience you’ve crafted. The result can be an app that’s missing that spark and character, or worse, an app that’s just meaningless and unusable. Fortunately, adding image descriptions is simple.
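In UIKit, an image description is just an accessibility label. A minimal sketch, with the image and wording invented:

```swift
import UIKit

// An informative image gets a description for VoiceOver.
let heroImageView = UIImageView(image: UIImage(named: "sunset-harbour"))
heroImageView.isAccessibilityElement = true
heroImageView.accessibilityLabel = "Fishing boats moored in a harbour at sunset"

// Purely decorative images should stay hidden from VoiceOver instead:
// isAccessibilityElement = false, which is UIImageView's default.
```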
Each year at WWDC, Xcode Santa brings us exciting new APIs to play with, and this year our accessibility present is Customized Accessibility Content. This API flew under the radar a little; I’m told this is because it’s so new there wasn’t even time to include it at WWDC. But this new feature helps to solve a difficult question when designing a VoiceOver interface: where is the balance between too much and too little content?
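A hedged sketch of how the API fits together (iOS 14+, from the Accessibility framework; the cell and its details are invented for illustration):

```swift
import UIKit
import Accessibility

// A contact cell reads its primary label as normal; the extra detail
// below is available on demand via VoiceOver's "more content" gesture.
final class ContactCell: UITableViewCell, AXCustomContentProvider {

    var accessibilityCustomContent: [AXCustomContent]! {
        get {
            let email = AXCustomContent(label: "Email", value: "jane@example.com")
            let lastContacted = AXCustomContent(label: "Last contacted", value: "Yesterday")
            // .high importance is read automatically; .default waits to be asked.
            lastContacted.importance = .high
            return [email, lastContacted]
        }
        set { }
    }
}
```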
I was supposed to be attending the 2020 CSUN Assistive Technology Conference to present a couple of talks. Unfortunately, with COVID-19 starting to take hold at the time, I wasn’t able to make it. In lieu of attending, I decided to record one of the talks I was scheduled to present, on Accessibility in SwiftUI.
SwiftUI is Apple’s new paradigm for creating user interfaces on Apple platforms, and it has a bunch of new approaches that really help create more accessible experiences. The talk is based on the series of SwiftUI guides available on this site.
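As a flavour of what the talk (and the guides) cover, here’s a minimal sketch of SwiftUI’s accessibility modifiers - the button and its labels are placeholders:

```swift
import SwiftUI

struct PlayButton: View {
    @State private var isPlaying = false

    var body: some View {
        Button(action: { isPlaying.toggle() }) {
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
        }
        // SwiftUI infers a lot, but an explicit label beats "play dot fill".
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
        .accessibilityHint("Toggles audio playback")
    }
}
```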
Live Regions are one of my favourite accessibility features on Android. They’re a super simple solution to a common accessibility problem that people with visual impairments can stumble across.
Say you have a game app - really, any type of game. Your customer interacts with the play area, and as they do, their score increases or decreases depending on their actions. In this example, the score display is separate from the element your customer is interacting with. For a blind or partially sighted user, how will they know their interaction has affected their score?
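Live Regions are the answer. A minimal sketch in Kotlin, since Live Regions are an Android API; the layout, IDs, and strings are placeholders. Marking the score view as a live region means TalkBack announces any change to it automatically:

```kotlin
import android.os.Bundle
import android.view.View
import android.widget.TextView
import androidx.appcompat.app.AppCompatActivity

class GameActivity : AppCompatActivity() {

    private lateinit var scoreView: TextView

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_game)

        scoreView = findViewById(R.id.score)
        // POLITE waits for current speech to finish; ASSERTIVE interrupts.
        scoreView.accessibilityLiveRegion = View.ACCESSIBILITY_LIVE_REGION_POLITE
    }

    fun updateScore(newScore: Int) {
        // Because the view is a live region, simply updating the text
        // triggers a TalkBack announcement - no extra code needed.
        scoreView.text = "Score: $newScore"
    }
}
```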
A11yUITests is an extension to XCTestCase that adds tests for common accessibility issues, which can be run as part of an XCUITest suite. I’ve written a detailed discussion of the tests available if you’re interested in changing or implementing these tests yourself. Alternatively, follow this quick start guide.
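As a taste of what a test looks like, here’s a hedged sketch; a11yCheckAllOnScreen() is the entry point named in the project’s README at the time of writing, so check the current docs if the API has moved on:

```swift
import XCTest
import A11yUITests

final class AccessibilityTests: XCTestCase {

    func test_accessibilityOnHomeScreen() {
        let app = XCUIApplication()
        app.launch()

        // Runs every available accessibility check against every
        // element currently on screen.
        a11yCheckAllOnScreen()
    }
}
```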
Getting Started

Adding A11yUITests

I’m assuming you’re already familiar with CocoaPods; if not, cocoapods.org has a good introduction. There is one minor difference here compared to most pods: we’re not including this pod in our app, but in our app’s UI test bundle. This means your Podfile will look something like this.
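A minimal Podfile sketch - ‘MyAppUITests’ stands in for whatever your UI test target is called:

```ruby
# The pod belongs in the UI test target, not the app target.
target 'MyAppUITests' do
  use_frameworks!
  pod 'A11yUITests'
end
```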
Accessibility is important. We can take that as a given. But as iOS devs we’re not always sure how to make the most of the accessibility tools that Apple have provided us.
We’re lucky as iOS developers that we work on such a forward-thinking accessibility platform. Many people consider Apple’s focus on accessibility for iOS to be the driver for other technology vendors including accessibility features as standard, to the point that we now consider accessibility an expected part of any digital platform. This was not the case before 2009.