My graduate work was largely devoted to studying the way that users interact with software. These studies prodded me to consider as many types of users as possible. To think about the very young, and the very old. To consider buttons that are too close together, or improperly labeled. To think about the flow of software and how best to guide the user through its use. The fields had long and confusing names like Human-Computer Interaction and Usability. In this graduate work we considered the hard of hearing. We marveled at the technological advances made for paraplegics. We considered low-vision users, and the color schemes that work best for them.
We did not consider development for the blind.
Flash forward fifteen years. The technology is there for the blind now: there are screen readers. There is software that lets the user dictate letters and emails. There is an operating system, and a mobile operating system, that devote a substantial part of each development cycle to accessibility for the blind and visually impaired, among the other accessibility features they provide.
My partner is blind, and discovered that the iPhone provides more accessibility than any other phone we have seen. As a mobile developer, I find this very exciting. Not only is iOS built for accessibility, but the SDK allows the applications developed for it to become accessible too. Of course, being headlong involved in the Mach-10 development pace of the applications I was working on, I made a note to ensure my app was accessible once I could claim a little time to circle back to it. I did run my app with VoiceOver on, and found to my surprise that some of it worked properly without even trying. Smart VoiceOver, I thought.
Two years went by. During that time I watched, time and again, as my partner downloaded an app, ran it, and found that it was not accessible. This was frustrating for us both. Especially when I finally stopped, investigated, and discovered that providing basic VoiceOver support is as simple as following good coding practice. It is built in. Seriously. When I look at an app now with VoiceOver turned on and hear it say “button,” “button,” “button,” I cringe. All the developer has to do is (wait for it) name the button. Name the text field. Name the image.
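To make that concrete, here is a minimal Swift sketch of what “naming” means in UIKit terms. The view controller and control names are hypothetical, but accessibilityLabel and isAccessibilityElement are the real UIAccessibility properties (the same properties exist in the Objective-C SDK of the era):

```swift
import UIKit

final class ProfileViewController: UIViewController {
    // Hypothetical controls, purely for illustration.
    private let saveButton = UIButton(type: .system)
    private let nameField = UITextField()
    private let avatarView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Without a label, VoiceOver announces this as just "button".
        saveButton.accessibilityLabel = "Save"

        // VoiceOver appends the trait ("text field") itself, so the
        // label should not repeat it.
        nameField.accessibilityLabel = "Name"

        // Plain image views are silent to VoiceOver unless they are
        // marked as accessibility elements and given a label.
        avatarView.isAccessibilityElement = true
        avatarView.accessibilityLabel = "Profile photo"
    }
}
```

That is the whole trick: one string per element, and VoiceOver reads something meaningful instead of “button.”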
Sure, there are other things that can be done to embrace VoiceOver more thoroughly: hints that can be spoken if the user appears lost, for example, as sketched below. Or avoiding UI constructs that overlay other UI elements, so that they do not get read interleaved (that is, ensuring the VoiceOver user hears the same thing a sighted user would see). After this discovery, I have to say that I will never again see any reason for an app not to be accessible. It is simply lazy not to name your controls and UI elements.
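Hints use the same mechanism as labels. Continuing the sketch above, with the hint wording invented for illustration:

```swift
// A hint describes what activating the element will do. VoiceOver
// reads the label and trait first ("Save, button"), then speaks the
// hint after a short pause, for users who linger on the element.
saveButton.accessibilityHint = "Saves your changes and returns to the previous screen."
```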
For the record, the same holds true across the board: screen readers simply need the elements of the UI to be named. Otherwise the user gets read the source code of a web page, or worse, nothing. A big blank. That happens on Windows when elements are not named. Sure, the user can launch the application, but then… nothing.
I would challenge every developer on every platform to do all that they can to make their work accessible. Help to build an accessible world.