Not really specific to web development, but I see enough comments on open source projects about how to deal with {{some group here}} on the web that I think it’s a good perspective to remind web developers about.
Crazy Cripple Chick is just doing a rant here, but it’s such an important point:
Here’s what this means in web terms, where I’ll just pick on our “most popular” disability for web developers:
When thinking of blind users, it can be easy to fall into the ARIA trap of designing only for “screen reader users”. Take a complex page, for example an e-commerce product page where, after the user clicks the “add to cart” button, the update appears in the top-right basket area (top-right is almost a convention for shopping carts, so most users would guess to look there first even on a new site).
Non-binary disability means that, okay, maybe your visually-impaired user is blind or mostly blind and using a screen reader, and so you thought to add an aria-live region to the shopping cart area in the upper right. A live region means the screen reader user gets the update that the cart contents have changed without needing to navigate there (they can stay focussed on the button they’ve clicked), and when set to “polite”, it won’t interrupt a user who may be in the middle of something, but will wait to inform them when they’ve paused.
<p aria-live="polite" aria-atomic="true">
  <span class="basket_total">€ 0,00</span>
  <span>0</span> <span>products</span>
</p>
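For the announcement to actually fire, a script has to change the contents of that region after the item is added. A minimal sketch of what that could look like (the `updateBasket` and `basketSummary` names and the euro formatting are my own assumptions, matching the markup above, not any particular shop’s code):

```javascript
// Hypothetical sketch: update the aria-live region after "add to cart".
// Assumes the markup from the example above (.basket_total etc.).

// Pure helper: build the basket summary text from a count and a total
// in cents (comma as decimal separator, as in the markup above).
function basketSummary(count, totalCents) {
  const euros = (totalCents / 100).toFixed(2).replace(".", ",");
  return { total: "€ " + euros, items: count + " products" };
}

function updateBasket(count, totalCents) {
  const summary = basketSummary(count, totalCents);
  // Because the region is aria-live="polite", the screen reader announces
  // this change when the user pauses, without moving their focus away
  // from the button they just clicked.
  document.querySelector(".basket_total").textContent = summary.total;
  // ...update the count spans similarly
}
```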
But your visually-impaired user might not be using a screen reader. Even people who would greatly benefit from a screen reader may have never heard of one, and instead use an OS-built-in screen magnifier, or just a sh*tton of browser-zooming, or set their display to 800x600 and put their faces inches away from the screen. They’re visually impaired, they have low vision, but they didn’t get the aria-live update, because it’s only exposed to screen readers. With a screen magnifier, a user may only see a part of the page the size of a credit card at any moment.
That means when they click the “Add to cart” button, there’s no visible feedback. The upper-right corner of the page is waaaay offscreen. Being aware that the spectrum of bad eyes is broad means you may think of this scenario as well and do things like
- change the button. Maybe the text changes from “add to cart” to “added!”, or the colour changes to your usual disabled state, or a checkmark icon appears.
- add text near the button: “in cart”, or even one of those temporary javascripted messages that fade out after a while (but not too fast… I always worry I can’t read them fast enough, and I’m not a slow reader, really).
- possibly add a sound. This may or may not be a good idea depending on what your page is doing and who it’s for, but especially for more application-like services, a small alert sound could be a worthy addition. Of course, users may not have speakers, or hearing…
- similarly, on a device with a vibration API, you could add a tactile response. Like the small sound, it depends on things you have no control over (the user may be on a mobile device but using a stylus instead of fingers), so it needs to accompany one of the first two options.
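The first, second and fourth ideas above could be sketched roughly like this (all names here are hypothetical, and the message duration logic is just my assumption of a sensible minimum for slower readers):

```javascript
// Hypothetical sketch: layered, visible-at-the-button feedback for
// "Add to cart". Element names and timings are assumptions.

// Pure helper: how long to show a temporary message, scaled by its
// length, with an assumed 3-second minimum so it doesn't fade before
// slower readers finish it.
function messageDuration(text) {
  const msPerChar = 100;
  return Math.max(3000, text.length * msPerChar);
}

function showAddedFeedback(button) {
  // 1. Change the button itself: the user is already looking right here.
  button.textContent = "Added!";

  // 2. Temporary text near the button, removed after a readable delay.
  const note = document.createElement("span");
  note.textContent = "Item is in your cart";
  button.insertAdjacentElement("afterend", note);
  setTimeout(() => note.remove(), messageDuration(note.textContent));

  // 3. Tactile response where the Vibration API exists; harmless no-op
  // elsewhere, which is why it only ever accompanies the visible cues.
  if (navigator.vibrate) navigator.vibrate(50);
}
```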
By addressing the sighted end of the visual spectrum, you’ve now also reached the cognitive spectrum (people who focus their attention on the button while clicking it now notice the feedback much more easily), and then all your many users who, while they wouldn’t call themselves disabled in any way, tend to miss stuff going on in the rest of the page because in fact they too were focussed on the button they clicked.
Would your notification be noticed if your user simply had a very zoomed-in browser? What if your user’s browser, when zoomed enough, switches to your mobile stylesheet? Does everything still work as expected, even if the user is actually sitting behind a desktop machine and doesn’t have a touch screen? (Besides the blind-means-ARIA fallacy, I’ve seen others fall, and have fallen myself, into the “small screen means touch” fallacy.)
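One way to dodge the “small screen means touch” fallacy is to ask the browser about the pointer directly instead of inferring it from viewport width. A sketch (the function name is mine; the `pointer: coarse` media feature is standard CSS Media Queries Level 4, and the `typeof window` guard is just so it degrades outside a browser):

```javascript
// Hypothetical sketch: detect a coarse pointer (finger) rather than
// guessing "touch" from a small or zoomed viewport.
function prefersTouchTargets() {
  // A zoomed-in desktop browser may match your mobile breakpoint, but
  // its mouse is still a fine pointer, so this stays false there.
  return typeof window !== "undefined" &&
         window.matchMedia("(pointer: coarse)").matches;
}
```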
And lastly for this blargle on vision, there are tired eyes. Not just “I’ve worked all day and my eyes are tired”, but also like my husband’s eye, which can see at the beginning of the day, but over time areas start going black and it gets harder and harder to see anything with it. People with cataracts, changing eye prescriptions and degenerative eye diseases often have times where they see better and times where they see worse.
A user will (hopefully!) visit your page several times. If you’re a shop, returning customers are what you want! Something a developer (and the UX person) should keep in mind: a user with varying vision may rely on things being in certain places when re-navigating a page. Of course we constantly update our stuff, fix bugs, add new features, etc, and All Users Hate Change (because it makes them do extra work they previously didn’t have to do, to do the same task). Keeping in mind the idea of disability as a range, rather than “there’s normal users and screen reader users”, can help you decide where to place new things or how to change designs without getting too many users lost.
And, I guess I hope a bit of advocacy gets through as well… if you see people assuming binary disability (“you’re either completely paralyzed in that chair and can do absolutely nothing, or you’re perfectly-abled and therefore faking it because you’re lazy”), step in and educate. This is a hurtful ignorance.