Actually, it could be even easier to write tests for the screen reader workflow, since the interactions are all text I/O and key presses.
> I'm not convinced at all by most of the heuristic-driven ARIA scanning tools.
The ARIA scanning tools I mean are the ones that throw an error when they see an element missing an attribute, without ever invoking a real screen reader.
I'm arguing for automated test scripts that use tools like Guidepup to launch a real screen reader and assert, for example, that the new content added by fetch() is actually read out to the user after the form submission has completed (roughly the kind of script sketched below).
I want LLMs and coding agents to help me write those scripts, so I can run them in CI along with the rest of my automated tests.
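Something like this, as a rough sketch of what I mean -- the Guidepup calls are from my memory of their docs, and the navigation steps, expected phrases, and the assumption that a browser is already sitting on the page under test are all placeholders, not a working recipe:

```typescript
import { voiceOver } from "@guidepup/guidepup";

// Sketch: drive a real screen reader, activate the form's submit button,
// and assert that the status message injected by fetch() is announced.
// The "submit" / "Your order was received" strings are hypothetical.
async function checkFormSubmissionIsAnnounced(): Promise<void> {
  await voiceOver.start();
  try {
    // Assume the browser is already open on the page under test
    // (e.g. launched separately by Playwright or a prior setup step).

    // Step through the page until the screen reader reaches the submit button.
    let item = "";
    while (!item.toLowerCase().includes("submit")) {
      await voiceOver.next();
      item = await voiceOver.lastSpokenPhrase();
    }

    // Activate the button with a plain keystroke.
    await voiceOver.press("Enter");

    // Keep reading; the confirmation added by fetch() should be spoken.
    await voiceOver.next();
    const log = await voiceOver.spokenPhraseLog();
    if (!log.some((phrase) => phrase.includes("Your order was received"))) {
      throw new Error("fetch()-injected confirmation was never read out");
    }
  } finally {
    await voiceOver.stop();
  }
}
```

In practice you'd want the browser setup and the screen reader assertions in the same test runner (I believe Guidepup ships a Playwright integration for exactly that) so the whole thing can run as one CI job.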
Guidepup already exists; if people cared, they'd use it for tests with or without LLMs. Thanks for showing me this tool BTW! I agree testing against real screen readers is better than using a third party's heuristics.
Testing is a professional skill -- not all blind people are good at accessibility testing, just as not all sighted people are good at GUI testing.
My team has carved out an accessibility budget so that every couple of years we can hire an accessibility consultancy (which employs a couple of blind testers) for a few tens of hours of work to review one of our application workflows. Based on the issues they identify, we attempt to write tests to prevent those classes of issues across the whole application suite, but our budget means that less than one percent of our UI has ever been functionally tested for accessibility.
It comes down to cost/benefit. Good testers are expensive; good accessibility testers doubly so. And while I personally think there's a moral imperative and maybe a marketing angle, improving accessibility honestly doesn't seem to meaningfully improve sales. But if the testing costs came down by a couple of orders of magnitude, it would be a complete game-changer.
Also, try using your app/site without a mouse. I've found it funny how many experienced, sighted testers don't actually know the keyboard navigation for things like triggering menus, select boxes, etc. Personally, I don't think I could get used to the voice navigation myself; it's not that it doesn't work, it's just kind of noisy. Although most sites are excessively noisy visually, imo.
But usability testing with blind users presents some unique challenges. A past org I worked at ran some usability studies with blind users [1], and while I was only tangentially involved in that project, it seemed that subject recruitment and observation were much more complex than in typical usability studies. I haven't managed to run a usability study with blind participants at my current org, though we have discussed ways we could recruit blind users for studies -- our software is complex enough that we'd need someone who is both blind and a prospective user of it.
[1] https://www.bloomberg.com/ux/2018/08/28/visually-impaired-wo...
I still want them to be accessible!
(The amount of accessibility testing I want to do would bankrupt me very quickly.)