
Do you happen to know of best practices for simulating different disabilities, or some recording of somebody actually using some example well-designed accessible software? A video recording, to me as somebody who can see and hear and move just fine (for now), would help me understand what interaction methods don't work or are a struggle, which interaction methods are intuitive and effective, and which types of content are not recognized, not presented, or not important.



Lighthouse [1][2] accessibility audits/scores can be a good tool for checking a11y issues automatically, and it is fairly easy to integrate into your CI/CD pipelines so the report is generated on every build and you can check for new issues or regressions.

The web accessibility world is complex enough that simulating every method of access, like using screen readers or different font sizes, and manually testing for compliance is not particularly feasible if you want to make your content accessible to all.

[1] https://developers.google.com/web/tools/lighthouse [2] https://web.dev/accessibility-scoring/
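
For what it's worth, here is a minimal sketch of what such a CI check could look like, using the lighthouse and chrome-launcher npm packages (the URL and the 0.9 threshold are just placeholders, not a recommendation):

  // a11y-check.ts - run only Lighthouse's accessibility category and fail CI below a threshold
  import lighthouse from 'lighthouse';
  import * as chromeLauncher from 'chrome-launcher';

  async function main() {
    const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
    try {
      const result = await lighthouse('https://example.com', {
        port: chrome.port,
        onlyCategories: ['accessibility'],
        output: 'json',
      });
      const score = result?.lhr.categories.accessibility.score ?? 0;
      console.log(`Accessibility score: ${score}`);
      if (score < 0.9) process.exit(1); // arbitrary example threshold
    } finally {
      await chrome.kill();
    }
  }

  main();

The Lighthouse team also ships a dedicated CI wrapper (Lighthouse CI) if you would rather not script this yourself.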


I disagree. Automated results are not perfect and miss a lot of nuances, like element ordering, tab indexing, etc. Assuming you can’t find someone experienced with a screen reader to test, I would invest in learning how to use one yourself. I’ve spent days without access to my screen in order to grow more comfortable navigating with audio. Of course, I won’t be as helpful or quick as someone who’s used a screen reader for years, but it’s better than blindly trusting lighthouse/web.dev.


> but it’s better than blindly trusting lighthouse/web.dev

Interesting use of "blindly". Using lighthouse or other accessibility checkers is better than not considering accessibility at all, and has a much lower barrier for developers and development orgs than integrating screen readers into the development lifecycle.

I was in an org where our QA team actually used our sites with screen readers, and yet we still ran automated accessibility tests on our codebase. That way we could catch issues earlier and more easily, and reduce the number of issues making it to manual QA, which is much more time-consuming and expensive.

Manual QA testers using screenreaders are also not perfect and miss a lot of nuances :)
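
As a concrete illustration of that kind of early automated check (not what any particular org uses), axe-core can be run inside ordinary unit tests, e.g. via jest-axe; the form markup below is just a made-up example:

  // form.a11y.test.ts - flag obvious violations (like an unlabelled input) before manual QA
  // Requires a jsdom test environment so document.body exists.
  import { axe, toHaveNoViolations } from 'jest-axe';

  expect.extend(toHaveNoViolations);

  test('signup form has no detectable a11y violations', async () => {
    document.body.innerHTML = `
      <form>
        <label for="email">Email</label>
        <input id="email" type="email" />
        <button type="submit">Sign up</button>
      </form>
    `;
    const results = await axe(document.body);
    expect(results).toHaveNoViolations();
  });

Tests like this only catch what the rule engine can detect; ordering, focus flow, and overall usability still need the manual screen reader pass described above.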


Some time ago I helped build https://ds.gpii.net, which features a lot of resources for developers regarding accessibility. It has a component search that links most of the automated testing tools (scraped and curated from GitHub). Videos of assistive tech use would have been a nice addition, but there are already a lot of so-called quicksheets to get you started. One USP we wanted was a community of testers [1]; unfortunately that did not really happen at the time (there is a matchmaking form, though). Anyone interested in continuing this kind of work should probably contact Gregg Vanderheiden, who I believe wrote the first web a11y guideline back in the 90s and is a WCAG editor.

[1] https://ds.gpii.net/connect/testers



