Thoughts About Functionality Missing From the WAI-ARIA Specification

While walking through Loring Park this morning in three inches of freshly fallen snow, I started thinking about the widgets we are making on our current contract and some issues I've had with WAI-ARIA.

One widget used a WAI-ARIA role that JAWS interpreted as a tabpanel. JAWS took it upon itself to announce that the user should use the arrow keys to navigate the widget. The problem is that the widget was not a tabpanel, nor could the arrow keys be used to navigate it.

The WAI-ARIA spec should allow the developer to associate keystrokes with functions and have the screen reader (optionally) read the navigation features of the widget when focus is placed inside it.

As a user of assistive technology tabs through the page or jumps to a landmark, if the screen reader is configured to 'read navigation cues', it would speak something like:

'Grid widget. Keyboard Navigation: F5 to refresh. Arrow keys up and down to change rows. Arrow keys left and right to change columns.'

This way, the developer can provide a much better description to the user than having the screen reader guess at a widget's navigation based on its WAI-ARIA role. While following the design patterns for WAI-ARIA widgets is a good baseline, developers should have a means of conveying each widget's keyboard navigation to the screen reader.

I'll post this to the working group and see what people think.