There's a PDF in there, about 10 pages (though only 7 have what you want to read).

So, some researchers got together and asked, "If more websites conform to WCAG, does this actually decrease the number of problems blind users run into on them?" and went to find out. Turns out it didn't seem to matter. Or, it mattered sometimes, but didn't stop a lot of problems, because those aren't even problems mentioned in WCAG.

First, the study:
They used 32 blind volunteers as testers. These people did not represent the whole swath of blind users: instead I'd say they prolly cluster near the middle of the Gaussian curve. They were mostly
- JAWS users
- IE users
- regular internet users (not n00bs or grandmas)

A group of sites was put together for testing*. The researchers had trouble finding AAA pages, so they focused on a simpler question: what's the difference in the problems the blind testers ran into on non-conforming pages (conforming to neither WCAG1 nor WCAG2) versus pages that technically conform? There were indeed fewer of certain types of problems on conforming pages: specifically, the types of problems that are actually covered in WCAG and whose guidelines the sites had conformed to.

*List of tested sites: (Lainey Feingold's site) (Mike Cherim's site, though I'm not sure he still updates it)

Some sites conformed to WCAG, some didn't. They're listed in order from most-conforming down to outright fails.

But as a web developer, you'd want to look specifically at Table 2, which lists most of the main problems people had.

What are those problems?

1. Content found in pages where not expected by users
Total user problems: 99
Covered by WCAG2? No.

2. Content not found in pages where expected by users
Total user problems: 88
Covered by WCAG2? No.

3. Pages too slow to load
Total user problems: 27
Covered by WCAG2? No.

4. No alternative to document format (e.g. PDF)
Total user problems: 17
Covered by WCAG2? No.

5. Information architecture too complex (e.g. too many steps to find pages)
Total user problems: 15
Covered by WCAG2? No.

6. Broken links
Total user problems: 10
Covered by WCAG2? No.

7. Functionality does not work (as expected)
Total user problems: 50
Number Covered by WCAG2: 13

8. Expected functionality not present
Total user problems: 29
Number Covered by WCAG2: 10

9. Organisation of content is inconsistent with web conventions/common sense
Total user problems: 39
Number Covered by WCAG2: 14

10. Irrelevant content before task content
Total user problems: 86
Number Covered by WCAG2: 39

11. Users cannot make sense of content
Total user problems: 65
Number Covered by WCAG2: 44

12. No/insufficient feedback to inform that an action has had an effect
Total user problems: 72
Number Covered by WCAG2: 49

In other words, a majority of the problems faced by these disabled volunteer testers weren't specifically accessibility-related: they were usability problems, which hit everyone (though people with severe disabilities get hit harder, and it takes them longer and more effort to overcome the problems). The majority of these problems look like they could have been found not by someone testing the pages for accessibility and blind adherence to the WCAG spec, but by user testing. And you could have hit most of these problems testing "regular" users; that is, you don't need to be able to find 30 blind people to find these problems: your grandmother and her bridge club would've found them just as easily.

Example of one of the problems (too much crap nobody cares about before the content the user's actually seeking):
Quote Originally Posted by article
For example, when users were seeking information about insurance plans, the relevant page had lengthy descriptions of why it was important to buy insurance before a summary of insurance plans, the relevant content on the page. While there is an SC (2.4.1) [edit: SC = Success Criterion] that addresses skipping blocks of content repeated on multiple pages (e.g. main menu bars), there is nothing in this SC, or any other SC, describing the types of problems associated with irrelevant content that is unique to a single page.
I remember we had this problem on many of our insurance pages. Where did the BS text come from? It was someone's attempt at having "content" for the freakin search engines. Oh, I could rage for days about the crap people throw on websites "for the googles". I'm going to start saying this *hurts* your websites. Don't listen to the craptastic SEO advice about what's good for search engines. Getting people onto your page via search engines only for them to leave because your page is crappy is... a waste of everyone's time. And electricity. And bandwidth. And the blood of wee orphan children.
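For reference, SC 2.4.1 really is only about skipping *repeated* blocks. A minimal skip link that satisfies it might look something like this sketch (the `#main-content` id and class names are just illustrative, not from the study):

```html
<body>
  <!-- First focusable element: lets keyboard and screen reader
       users jump past the repeated navigation block (SC 2.4.1) -->
  <a href="#main-content" class="skip-link">Skip to main content</a>

  <nav><!-- main menu bar, repeated on every page --></nav>

  <!-- The skip link's target: the page-unique content. SC 2.4.1
       gets users here, but says nothing about irrelevant filler
       text that sits at the top of this unique content. -->
  <main id="main-content">
    ...
  </main>
</body>
```

Which is exactly the study's point: the SEO filler before the plan summary lives *inside* the unique content, so a conforming skip link doesn't help at all.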

There was one disability-related issue that the study discussed at length: when there was a video, the blind testers preferred a separate audio track describing what went on in the video over a text description. That's a personal preference, one which both the developer and the website budget would have to keep in mind. Having a text description fulfilled a WCAG requirement, but users (in this study) had a clear preference for one fulfilling method over the other.
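If you do budget for it, HTML5 has a hook for exactly this: a timed text track with `kind="descriptions"`, whose cues assistive technology can voice during the video's natural pauses. A sketch with made-up file names (and note that player support for this track kind is still uneven, which is why many sites instead publish a second, pre-mixed "audio described" version of the video):

```html
<video controls>
  <source src="intro.mp4" type="video/mp4">

  <!-- Captions for deaf/hard-of-hearing users -->
  <track kind="captions" src="intro-captions.vtt"
         srclang="en" label="English captions">

  <!-- Timed audio-description cues: text that assistive tech can
       speak while the video plays, rather than a static text summary -->
  <track kind="descriptions" src="intro-descriptions.vtt"
         srclang="en" label="Audio descriptions">
</video>
```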

Anyway, the point of the study was to see whether the number of problems people encountered went down as sites went from non-conforming to WCAG1-conforming, or from WCAG1 to WCAG2 (which was supposed to make it clearer to developers what to do... though testing on developers with little accessibility knowledge seems to show WCAG2 is almost as confusing and misinterpreted as WCAG1).

Quote Originally Posted by article
These findings are quite unexpected. It seems that the upgrade to WCAG 2.0 has not had the expected effect. For WCAG 2.0, one would expect there to be a larger decrease in the number of user problems from non-conformant websites to Level A conformant websites than there was for WCAG 1.0. However, the results show conformance of a website to WCAG 2.0 Level A does not mean that users will encounter fewer problems on it and as a result it does not necessarily mean that following WCAG 2.0 will "make content accessible to a wider range of people with disabilities". [...] The move from WCAG 1.0 to WCAG 2.0 has not increased the coverage of user problems, as one would have expected.
They then state their opinion:
Quote Originally Posted by article
The results showed that blind users reported problems when they encountered unexpected content or when they could not find content on a website. WCAG 2.0 does not cover these problems. Some may assert that these are not accessibility problems, but instead are usability problems and do not need to be addressed in WCAG 2.0. The authors disagree with this assertion for the following reasons. First, web accessibility is about ensuring that people with disabilities can use the Web. In order for this to be achieved, we must address all of the problems that disabled users encounter on web pages.
WCAG is about accessibility, and the people who wrote it have a generally narrow definition of "accessible". Or, if they're going to add more usability stuff to it, the name should reflect that and the definition of "disabled" needs to be broadened. I agree that you can't really call something "accessible" if it's not usable. That kinda defeats the whole purpose of building something in the first place. It's like spending time making broken toys.

What I would rather see happen: web developers, bosses, and budgets getting off their duffs and actually bothering to do real user testing on their sites and applications. Yup, it's not an easy automated test a machine can do, but damn, it's practically guaranteed to increase profits, if profit is what your site or application is for. Sheesh.

And what's easier? Getting grandma to try out your page/application, or reading the WCAG? If you have limited time and have to choose between one or the other, which one will bring you greater results and the most improvements, for all users? I'll claim the grandma. Ideally you'd do both, but nobody lives in an ideal world; we live in a crap world and it's just going to suck either way.

What do you think?