How to address WAI failure

Hi Folks,

When I run a WAI validation, I get the following failure:

Rule: 13.2.2 - Documents are required to use META elements, that are defined as required, in Head section.

* Failure - Document does not contain a META element with the required name: language or language does not have a 'content' value.

Could you help me understand what the mistake is and how to fix it?



Sounds like they specifically want a <meta> tag, but I use this

<html xmlns='http://www.w3.org/1999/xhtml' xml:lang='en' lang='en'>

although the W3C says you can also use

<meta http-equiv="Content-Language" content="en"/>

I thought it was more of an either/or choice, but since your site has the html lang= attribute like mine does, maybe WAI wants both?
and then there’s the HTTP header too

Content-Language: en

I’ll leave it to you to read about it rather than try to explain what it says.

This is what I have currently

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
<meta name="description" content="The Boston TenPoint Coalition works to mobilize the community on behalf of a primarily African American and Latino population at high risk for violence, drug abuse, and other destructive behavior." />
<meta name="keywords" content="Boston, TenPoint, Ten Point, ten point plan" />

What more do they want?

I’m guessing they want the meta http-equiv too. I don’t see why it would be needed when it’s already in the html tag. Guess I need to read that linked info again myself, especially since I’m not using it now either.
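If it really is the meta element the checker wants, something along these lines might satisfy it. This is only a sketch based on the rule text quoted above — I’m assuming the checker accepts http-equiv="Content-Language" as the “language” meta it’s asking for, and the title is a placeholder:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
<head>
  <!-- lang/xml:lang on <html> declare the document language for processing;
       the meta below is what some automated checkers look for -->
  <meta http-equiv="Content-Language" content="en" />
  <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
  <title>Placeholder title</title>
</head>
<body>
  ...
</body>
</html>
```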

There is no such thing as a WAI validation service. All these websites are good for is to provide false positives and negatives. For a WAI validation service to be any good, it must be able to analyse the contents of an image, determine how it is used in the context of that webpage, and compare that to a text analysis of the alt attribute. This is the WAI requirement for all images, and any WAI validation service which can’t do this simple task is useless.

The only way to verify WAI requirements is to manually go through the list and the source code of each page, and check that all the requirements are met.

I read the W3C article again, slowly and more carefully this time, and it seems the html tag attribute is for processing, while the meta and HTTP header are for the target user.

I just added both, and it didn’t break anything, although this might affect users whose Accept-Language request header doesn’t include en.

Then again, if someone can’t read English, I guess my site is partially useless anyway.

I’ll try going to a site that isn’t en and see what my browser does - if I can find one that uses a non-en Content-Language header. It seems like it’s not frequently used.

EDIT: Instead of hunting for a site, I changed the Accept-Language settings in two of my browsers (Firefox, Opera) so as to not include en. Everything still loaded OK. So it looks like I’ll keep it there.

I’ve done a little more research. It seems the HTTP Accept-Language header can be used to serve language-preferred pages to visitors.

I’m still not sure when or how the Content-Language header or the meta http-equiv Content-Language comes into play, if at all. Maybe for font switching? LTR-to-RTL rendering?

Hopefully some “international” forum members will have something on this.

In terms of accessibility, it seems most important when there’s more than one language on the same page. But I fail to see how it helps for a single-language page, except maybe for text-to-speech visitors.
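For the more-than-one-language case, marking up an inline phrase looks roughly like this (just a sketch — the French phrase is only an illustration). A screen reader that honours lang can switch pronunciation for the span:

```html
<!-- on a page whose <html> element has lang="en" xml:lang="en" -->
<p>
  As they say in France,
  <span lang="fr" xml:lang="fr">c'est la vie</span>.
</p>
```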

Hi Folks,

This may be one of those times when I am not going to worry about not passing one feature in a validation. When I checked against CSS, HTML, and 508, the page passed with no problem. Since the site is based on a template and the homepage has the most involved code (a JavaScript slide show), I’m pretty sure that the rest of the site will be OK.

Thanks for taking the time to help me.


vrmartin, Just to back up what others have said. You cannot use Cynthia and claim that your website passes 508 or WCAG, it’s simply inaccurate. :slight_smile:

The facts are that there’s no way an automated tester can check against every single element in the guidelines, because those kinds of testers only examine code; they cannot understand context or the visual elements which WCAG covers. If you are just using that website to check whether your design meets WCAG accessibility guidelines, I and any other pro-accessibility web designer can say for sure that it’s highly unlikely your website will meet the full set of recommendations (unless you manually checked to ensure this was the case). You’re basing your accessibility compliance on something that can’t do its job properly. :slight_smile:

I wish I had a source for this, but it’s really more of a rumour (I got it originally from Dan Schulz and have read it here and there over the years) that some AT will use the meta language tag to decide which language to use when reading out a page (if it does so).

For this reason, my pages for the last year or two have had both the lang attribute in the html tag AND the meta tag with the language, and I’ve put that meta tag before the title. Nothing like having your screen reader defaulted to Dutch trying to read out an English title : )

I can verify that having both doesn’t cause any problems, or at least, for all my pages I have not run into trouble by having both.

One issue I ran into long ago was on a page set to Dutch lang in both the meta and html tags, with links the back-end developer had set up so that clicking one would take you to another version of the page in another language.

I found that simply taking JAWS7 through the page and the links ended up putting the entire rest of the (Dutch-language) page in Portuguese (Dutch words on the page pronounced as if they were Portuguese), because that was the last link tabbed through (not clicked). Each anchor had its own lang attribute for the language it was referring to.
The specs say a lang attribute is local and should only apply to the element it’s set on, but this one time JAWS didn’t seem to follow that. I consider that a bug in the reader, though, and I haven’t checked with JAWS10.
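The setup was roughly this shape (a sketch, not the actual markup — the filenames are made up):

```html
<!-- on a page whose <html> element has lang="nl" xml:lang="nl";
     each anchor carries the lang of its own link text only -->
<ul>
  <li><a href="index-pt.html" lang="pt" xml:lang="pt">Português</a></li>
  <li><a href="index-en.html" lang="en" xml:lang="en">English</a></li>
</ul>
```

Per the spec, each lang here should scope only to its anchor’s text, with the rest of the page staying Dutch — which is exactly what JAWS7 didn’t do.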

Hi Folks,

I used the W3C validators for HTML and CSS.

I would welcome a no-cost service or easily followed systematic way of checking for 508 and WAI compliance on a Mac.

I am a web hobbyist and don’t derive a living out of web development, however, I think accessibility and usability are important.


There is no “systematic” or “automated” way to check for 508 or WAI/WCAG compliance on ANY OS. The reason for this, as stated before, is that computers are not smart enough to make such contextual and visual decisions; the day a computer can accurately test against all levels of WCAG or accessibility/usability recommendations will be the day Skynet goes live and the Terminators kill us all. You should be checking for compliance manually, as it’s the only way to ensure your website meets such standards. Any service claiming to be able to ensure your site meets such accessibility recommendations is lying; it’s as simple as that. :slight_smile:

This is true. But I think they make a good starting point, as they do catch the code stuff. And if you follow through on the “check manually” checklist (e.g. contrast, blink rate, etc.), then you should be improving things for those who need it.

I am a web hobbyist and don’t derive a living out of web development, however, I think accessibility and usability are important.

Well, if you’re building a site that MUST follow 508 compliance (dunno when that’s getting upgraded in the States, here in NL we’re getting a Web GuideLines update to WCAG2 this summer), then you’ll need both the automated tools, as Mitt… Mittenseasagu… as the guy above said, to catch the easy stuff, and then go through it yourself… but then also consider hiring someone who checks these things for a living, because not passing 508 is surely grounds for a lawsuit or something, which your client will not like : )

If the site doesn’t HAVE to be 508 compliant, then I’d say automated testing, common sense and some self-testing (you with a screen reader or magnifier, on one of those crappy burnt-out low-contrast monitors, with a dying jittery mouse, and with no mouse, no JS, no images, text-only, etc.), or better yet actual user testing if you happen to have anyone using AT or with known web-surfing issues/disabilities — all that should be enough for an average site meant for everyone.

508 is a very specific section of the law dealing with sites that either play an active role in doling out Federal funds or information about Federal funds, or themselves receive Federal funds. This pretty much means “government sites”.

Heck, while you’re at it, since you do actually care about accessibility, do some dirty user testing anyway. It’ll point out the obvious (not to you) problems with layout, navigation, wording, etc with just a few people. Check out Steve Krug’s (new) book: Rocket Surgery Made Easy. I’m going through it now myself. No point in saying, Hey, my site’s accessible because it passed Cynthia or meets WCAG2… if people can’t figure out how to use the darn thing because of poor wording or weird setups.

C. Ankerstjerne put this best,

The markup validator checks that the page can be read correctly by a program. For this purpose, another program is always best. Accessibility is about the user’s ability to use the page, which is best checked by another human.

You can’t do a (meaningful) accessibility validation with an automatic tool. It requires human interaction and assessment.

Cynthia et al can be useful for checking some trivial points, in the same way that spellchecking is a good first step for a writer, but doesn’t guarantee great literature.

in the same way that spellchecking is a good first step for a writer, but doesn’t guarantee great literature.

Heck, it doesn’t even guarantee good spelling! Esp with languages like Engrish : ) Eyes before ease, except after seas.

:lol: I just got this in an email

Off Topic:

I cdnuolt blveiee that I cluod aulaclty uesdnatnrd what I was rdanieg. The phaonmneal pweor of the hmuan mnid, aoccdrnig to a rscheearch at Cmabrigde Uinervtisy, it dseno’t mtaetr in what oerdr the ltteres in a word are, the olny iproamtnt tihng is that the frsit and last ltteer be in the rghit pclae. The rset can be a taotl mses and you can still raed it whotuit a pboerlm. This is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the word as a wlohe. Azanmig huh? Yaeh and I awlyas tghuhot slpeling was ipmorantt! If you can raed this forwrad it

Yeah, that’s old news by now. Besides, it looks like approximately 35% of the posts on SitePoint these days. :shifty:

It also only works well with either native speaker readers or those with about as much familiarity with the words. If you were learning Dutch and I did that in a Dutch message to you, in words you have indeed already learned (but recently, and aren’t used to reading much in it), you’d have much trouble.

And of course anyone with any other issues reading is going to miss that. I’d expect dyslexics either to not get a word out of it, or, maybe strangely, to find it perfectly legible.

Well, I could get every word from that [demo] fairly quickly at near-normal reading speed, and it didn’t look perfectly legible either. So, like I usually say: I am not a good example of a dyslexic person who has major problems reading.

Although my grammar is usually shoddy. :lol: