CSS and accessibility

So I know accessibility is all about the markup, but would a site raise an issue with accessibility guidelines like WCAG 2, or with an organisation conducting an accessibility report, if it has invalid CSS?

So a validator will flag errors in CSS files containing -moz-border-radius, .htc links, etc. We ignore those; however, we're commissioning a usability/accessibility company to report on our site, and we believe that invalid CSS (-moz, -webkit) will not pass.

Is there anywhere written that invalid css causes accessibility issues?

So I know accessibility is all about the markup,

Mostly, but CSS has a strong effect, as does JavaScript.

Re errors…

There is a difference between a true error (goes against what is written in the specs you are following) and things the validator does not recognise but do not explicitly go against the specs (vendor extensions such as -moz, or using the CSS2.1 validator with CSS3 properties such as border-radius).

True errors should be corrected.

If you know the error the validator gives amounts to little more than “the validator does not understand that”, it is not a true error and you can ignore the message.

Be aware that those vendor prefixes are more for testing something experimental in the wild. You can use them, but they are actually a protection against a vendor implementing something incorrectly. By using a vendor prefix like -moz or -webkit, they can test their implementation of that property without having any buggy behaviour appear in pages where authors use the spec version (border-radius for example). This prevents the need for hacks, and when the vendor believes their implementation fully follows the specs and they are satisfied with it, they will release a version of their browser that supports the spec version.

So when Mozilla felt Firefox did -moz-border-radius correctly, they released Firefox 4 with the ability to implement “border-radius” alone.
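To illustrate (a generic sketch, not code from the site under discussion): the usual pattern is to list the prefixed experimental forms first and the unprefixed spec property last, so that once a browser ships the spec version, it wins the cascade:

```css
/* Prefixed experimental forms first, spec property last */
.rounded-box {
  -moz-border-radius: 8px;    /* older Firefox */
  -webkit-border-radius: 8px; /* older Safari/Chrome */
  border-radius: 8px;         /* the spec version wins where supported */
}
```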

Is there anywhere written that invalid css causes accessibility issues?

There are articles talking about known issues, such as using display: none or visibility: hidden to hide content from sighted users with graphical browsers when you intend text browsers, search engines and screen readers to get that content. Screen readers generally honour display: none and visibility: hidden, so for things like dropdown menus it's recommended you use off-screen positioning rather than switching those states; that way screen reader users have the same access to the entire submenu as someone with a text browser, or the Googlebot, does.

Juicy Studio: Screen Readers and display: none
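A minimal sketch of the off-screen technique mentioned above (class names are my own): instead of display: none, which most screen readers honour and therefore skip, move the content off-screen so it stays in the document flow and gets read out:

```css
/* Hidden from screen readers and everyone else */
.hidden-everywhere {
  display: none;
}

/* Visually hidden but still read by screen readers:
   positioned off-screen instead of removed from rendering */
.offscreen {
  position: absolute;
  left: -9999px;
}
```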

There are WCAG-stated minimum colour-contrast settings, which is of course set by CSS. If the contrast between your text and the background is too low, then it will fail some (there are levels of accessibility, and the size and font of the text also matters) of your accessibility tests. (Remember also that extremely high contrast can also impact users negatively, causing headaches or making it harder for dyslexics to read… so choose high but not highest.)
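For reference, the contrast maths that tools like the Colour Contrast Analyser apply can be sketched like this; the formula follows the WCAG 2.0 definitions of relative luminance and contrast ratio, while the hex-parsing helper and example colours are my own:

```javascript
// WCAG 2.0 relative luminance of a #rrggbb colour
function luminance(hex) {
  const c = hex.replace('#', '');
  const [r, g, b] = [0, 2, 4].map(i => {
    const v = parseInt(c.slice(i, i + 2), 16) / 255;
    // Linearise each sRGB channel per the WCAG definition
    return v <= 0.03928 ? v / 12.92 : Math.pow((v + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1:1 up to 21:1
function contrastRatio(fg, bg) {
  const l1 = luminance(fg);
  const l2 = luminance(bg);
  const [hi, lo] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05);
}

// AA asks for 4.5:1 for normal text (3:1 for large); AAA asks for 7:1
console.log(contrastRatio('#000000', '#ffffff')); // 21, the maximum
console.log(contrastRatio('#767676', '#ffffff')); // just over 4.5: passes AA
```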

One tool I use to check my colour contrast and luminosity ratios is Gez’ Accessibility Toolbar for Firefox. You can use the Colour Contrast Analyser to check your colour contrast. It’ll show pass/fail values for AA and AAA (I get plenty of AAA fails, so if I can’t change the colour I’ll try to increase the text size somewhat).
It’ll also check for ARIA landmarks, and has a table-markup checker.

.htc links etc.

Can you explain this?

however we’re commissioning a usability/accessibility company to report on our site

If they know what they’re doing, they will perform a mixture of automated testing and live human testing. You should let us know what they found as a note to other developers.

There is no such thing as a website without problems. Knowing that, you always want the accessibility/usability teams to find stuff. Stuff found is stuff known, and stuff you can figure out how to fix.

It is indeed harder to fix usability and accessibility problems later, when a site is nearly finished, than if you’d had U/A testing earlier in the process. You may find some problems that simply cost too much time and money to adequately fix if the site is near completion/is finished and online.

Thanks for the detailed explanation Stomme poes!

A lot of what you said I pretty much knew; I really wanted to know whether a website accessibility report would object to using vendor prefixes and .htc files (polyfills for IE and older browsers; behaviour.htc adds border-radius etc. to IE 7 and below and works pretty well, see CSS3PIE). Do vendor prefixes, polyfills and other non-standard CSS cause accessibility issues?

Since this is a government site it needs to be 100% accessible.

I wondered if you meant .htc files or not… ah, you mean the “CSS error” when you call them in the CSS. Gotcha.

Those are also fine (in CSS). While some companies are so validator-frothing that they'll create separate stylesheets just for IE where they stuff all their non-validating stuff (calls to .htc files, zoom and other “invalid” hasLayout triggers, filter comments), I find this a waste of bandwidth unless you're doing separate stylesheets anyway.
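For completeness, the separate-IE-stylesheet approach looks roughly like this (file names are hypothetical); the conditional comment is ignored by every browser except IE, so the main stylesheet stays validator-clean:

```html
<!-- Main, validating stylesheet for everyone -->
<link rel="stylesheet" href="main.css">

<!-- IE-only stylesheet hidden inside a conditional comment,
     where .htc calls, zoom and filter rules can live -->
<!--[if lte IE 8]>
  <link rel="stylesheet" href="ie.css">
<![endif]-->
```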

I really wanted to know whether a website accessibility report would object to using vendor prefixes and htc files.

If they do, simply on the basis of file type or W3C/jigsaw CSS validator giving errors for vendor prefixes, fire them. If they know what they’re doing, they’ll know as much as what I’ve spewed out above, and then much more. They may give you warnings or things to watch out for (like I have), but if they say “this isn’t accessible simply because you are using -moz-something”, they’re bogus. It’s about what renders on the screen/speakers, what renders to AT (accessibility technology), and how do users interact with that.

Do vendor prefixes, polyfills and other non-standard CSS cause accessibility issues?

Interesting question, because the validator doesn't check for accessibility. It does absolutely nothing more than check your CSS syntax and verify that it recognises all your listed properties and values. (Yes, there is that foreground/background colour warning, but I find it entirely useless: it seems unable to tell when multiple elements sit on each other with different background colours, so it gets the contrast ratings totally wrong.)

There are some “validators” who do, but the best they can do is check for certain things mechanically. Usability and Accessibility are not mechanically-checkable things. You may have heard of Bobby and Cynthia. These are (were) automated a11y testing tools.

Whether the properties from vendor prefixes can cause issues: they can when you do silly stuff (and you and everyone in your office misses it, but after it’s pointed out you’re all like, well that was silly).

Here’s a good example: the text-shadow property is, for many browsers, still in the prefix stage, and IE doesn’t support it at all if I remember correctly.

Some people have started to rely on that text shadow to make text readable. This is an accessibility/usability issue: any browser that does not show the shadow (old browsers, anyone without support, or even supporting browsers where the shadow renders too faint) may leave users with light text on a light background (or the other way around), making the text unreadable.
I’ve even started to see this garbage on regular WordPress sites.

But this has nothing to do with it being a prefix: it’s more that it’s new so people don’t think of this if their main browser does support text-shadow.
An old-fashioned (and still relevant) version of this is when an image is relied upon for background contrast. If the image doesn’t load for any reason, the contrast won’t be there, and the text may become invisible. I see this regularly.
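A sketch of the fix for both cases (values are illustrative): always declare a solid background colour of sufficient contrast behind any image or shadow the text would otherwise rely on:

```css
.hero {
  /* Solid fallback colour first: if the image fails to load,
     the white text still sits on a dark background */
  background-color: #333;
  background-image: url(hero.jpg);
  color: #fff;
  /* The shadow is decoration, not the source of contrast */
  text-shadow: 0 1px 2px rgba(0, 0, 0, 0.5);
}
```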

@font-face is another one: unfortunately the specs for @font-face only let you call a font, but not couple font availability with the weight, size, or line-height that may best match. Anyone not able to load and display those fonts may get the default text, but at settings made for the special font. This easily makes text spill out of containers, lose contrast (as they spill out into somewhere without the necessary background contrast), get cut off or become too large/small (by insane degrees) to be read and used easily.
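The usual defensive move (a sketch; the web font name is hypothetical) is to put fallback fonts of roughly similar metrics in the stack and test the layout with those fallbacks active:

```css
@font-face {
  font-family: "FancyFace"; /* hypothetical web font */
  src: url("fancyface.woff") format("woff");
}

body {
  /* Fallbacks chosen for similar x-height and width, so text sized
     for FancyFace doesn't spill or shrink unreadably without it */
  font-family: "FancyFace", Verdana, Arial, sans-serif;
}
```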

I’m not familiar with polyfills, but it did remind me of gradients and rgba() background colours. Basically, they need to degrade well, and be of sufficient contrast when they do render.

Basically, I’d say you can see where your failings are usability/accessibility-wise fairly easily if you keep an older browser lying around (maybe best if it’s not Moz or WebKit either) where you can turn images, CSS, and scripts on and off. You’ll see what problems (if any) occur when the -prefix and CSS3 stuff does NOT show up.

Also, I suppose if you’ve got any of those webkit animations/transitions/transformations going on, those can be a problem (they can also help usability as well, esp for those with cognitive issues), but this company you’ve hired should spot those.

Tilted text can be difficult to read, esp if the browser renders the fonts badly and they lose anti-aliasing or good hinting.

.htc files basically tell IE to render JScript. If scripting is turned off or blocked for any reason, those scripts will not run. If you’re using them for things like whatever:hover (which also implements :focus styles for the IE’s who can’t show :focus for example), you’re simply out of luck: there’s little you can do except possibly recommend to the visitor that they enable Javascript (but know they may not be able to, which would be why they have it off in the first place).
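Related to that whatever:hover point: wherever you style :hover yourself, pair it with :focus so keyboard users get the same cue whether or not any script runs (colours here are illustrative):

```css
/* Style hover and keyboard focus together so the effect
   doesn't depend on a mouse (or on an .htc script running) */
a:hover,
a:focus {
  background-color: #036;
  color: #fff;
  text-decoration: underline;
}
```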

Mostly, it’s just watching out for

  1. are there problems if the intended effect ISN’T rendered
  2. are there problems if the intended effect IS rendered

which generally has little to do with how you’ve called those properties.

Since this is a government site it needs to be 100% accessible.

It won’t be. It can’t be. The best you can do is meet your particular accessibility requirements.
Is this site available in ALL languages? If not, it’s not accessible to whoever can’t understand the language you use. Sign language? (Some of the Deaf have a lot of trouble with English as a second language, or whatever your spoken language is.) What about illiterates?

Is it fully available to users with little bandwidth/slow connections? This practically requires a plain-text document, which for most sites is a little impractical (and if you do serve a text-only version, you’ve got all those associated maintenance problems… and how do you offer/serve that text-only version?).

Somebody, somewhere, is not going to be able to use your site. Possibly a lot of somebodies, for reasons you cannot possibly cater to.

But this is why you’ve got an accessibility standard you have to meet: it tells you where you’ve got to work and where you must just leave alone.

Though if you’re in the US and doing Section 508, I gotta say: it’s got some rules in there that can hurt accessibility/usability. They are based on the old WCAG1 documents. I’ve seen sites that technically follow 508 to the letter (last one I looked at was State of Michigan website, though this was about a year ago) and it was not terribly user-friendly or necessarily very accessible. For example, alt text was filled in for images, but the text itself was kinda like, wut? All the 508 checker did was check if alt text was filled in.

A good U/A company will check that the alt text makes sense in its context, for example.

Someone should go through your site with a screen reader (or two), a screen magnifier, maybe a refreshable Braille device. Keyboard-only, Javascript on or off (WCAG2 is much more lenient about scripts… you may have them, but frankly don’t allow the lack of them to break anything on your site). Images on or off (mobile devices may block to preserve bandwidth). Printing?

Also, you may want to talk with Rguy84 here on the forums. Maybe I’ll twot this thread to him; he has to deal with gov’t accessibility sites himself.

Ah, to be clearer: the fact that you are using something that is not blatantly in the spec (and thus invalid in the eyes of a validator) does not, in and of itself, cause accessibility issues. So, the short answer is “no”. : )

For that matter, I’ve seen fairly accessible sites with invalid HTML (iFrames, <embed> tags, and missing closing tags (which the browser just adds in for you)). Invalid CSS has much less impact (depending on why it’s invalid) than invalid HTML, which I would worry more about.

Stomme, that was awesome. The points you’ve made are brilliant, especially about when CSS3 could become inaccessible, like when text-shadow is not rendered and the text becomes too light. You also make a great point about validators and checkers: they can see that the alt attribute is filled in, but not whether its content is proper in context.

This is exactly what I was after: if the client wonders about any issues regarding new CSS, or the accessibility report raises any, we can respond with some clear logic about why what we have implemented is accessible and OK.

So our objective was to create a site built using the latest features (HTML5, but using little HTML5 semantic markup). We haven’t used sections and multiple h1s, as this could cause SEO issues. We have, however, wrapped navigation and the header in the nav and header elements, and no style is being applied to these elements except display: block. We wanted to reduce HTTP traffic, so we’re creating sprites and minifying/compressing all scripts.

Definitely going to keep a sticky note about the principle you’ve laid out:

  1. are there problems if the intended effect ISN’T rendered
  2. are there problems if the intended effect IS rendered

Also, I guess a site can’t be 100% accessible; how does one create an accessible website for illiterates?! :lol:

We are using the WCAG 2.0 guidelines as this is what the accessibility report will be based on we believe.

Thanks Stomme!

We haven’t used sections and multiple h1s as this could cause SEO issues.

Robots don’t know the difference between a <section> and a <div> yet, but your reasoning is the same reason why I’m sticking to HTML4 header-levels. One h1 per page.

I’ve toyed with the idea of using the short HTML5 doctype (but setting the validator to XHTML1.0 Strict for checking I’ve dotted my i’s and crossed my t’s) and some of the new form inputs (they’re pretty cool and most degrade into regular HTML4 form inputs if the new ones aren’t supported, which is great), but I’ve been staying away from <nav> etc tags.
Jason Kiss has been testing various versions of HTML5 and ARIA roles with screen readers (read his article, HTML5 and ARIA) which notes some issues (some have been fixed, but there are always new ones).

So I’ve been sticking to HTML4 code, but using the ARIA landmark roles and a few other roles in forms. They’re pretty cool, and they work (they will also make the validator call your document invalid, because it hasn’t been updated for ARIA… tho the HTML5 validator at validator.nu does understand ARIA). They won’t replace the need for skip links or accessible error messages and form settings, but they can make your site much nicer to use for screen reader users (and that’s pretty much all ARIA is for, just that very targeted group).

If you haven’t been using them, check them out. The landmark roles are pretty simple and I think easiest to start out with.
Paciello Group has a nice page on basic landmark roles

And in forms, they kick butt (remember, older readers or older browsers will not understand the new ARIA roles and attributes, so keep that in mind while building):
Required fields:
Linking surrounding input text to that input/label pair:
aria-labelledby and aria-describedby
Form error messages (he does it with JS, but they can also be added via the back-end):
aria-invalid, alert role
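Pulled together, those form roles look roughly like this (the IDs and copy are my own):

```html
<label for="email">Email
  <abbr title="required">*</abbr>
</label>
<!-- aria-required for readers that understand ARIA; the visible
     asterisk covers older readers that don't -->
<input type="text" id="email" name="email"
       aria-required="true"
       aria-describedby="email-hint"
       aria-invalid="true">
<!-- Hint text linked to the input via aria-describedby -->
<span id="email-hint">We only use this to confirm your order.</span>
<!-- Error message announced immediately via the alert role -->
<p role="alert">Please enter a valid email address.</p>
```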

how does one create an accessible website for illiterates?!

I’m not entirely sold on the idea, but our local newspaper (regional local news) has an out-loud article reader built into the site.
Which is a good thing because their web site is horrid, with microscopic text and an amazing inability to find articles or photos from the paper even when you have the paper IN YOUR HANDS and type in titles, dates, or anything relevant. Lawlz.
English and Dutch:
Voorleeshulp op Sites in Drechtsteden (yeah I just copied it from a paper somewhere and wanted to be able to post it online, which strangely, they didn’t!)

Mass, if you are doing the site for an agency (federal or local) in the United States, your site actually has to comply with Section 508 part 1194.22 rather than WCAG. You can find details of what these are at Section 508. Note that Section 508 maps to WCAG 1.0, NOT 2.0; these standards were made circa 2001. Yes, they are in the process of being updated, which may happen by Dec-31-2011, but don’t hold your breath.

Three points about your question

  • 508 cares about color contrast, so make it sufficient. If I deem it is, I don’t check the validity of the CSS.+
  • Make sure that when CSS is turned off, the page still makes sense.
  • To be honest, you are wasting your time worrying about -moz and -webkit. US Government uses IE 8 and sometimes 7, so that’s what it has to run on. Unless I missed something, the -moz rules just make the page prettier. Unless the -moz clutters the page, we don’t care.

+ Why don’t I check? If your CSS is trash, either you are making things way too hard on yourself, or you’re going to give me a site with a junk layout, in which case I will say cool, redo it on your dime.

Yep, we are using ARIA landmark roles for all major sections; need to understand them a bit more though. Will definitely check out those links.

Also, just did a quick check on validation with XHTML 1.0 Strict compared to HTML5: 45 errors for XHTML, none for HTML5. Mostly it was unclosed img elements and input elements not wrapped in block-level elements, plus a couple of target attributes; nothing major.

The site is actually a UK government website, and the company conducting the accessibility report will be using WCAG 2 guidelines. However the idea is to make the site accessible not to check boxes on a checklist, so any good advice to make a site accessible is always welcome.

Well, we are using -moz-/-webkit-border-radius to reduce the use of images. The idea is to replicate the current site with updated HTML and CSS for their new back-end system. So all aesthetics must stay the same, and instead of having 4 spans to add rounded corners we’re using CSS3PIE, and we are able to see the rounded corners/gradients etc. in IE8 and below.

Well, the ‘target’ attribute is hardly good practice; usually nominative markup grammar should be used, for obvious reasons. The problem with WCAG 2 is you can customise it too much, so it only includes what you want it to say.

Obviously with a UK website there is PAS 78, although they couldn’t even spell the word accessibility correctly in that document! We as taxpayers paid for that document too. I wonder if they have corrected those spelling errors yet? Anyway, most of it comes down to degrading gracefully and allowing for reasonable adjustments.

Ok, sorry I cannot talk about specs for the UK government.