I wondered if you meant .htc files or not… ah, you mean the “CSS error” when you call them in the CSS. Gotcha.
Those are also fine (in CSS), and while some companies are so validator-frothing that they’ll create separate stylesheets just for IE where they stuff all their non-validating stuff (calling .htc files, using zoom and other “invalid” hasLayout triggers, filter comments), I find this a waste of bandwidth unless you’re doing separate stylesheets anyway.
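For reference, such an IE-only sheet (pulled in with a conditional comment) typically just collects the invalid bits in one place; everything below is a made-up sketch, not code from any real site:

```css
/* ie.css: served to IE only, so the main stylesheet stays validator-clean */
#sidebar .widget {
  zoom: 1;                      /* invalid CSS, but a classic hasLayout trigger */
  behavior: url("iefixes.htc"); /* the .htc call the validator flags */
  filter: alpha(opacity=80);    /* old-IE opacity */
}
```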
I really wanted to know whether a website accessibility report would object to using vendor prefixes and htc files.
If they do, simply on the basis of file type or because the W3C/Jigsaw CSS validator throws errors for vendor prefixes, fire them. If they know what they’re doing, they’ll know as much as what I’ve spewed out above, and then much more. They may give you warnings or things to watch out for (like I have), but if they say “this isn’t accessible simply because you are using -moz-something”, they’re bogus. It’s about what renders on the screen/speakers, what renders to AT (assistive technology), and how users interact with that.
Do vendor prefixes, polyfills and other non-standard CSS cause accessibility issues?
Interesting question, because the validator doesn’t check for accessibility. It does absolutely nothing more than check your CSS syntax and check that it recognises all your listed CSS properties and values. (Yes, there’s that foreground/background colour warning, but I find that entirely useless, since it can’t tell when multiple elements sit on each other with different background colours, so it gets the contrast ratings totally wrong.)
There are some “validators” that do, but the best they can do is check for certain things mechanically. Usability and Accessibility are not mechanically-checkable things. You may have heard of Bobby and Cynthia. These are (were) automated a11y testing tools.
Whether the properties from vendor prefixes can cause issues: they can when you do silly stuff (and you and everyone in your office misses it, but after it’s pointed out you’re all like, well that was silly).
Here’s a good example: the text-shadow property is still fairly new, support isn’t universal, and IE doesn’t support it at all, if I remember correctly.
Some people have started to rely on that text shadow to make text readable. This is an accessibility/usability issue: any browser that does not show the shadow (old browsers, anything without support), or a shadow that renders too faintly even where it IS supported, may leave users with light text on a light background (or the other way around), making the text unreadable.
I’ve even started to see this garbage on regular WordPress sites.
But this has nothing to do with it being a prefix: it’s more that it’s new so people don’t think of this if their main browser does support text-shadow.
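The defensive version is to pick text and background colours that already contrast on their own, and treat the shadow purely as decoration (the selector and colours below are just examples):

```css
/* readable with or without the shadow */
.banner h1 {
  background-color: #f5f5f5;  /* solid, light background */
  color: #222;                /* dark text: contrast holds on its own */
  text-shadow: 0 1px 1px rgba(255, 255, 255, 0.8); /* decoration only */
}
```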
An old-fashioned (and still relevant) version of this is when an image is relied upon for background contrast. If the image doesn’t load for any reason, the contrast won’t be there, and the text may become invisible. I see this regularly.
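A cheap insurance policy here is declaring a background colour of similar darkness to the image, so the contrast survives a missing image (filename and colours below are made up):

```css
/* if banner.jpg never loads, the dark blue keeps the white text readable */
.header {
  background: #1a3a5c url("banner.jpg") no-repeat 0 0;
  color: #fff;
}
```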
@font-face is another one: unfortunately the specs for @font-face only let you call a font, but not couple font availability with the weight, size, or line-height that may best match. Anyone not able to load and display those fonts may get the default text, but at settings made for the special font. This easily makes text spill out of containers, lose contrast (as they spill out into somewhere without the necessary background contrast), get cut off or become too large/small (by insane degrees) to be read and used easily.
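One way to limit the damage is to put fallback fonts of roughly similar proportions in the stack, and size things so both render sanely (the font name and values here are invented for illustration):

```css
@font-face {
  font-family: "FancyFace";                  /* made-up webfont */
  src: url("fancyface.woff") format("woff");
}
h1 {
  /* Georgia chosen as a fallback of roughly similar x-height, so the text
     doesn't spill out or shrink wildly if FancyFace never loads */
  font-family: "FancyFace", Georgia, serif;
  font-size: 1.6em;
  line-height: 1.3;  /* roomy enough for either font */
}
```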
I’m not familiar with polyfills, but the question did remind me of gradients and rgba() background colours. Basically, they need to degrade well, and be of sufficient contrast when they do render.
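For rgba() backgrounds, the usual degrade-well pattern is declaring a solid colour first, which older browsers keep when they ignore the rgba() line (selector and colours are examples):

```css
.overlay {
  background-color: #333;              /* solid fallback, read by everyone */
  background-color: rgba(0, 0, 0, .7); /* ignored by browsers without rgba() */
  color: #fff;                         /* contrasts with either background */
}
```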
Basically, I’d say you can see where your failings are usability/accessibility-wise fairly easily if you keep an older browser lying around (maybe best if it’s not Moz or WebKit either) where you can turn images, CSS, and scripts on and off. You’ll see what problems (if any) occur when the -prefix and CSS3 stuff does NOT show up.
Also, I suppose if you’ve got any of those WebKit animations/transitions/transforms going on, those can be a problem (they can also help usability, esp for those with cognitive issues), but this company you’ve hired should spot those.
Tilted text can be difficult to read, esp if the browser renders the fonts badly and they lose anti-aliasing or good hinting.
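For what it’s worth, a prefixed rotate degrades harmlessly: browsers that don’t understand any of these lines simply show the text level, which is arguably more readable anyway (the selector is made up):

```css
.ribbon {
  -webkit-transform: rotate(-5deg);
  -moz-transform: rotate(-5deg);
  -ms-transform: rotate(-5deg);
  -o-transform: rotate(-5deg);
  transform: rotate(-5deg); /* unprefixed last, for future browsers */
}
```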
.htc files basically tell IE to run JScript. If scripting is turned off or blocked for any reason, those scripts will not run. If you’re using them for things like whatever:hover (which also implements :focus styles for the versions of IE that can’t show :focus, for example), you’re simply out of luck: there’s little you can do except possibly recommend that the visitor enable Javascript (knowing they may not be able to, which could be why it’s off in the first place).
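For the curious, the whatever:hover-style setup looks roughly like this (the filename is made up; the .htc file itself is a JScript behaviour that IE only runs when scripting is on):

```css
/* in the IE-only stylesheet */
body {
  behavior: url("csshover.htc"); /* lets old IE apply :hover/:focus widely */
}

/* in the regular stylesheet: works natively elsewhere, and in old IE
   only if the behaviour file actually ran */
a:hover, a:focus {
  background-color: #036;
  color: #fff;
}
```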
Mostly, it’s just watching out for:
- are there problems if the intended effect ISN’T rendered?
- are there problems if the intended effect IS rendered?
…which generally has little to do with how you’ve called those properties.
Since this is a government site, it needs to be 100% accessible.
It won’t be. It can’t be. The best you can do is meet your particular accessibility requirements.
Is this site available in ALL languages? If not, it’s not accessible to whoever can’t understand the language you use. Sign language? (Some of the Deaf have a lot of trouble with English-as-a-second-language, or whatever your spoken language is.) What about illiterates?
Is it fully available to users with little bandwidth/slow connections? This practically requires a plain-text document, which for most sites is a little impractical (and if you do serve a text-only version, you’ve got all those associated maintenance problems… and how do you offer/serve that text-only version?).
Somebody, somewhere, is not going to be able to use your site. Possibly a lot of somebodies, for reasons you cannot possibly cater to.
But this is why you’ve got an accessibility standard you have to meet: it tells you where you’ve got to work and where you must just leave alone.
Though if you’re in the US and doing Section 508, I gotta say: it’s got some rules in there that can hurt accessibility/usability. They are based on the old WCAG1 documents. I’ve seen sites that technically follow 508 to the letter (the last one I looked at was the State of Michigan website, though that was about a year ago) and it was not terribly user-friendly or necessarily very accessible. For example, alt text was filled in for images, but the text itself was kinda like, wut? All the 508 checker did was check that alt text was filled in.
A good U/A company will check that the alt text makes sense in its context, for example.
Someone should go through your site with a screen reader (or two), a screen magnifier, maybe a refreshable Braille device. Keyboard-only, Javascript on or off (WCAG2 is much more lenient about scripts… you may have them, but frankly don’t allow the lack of them to break anything on your site). Images on or off (mobile devices may block to preserve bandwidth). Printing?
Also, you may want to talk with Rguy84 here on the forums. Maybe I’ll twot this thread to him; he has to deal with gov’t accessibility sites himself.