Design & UX
By James Edwards

Don’t make users take responsibility for our problems


You know what, I really hate CAPTCHA.

The other day I was speculatively signing up for a Facebook account (I’m not particularly interested in being on Facebook, I just wanted to have a nose around its code), but signing up proved tricky. “You didn’t correctly type the text in the box,” it said, referring to the security-check image of two words you have to type in to confirm you’re a real person.

Yeah sure, except I can’t see any words; all I can see is a message that says Loading….


Now admittedly that’s a bad example — presumably they’re using some dodgy Ajax which doesn’t work in Opera for Mac (my browser of choice), and usually such images are generated server-side without the need for scripting, hence this problem doesn’t occur.

But I still hate them, because CAPTCHA tests are an accessibility black spot. What are you supposed to do if you have a reading or cognitive disability and simply can’t make them out? Man, I have perfect 20/20 vision, and more often than not I can’t read the damn things; it’s very common for me to have to make three or four attempts before I get one right.

(I might also point out that a CAPTCHA is not a true Turing test — it only tests cognitive ability, not intelligence; so not only does it fail for some humans, it can be read by a machine with suitably sophisticated character-recognition technology.)

Some CAPTCHA systems offer an audio alternative, and that’s certainly an improvement, but it’s still not good enough. What if you don’t have a sound card? What if you’re blind and deaf and can only receive information through a braille feedback device? What if, as with the visual test, you simply can’t make it out? CAPTCHA sucks because it creates a barrier for users where there needn’t be one.

“Needn’t be one?” I hear you exclaim. “How else do we protect against bots?” Well, now I’m going to have to step back from the practical to the more conceptual point, which is the nub of why I really hate them so much. Using a system like this makes users take responsibility for our problem. Bots are our problem, not the users’, and it’s totally unfair to pass the buck.

A similar example is those email opt-in services, where before you can send a person email you have to go to some website and confirm that you’re a real person, give a reason why you want to contact them, and in many cases pass a CAPTCHA as well. This, again, is not fair — it’s passing responsibility for spam onto legitimate people who want to contact us, and forcing them to deal with a problem that’s actually ours, not theirs.

So please, let’s not pass the buck – don’t make users take responsibility for our problems.

  • mactheweb

I agree with all your points James, yet still use captchas. Dealing with community forums and blog comments without them is nearly impossible. If you have a better way of slowing down comment and post spam to an easily manageable level I’d love to hear it. I’ve tried subscribing to lists of known spammers, I moderate, I programmatically throttle comments, but that’s not enough.

So the question is whether the captcha, which keeps out a small percentage of visitors, is less evil than cutting out comments for everybody. Yes, turn off comments; now that’s a real accessibility problem. Or do I make everybody sign up? In my experience that has a tendency to stifle conversation. A popular site like SitePoint can get away with it, but a new site starting out will have its growth seriously slowed by requiring registration. And I could go on and on about the evils of managing too many passwords, though for most users that’s less of a pain than captchas.

    Yes, captchas are a far from perfect solution but the alternative seems to be requiring subscriptions or killing two way conversation altogether. Am I insensitive to handicaps? Far from it. I have some mobility problems myself and appreciate accessible buildings. And I recognize that the situation isn’t inherently unfair, it’s just that many people simply can’t afford to do that.

  • Have to say I’m with mactheweb — I hate CAPTCHAs as much as anyone, and the Facebook ones are particularly difficult to read even for an able-bodied user. But what do you suggest as an alternative?

  • turb


At least offer some alternative, instead of just shouting about the problem. Fine, CAPTCHAs are not the best way to go, but when done well they’re an easy option for most websites.

It’s easy to complain about something, but harder to bring a solution. For me, your article would have been better if you’d pointed to at least one alternative.

  • @mactheweb: There is another solution, and I find it works extremely well on a per-site basis. Create questions based on your niche, and randomly ask them where you would put a captcha. For example, if you run a site aimed at men, ask something like “What kind of hair do men have on their face? Hint: f***al”. Or if you run a site about fishing, ask something along the lines of “What does a fisherman put on his hook to entice the fish to bite?”. As long as you ask questions based on your niche, or questions you feel no-one else would ask, and as long as you structure your questions in a unique way, I’ve yet to see bots get around my defences!

  • heggaton

    Personally, I don’t use them. That said, I’ve never needed to either.

    One thing I do is generate a unique ID when someone goes to a form page. I then add it to the user session and display the form – with the unique ID as a hidden element.

    When the user submits the form, if the IDs don’t match or there is no ID (or session), it fails.

    That said, I don’t think this would work for big sites like Facebook or MySpace, as people build bots that specifically target those sites. It does, however, work perfectly for feedback forms and the like on smaller sites. And it doesn’t affect the end user one bit!
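A minimal sketch of heggaton’s hidden-ID technique, in Python for illustration (the in-memory `sessions` dict and both function names are stand-ins for whatever session storage and form handling your framework actually provides):

```python
import secrets

# In-memory stand-in for real per-user session storage (hypothetical).
sessions = {}

def render_form(session_id):
    """Issue a one-time token, remember it in the session, and embed it
    in the form as a hidden field."""
    token = secrets.token_hex(16)
    sessions[session_id] = token
    return f'<input type="hidden" name="form_token" value="{token}">'

def accept_submission(session_id, posted_token):
    """Reject the POST unless it carries the token issued with the form.
    A bot that POSTs directly, without first fetching the form, has no
    valid token to send."""
    expected = sessions.pop(session_id, None)  # tokens are single-use
    return expected is not None and posted_token == expected
```

As the comment itself notes, this defeats generic bots that blindly POST to known form handlers, but not a bot written specifically against your site.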

  • Jeremy

    My favorite captcha (besides the Hot or Not mashup captcha, which fails in terms of accessibility, but is way more fun than any normal captcha) is the one a friend of mine implemented on his site. He prints a message “add 5 and 7 and enter the total”. Obviously a bot could be trained to overcome this, but that hasn’t happened yet, and it’s completely screen-reader friendly.

  • Katrina Youngman

    I have seen a couple of sites which use short little questions like: “1 + 1 = “. This would have to be my favorite method for separating real users from bots.

    It’s easy to implement, easy (and quick) for users to complete, and if implemented well it should be accessible to screen readers etc. It also saves people from having to give out their email address or sign up for an account.
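A rough sketch of such a question-based check, in Python for illustration (both function names are made up; plain-text questions like these stay screen-reader friendly):

```python
import random

def make_challenge():
    """Return a plain-text arithmetic question and its expected answer."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"What is {a} + {b}?", a + b

def check_answer(expected, submitted):
    """Accept the answer as digits, tolerating stray whitespace."""
    try:
        return int(submitted.strip()) == expected
    except ValueError:
        return False
```

The expected answer would be stored server-side (in the session, say) between rendering the question and checking the reply, never in the page itself.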

  • There are numerous alternatives to CAPTCHAs, most of them at least as effective. Some are contextual, such as the questions suggested by Dean C, and some are purely technical.

    One is creating a text field, hidden by CSS, with a name that will entice a bot to fill it in (e.g. email2 or something similar) but that humans would leave empty, as they can’t see it in the browser. A colleague of mine has a simple checkbox that says “this is not spam”, which is generally left unchecked by bots; a variation could randomly negate this (e.g. “this is spam”, which you have to uncheck to verify).

    If your site requires JavaScript to work anyway, you can easily employ it to test for bots as well. For example, you could have a hidden field’s value altered via onclick on the submit button (bots don’t click), or a simple confirm() dialog on form submit.

    Of course, each of these techniques has its own disadvantages, but they are all much more usable than a CAPTCHA image. You have to take into account the requirements of your site or application, do some testing, and then decide what to use.
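The honeypot and checkbox ideas could be validated server-side along these lines (a Python sketch; the field names `email2` and `not_spam` are just the examples from the comment above):

```python
def looks_like_bot(form):
    """Heuristic check over a submitted form, passed in as a dict.

    The CSS-hidden 'email2' honeypot should stay empty (humans never
    see it), and the visible 'not_spam' checkbox should be ticked.
    Naive bots tend to do the opposite on both counts."""
    if form.get("email2", "").strip():   # honeypot was filled in
        return True
    if form.get("not_spam") != "on":     # confirmation box left unticked
        return True
    return False
```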

  • I’m not necessarily into defending CAPTCHAs, but while it’s true there’s a cost to the user, allowing spammers and their bots easier access to your post costs users too.

    To give it a biological bent: the spambot is basically a parasite exploiting the ‘host’ content, and like all parasites it lives by siphoning off the health of the host while providing nothing in return. If the parasite isn’t controlled — and the cost becomes too much — the host will deteriorate until it dies. We’ve probably all seen the gory aftermath of a decent blog overrun by rampant spam.

    Ultimately that’s at a cost not just to the writer, but all their users too.

  • Php_penguin

    A simple solution is to provide a 3×3 grid of varied pictures and say “click on the duck” or similar. However, I’m not sure how you’d convey this information to blind or deaf-blind people… but it sure is easier for fully-abled people.

  • nadiaMQ

    I had the exact same problem with Facebook. For a while I thought I was just plain stupid, but now that I’ve read your blog it’s refreshing to know that others experience the same problems!

  • Anonymous

    “you could have a hidden field’s value altered via onclick on the submit button (bots don’t click)”

    Neither do keyboard users though…

  • wwb_99

    @Anon: keyboard users do click, or at least should fire the JavaScript onclick event. Spam bots usually just mechanically submit the form via POST, so JavaScript is a very effective defense.

    I have recently turned to Ajax to handle our “tell a friend” and “contact us” tasks. It seems to be holding up well against bots, while being a better experience for most users. Neither function is essential (we always provide an email address and a phone number too, so we can be contacted via normal means), so I don’t have too much accessibility heartburn.

    The real crying shame of these situations is that the reasonably easy ways to solve comment spam can often break accessibility, leaving a developer to choose between enabling spammers and being a bad human being.

  • Vincent

    So what do you propose instead of captchas to prevent spam?

  • WarpNacelle

    I didn’t create spam bots; I use CAPTCHA to counter them. How’s that my problem?

  • Spam is worse than captchas, but both are very annoying. There are some helpful ways to decrease spam, but you won’t get rid of it all, and captchas aren’t bulletproof anyway.

  • jboehman

    Try Akismet as a CAPTCHA alternative:

  • How about SAPTCHA (Semi Automatic Public Turing Test to Tell Computers and Humans Apart) instead of CAPTCHA?

    For example, there’s WS-Gatekeeper for WordPress blogs:

    or NoSpam! for vBulletin:

    A couple other alternatives are outlined at:

  • Email confirmation is necessary to confirm that it was you who signed you up to a service (a newsletter).

    As for CAPTCHA, I do not use them, I prefer to use something like akismet.

  • Yeah, Eric Meyer’s Gatekeeper is the best alternative I’ve seen. It’s still not perfect, because it places a cognitive load on the user (even if a tiny one), and because it’s eminently breakable by brute force; to make it work you’d really need a unique set of frequently-changing questions for each site.

    The “honey pot” idea is also a nice one — having a field hidden with CSS that only bots will fill in. There’s still an accessibility problem there for non-standard devices that will see it, but I think it’s still better than CAPTCHA.

    btw – turning off comments entirely is not an accessibility issue, because it’s the same for everyone. But comments aren’t really that important anyway – it’s mission-critical uses like signing up for an account in the first place, or verifying financial transactions, that I object to (and ironically, those are also where bot protection is most necessary, but there you go!)

    Bots are “our problem” because it’s up to us to find a way – we’re part of the same production chain, in a sense, whereas users are consumers. We benefit from users on our site more than users benefit from being on it, hence the prerogative is ours.

  • pixelsurge

    James, I completely agree. Standing ovation for you.

    I’m personally getting by with the hidden field using CSS trick right now. The label on the field says “Leave this field blank” so even people who do see it should leave it blank. And their submission doesn’t get discarded if they do fill something in — it just throws an error, so they can still go back and fix it.

  • aj510

    It’s all good and well to say captcha is bad because of this and that. But have you seen what bots do to a forum/blog/toplist/etc. that’s left without captcha or any other form of protection against bots?

    Hundreds of thousands of posts can be generated on a site in a matter of months. It’s ridiculous to think that asking users to enter a small code is any great task; after all, it’s for their benefit too.

    I use captcha and also the hard coded questions on my sites and find that the combination stops spam bots dead. If you can come up with a better solution, then by all means let us all know.

  • steve_friend_of_brothercake

    CAPTCHAs, as defined by the acronym, are GOOD.
    What sites throw up with those stupid graphics is bad, and often inaccessible.

    I use a combination of session control and “toggle” fields as described above with 100% success.

    I also use some POST content analysis (scanning for URLs and rejected words) to reject some of the HUMAN bots which seem to be coming out of CHIANDONG Province and parts of Malaysia. I believe it’s some kind of agent software which is able to RECOGNIZE that a captcha is in place, and prompts a human so that THEY can enter the value and allow the post.


  • Spam wasn’t a problem with the ColdFusion websites I coded until the past year. I first tried the session method with little success – in fact it didn’t seem to filter out anything. Like steve_friend_of_brothercake, what has worked for me is the “honey pot” method and content analysis.

    For the “honey pot” method, I use a CSS-hidden div to hide four fields (IamSpam, MyEmail, MyComments, and a random-character field like w3Rtp) that are then checked upon submit. To answer the accessibility question about the “honey pot” method: the hidden div is placed BELOW the submit button and is preceded by the statement “Anti-Spam fields hidden with CSS: Please DO NOT change the information in these fields.” That warning text is in the hidden div as well, so only users with stylesheets turned off, or using a screen reader, will come across it.

    The code that I wrote to analyze the content of the form fields looks for the same data in more than one field (bots commonly enter the same data more than once!) AND it looks for any type of url (http://, a href, etc.). The key to the content analysis is making sure that the options available to the user through the form do not give them a chance to submit duplicate data or urls. If so, then those fields must be left out of the check.

    So far it has been 100% effective and I now have it in include files so I can easily implement it in a new site. I’ve been meaning to create a cfc using the same code but just haven’t done it yet!
