Ticketmaster abandon captcha in favour of asking questions

Ticketmaster, the world’s biggest online ticket retailer, have abandoned CAPTCHA in their online forms in favour of asking questions and other easier-to-use options.

I’ve always hated having to use captcha - could this be the start of its demise?

Nice article!

I’m with you in hating captchas - those things are a real barrier to most, if not all, users.
Let’s hope this is the shape of things to come.

@Stomme_poes will also no doubt be pleased to hear this.

Asking questions is still a CAPTCHA. Anything that tries to distinguish between real people and bots is a CAPTCHA.

My favourite type of CAPTCHA is one that looks at how long it takes to fill out the form since even fast typists can’t fill out the form as quickly as bots do. With that type of CAPTCHA the person filling out the form doesn’t even know that the CAPTCHA is there unless they really can type as fast as a bot.

I didn’t know that. The timing solution sounds very attractive. How is the time measured?
Page load -> submit?
Page load -> page close?
Mouse-down on a form element -> submit?

Between which points is the measurement made, and how?

Regards

No matter, I’ve found the info I was looking for - here’s a list of alternative CAPTCHA methods (list found here):

Timed Load: This method simply keeps track of the load time of the form, then compares this to the submit time. If the form was loaded and submitted in less than, say, one second, then you can be pretty sure the form was submitted by a bot, and the data should be discarded (there’s a sketch of this after the list).

Hidden Fields: Sometimes known as ‘Honeypot Traps’, these are empty hidden fields which can be added to a form and checked for filled values on processing. This can be done using the hidden input type in HTML, or by using CSS to hide the input from the user (but not from bots) - see the honeypot sketch after the list.

Tokens: Generating a one-time token for a user on form load, then comparing it when processing the form. This blocks externally made requests to the form (a common method for bots and malware) - sketched after the list.

Dynamic Form Loading: As the vast majority of bots cannot parse JavaScript, critical components of a form (or all of it) can be added to the webpage on load; this can be made a breeze with a JavaScript abstraction library such as jQuery (a small sketch follows the list).

Submission Confirmation Page: This method involves adding an extra page between the form and a submission, where users are asked to confirm the details they have entered. This method works because most bots are designed for single page entry and submission of form data. Implementing an extra page in this manner for every form does force extra interaction from the user, but has the added benefit of giving them an opportunity to verify the information entered, which can lead to more effective data capture.

Image CAPTCHA: Use easily recognisable images and have directions such as ‘Click on the Monkey’. To an extent this solution is moving the problem (as we are still asking the user to commit an action to prove they are not a bot) but it can reduce the effect on usability if done well.

Math CAPTCHA: Ask the user to answer randomly generated (but simple) maths questions. In most cases this will be simpler than standard CAPTCHA; however, this still requires the extra user interaction, and some bot makers are becoming wise to this method due to its popularity (sketched after the list).

Variable CAPTCHA: Use a standard CAPTCHA solution, but only use it when activity is suspicious (multiple submissions from the same IP address, for example) - see the last sketch after the list.

Log everything / Discovery: This is not an alternative as such but a way for you to possibly discover patterns with bot form submissions which could allow for simple filtering rules for form submission data. (‘Log everything’ is always a good mantra when working on the web).
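
For anyone wondering what the timed-load check looks like in practice, here’s a minimal sketch in TypeScript with Node/Express (the /contact route, the hidden loaded_at field and the 2-second threshold are my own placeholders, not anything from the list’s author). A real implementation would sign or store the timestamp server-side, since a bot that parses the form could otherwise forge the value.

```typescript
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: false }));

const MIN_FILL_MS = 2000; // anything submitted faster than ~2s is treated as a bot

app.get("/contact", (_req, res) => {
  // Embed the render time in a hidden field (sign or store it server-side in production).
  res.send(`
    <form method="post" action="/contact">
      <input type="hidden" name="loaded_at" value="${Date.now()}">
      <label>Message <textarea name="message"></textarea></label>
      <button type="submit">Send</button>
    </form>`);
});

app.post("/contact", (req, res) => {
  const elapsed = Date.now() - Number(req.body.loaded_at);
  if (!Number.isFinite(elapsed) || elapsed < MIN_FILL_MS) {
    // Submitted implausibly fast (or with a mangled timestamp): discard as a likely bot.
    return res.status(400).send("That looked automated - please try again.");
  }
  res.send("Thanks!");
});

app.listen(3000);
```

If the threshold turns out to be too tight, your server logs (as suggested further down the thread) will show how fast genuine submissions actually are.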
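
A honeypot sketch along the same lines - again TypeScript/Express, and the field name website and the .hp class are made up for the example. The field is invisible to people, but most bots fill in every input they find.

```typescript
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: false }));

app.get("/signup", (_req, res) => {
  // The "website" field is the honeypot: hidden from people via CSS, ignored by screen
  // readers via aria-hidden, but still present in the markup for bots to fill in.
  res.send(`
    <style>.hp { position: absolute; left: -9999px; }</style>
    <form method="post" action="/signup">
      <label>Email <input name="email" type="email"></label>
      <div class="hp" aria-hidden="true">
        <label>Website <input name="website" tabindex="-1" autocomplete="off"></label>
      </div>
      <button type="submit">Sign up</button>
    </form>`);
});

app.post("/signup", (req, res) => {
  if (req.body.website) {
    // A human never sees this field, so any value means a bot filled it in.
    return res.status(400).send("Submission rejected.");
  }
  res.send("Welcome aboard!");
});

app.listen(3000);
```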
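
The token idea, roughly (the /comment route, the in-memory Map and the 30-minute expiry are assumptions for the sketch; in production you’d tie the token to a session or a proper store):

```typescript
import express from "express";
import { randomBytes } from "node:crypto";

const app = express();
app.use(express.urlencoded({ extended: false }));

// token -> time it was issued
const tokens = new Map<string, number>();
const TOKEN_TTL_MS = 30 * 60 * 1000;

app.get("/comment", (_req, res) => {
  const token = randomBytes(16).toString("hex");
  tokens.set(token, Date.now());
  res.send(`
    <form method="post" action="/comment">
      <input type="hidden" name="token" value="${token}">
      <textarea name="comment"></textarea>
      <button type="submit">Post</button>
    </form>`);
});

app.post("/comment", (req, res) => {
  const issued = tokens.get(req.body.token);
  tokens.delete(req.body.token); // one-time use: consume it whether it was valid or not
  if (issued === undefined || Date.now() - issued > TOKEN_TTL_MS) {
    // No token, an unknown token or a stale one: the request didn't come from our form.
    return res.status(400).send("Invalid or expired form token.");
  }
  res.send("Comment received.");
});

app.listen(3000);
```

Since the server already knows when each token was issued, combining this with the timed-load check above is practically free.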
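
Dynamic loading doesn’t even need jQuery for something this small. A plain-TypeScript sketch (the #contact-form selector and the js_enabled field name are placeholders; the server would then reject any submission that arrives without that field):

```typescript
// Runs once the HTML is parsed; bots that never execute JavaScript never get the field.
document.addEventListener("DOMContentLoaded", () => {
  const form = document.querySelector<HTMLFormElement>("#contact-form");
  if (!form) return;

  // Inject a marker field that the server treats as required.
  // (The submit button itself could be injected the same way for a stricter version.)
  const marker = document.createElement("input");
  marker.type = "hidden";
  marker.name = "js_enabled";
  marker.value = "1";
  form.appendChild(marker);
});
```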
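
A maths-question sketch, with the same caveats as above (made-up /register route, answers kept in an in-memory Map keyed by a random challenge id):

```typescript
import express from "express";
import { randomBytes } from "node:crypto";

const app = express();
app.use(express.urlencoded({ extended: false }));

// challenge id -> expected answer; a session store would do the same job
const challenges = new Map<string, number>();

app.get("/register", (_req, res) => {
  const a = Math.floor(Math.random() * 9) + 1;
  const b = Math.floor(Math.random() * 9) + 1;
  const id = randomBytes(8).toString("hex");
  challenges.set(id, a + b);
  res.send(`
    <form method="post" action="/register">
      <input type="hidden" name="challenge_id" value="${id}">
      <label>What is ${a} + ${b}? <input name="answer" inputmode="numeric"></label>
      <button type="submit">Register</button>
    </form>`);
});

app.post("/register", (req, res) => {
  const expected = challenges.get(req.body.challenge_id);
  challenges.delete(req.body.challenge_id); // each question can only be answered once
  if (expected === undefined || Number(req.body.answer) !== expected) {
    return res.status(400).send("Wrong answer to the maths question - please try again.");
  }
  res.send("Registered.");
});

app.listen(3000);
```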
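
And finally a very rough take on the “only challenge when suspicious” idea: count submissions per IP and only fall back to a full CAPTCHA once the count looks bot-like (the /feedback route, the threshold of 5 and the 10-minute window are arbitrary, and the CAPTCHA fallback itself isn’t shown):

```typescript
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: false }));

// ip -> submissions seen in the current window
const counts = new Map<string, number>();
const SUSPICIOUS_AFTER = 5;
setInterval(() => counts.clear(), 10 * 60 * 1000); // reset the window every 10 minutes

app.post("/feedback", (req, res) => {
  const ip = req.ip ?? "unknown";
  const seen = (counts.get(ip) ?? 0) + 1;
  counts.set(ip, seen);

  if (seen > SUSPICIOUS_AFTER) {
    // Only now would you serve a full CAPTCHA challenge; everyone else never sees one.
    return res.status(429).send("Please complete the CAPTCHA to continue.");
  }
  res.send("Thanks for the feedback!");
});

app.listen(3000);
```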

steeples fingers

“Eeeexcellent, Smithers…”

Each of the captchas listed above has its pros and cons. Anything based on images potentially hurts the visually impaired, while simple questions like math not only can easily be done by a bot, but can also hit people with cognitive disorders (though I would suppose even those with dyscalculia could add 2 and 2… then again, I’ve managed to get 5 myself…). This old page lists some ideas too (the honeypots, timing and tokens again, and then random hidden input hashes and field names too).

Combining techniques on a page, and then doing some user testing (or having some people test various edge cases or assistive technology), would probably be a good way to go. Or have everything sent to the spambox still be checked manually (depending on your volume).

One thing to keep in mind about timing: while the approach felgall mentioned is usually the better way (bots are faster than humans, whereas humans, especially those with disabilities, can be very, very slow), be aware of instances where someone spends a lot of time sitting on the form page (maybe reading it, or it’s open in another tab, or the kid starts screaming), or a lot of time on a thoughtful comment, and then for whatever reason the page session fails… so the user ctrl-a’s their response, reloads the page (or loads it in another tab or browser), pastes the response, and hits send. They’ll be much faster than (normal) humans in that case, but they are not a bot. So measure the time from page load to submit with the assumption that someone may be pasting an answer but still filling in the other parts of the form by hand (or the browser may auto-fill those).

Depending on your page/site, you may want to be accessible to those not using JavaScript (if your page in general requires JavaScript, then probably not). Also, if your site is targeted at mobiles, you can use JavaScript but may not want a heavy library like jQuery. Try jQuip or any of the mobile-focused JavaScript frameworks if you want ease of implementation.

Wouldn’t it be safe to say that bots would head straight for the form, fill it in and submit it, probably in less than one second? If the time limit was set to, say, 2 seconds (page load to submit), a human user would be unlikely to beat that even if the form was auto-filled. Even if they did, an explanatory note on the ‘failure’ page would be in order.

I don’t understand what you mean re. sessions. Wouldn’t that be the case for any page that uses a session, regardless of whether a form was included on that page or not?

Regards

“Sessions” was probably a bad word… there have been many times when I couldn’t submit a form for some reason and needed to reload the page or open it in a new tab to post, and I can’t remember why I had to do that. There are also some forms with a time limit for filling them in, which does actually use a session (the Netherlands’ government DigiD sites do this - to prevent someone leaving a computer unattended and getting to their tax info? Or to prevent there being enough time for any possible MITM attack? I’m not sure of the reason, but I’ve seen people being advised to “time out” a form if filling it out takes too long, for security reasons. Usually this is bad for the general public, since those using AT tend to fill out forms much more slowly than users without disabilities or users with the newest hardware/software).
If I take too long filling out a form that requires me to be logged in with my DigiD (the welfare/work site uwv.nl also requires this), I’m forced to log in again. Logins only last about 15 minutes, but on the uwv.nl site, for example, those 15 minutes can be extended/renewed when you make a new page request.

Yeah, 2 seconds is probably safe, 5 seconds might be better. You can probably look at your server logs to see time differences between form page requests and submits, and see what the fastest ones tend to be.

I’ve only triggered a “FAIL U might B a bot lol” message once.

Off Topic:

hey, the earlybird gets the worm, but the 2ndmouse gets the cheese

ok thanks - I follow your thinking now. I might try it for a while on one of my live forms and, as you say, check the logs.

By the way, you’re the first to cotton on to my user name, but hey, it’s an old adage.

Cheers