I have only recently come across the term A/B testing.
From what I gather, it means creating two versions of a webpage and testing which one performs better; from this you'd continually improve your conversions. I am a little new to this, so forgive any amateur-style comments.
I've been taking a look at Unbounce and Visual Website Optimizer, which help you with this. Personally I dislike WYSIWYG editors; they take the fun out of web design. Many of those sites have their own visual editors embedded, which put me off the entire idea.
Another issue I struggle to understand: A/B testing involves two variations of the site, so would this not automatically hurt your SEO, since you'd have a duplicate version of your website?
Being so new to this, I'm also surprised that some of the websites offering A/B testing charge crazy figures! Maybe there is something I missed here; if anybody knows a little more about what this is and how it works, it would really help us understand.
That's pretty much it – you randomly switch between the two versions of the site for different visitors, and you track what happens ... do people stay on the site, do they follow a smooth path to their goal or bounce all over the place, do they convert to a sale? Focus groups and user testing are all very well, but nothing beats real live visitors for seeing how well a particular design works in practice.
As far as SEO and search engines are concerned, there's only one version of the page. Visitors don't see whether they've gone to example.com/version-a or example.com/version-b – as far as they are concerned, they've just gone to example.com. You would expect the differences between the versions to be fairly subtle, and the general content and theme to be the same. So while search engines might notice slight differences from one visit to the next (if at all – it's common practice to log users and serve them the same version each time they come back, to avoid confusion), it would look like minor tweaks being made to the page rather than a whole new page.
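The "same version each time they come back" idea can be sketched with a tiny bucketing function. This is a hypothetical illustration, not any particular tool's implementation: hashing a stable visitor ID makes the assignment deterministic, so a returning visitor always lands in the same bucket.

```javascript
// Minimal sketch of sticky 50/50 bucketing (hypothetical, not how any
// specific tool does it). Hashing a stable visitor ID makes the
// assignment deterministic, so a returning visitor always gets the
// same version without extra bookkeeping.
function assignVersion(visitorId) {
  let hash = 0;
  for (const ch of visitorId) {
    // Simple 32-bit rolling hash; >>> 0 keeps it an unsigned integer.
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "version-a" : "version-b";
}

// The same visitor lands in the same bucket on every visit:
const first = assignVersion("visitor-1234");
const again = assignVersion("visitor-1234");
// first === again always holds, so repeat visitors see a stable page.
```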
I thought this was it.
I see that Google offers a free alternative (http://www.google.com/analytics/features/conversion-suite.html), which is nice of them, though I'm not entirely sure it offers A/B testing.
Looking at some of the alternatives, they appear somewhat pricey. I would struggle to justify those prices to clients. Fifty dollars per month for a package does not sound bad at first; however, that adds up to 600 dollars per year.
Is there not a cheaper way to implement this, and why is it so expensive? I normally look at bringing the price of services down by going for 'bulk' options, reducing the overall price for the client while still making some profit.
I work with Visual Website Optimizer and I'll attempt to provide some explanations here (pardon me if I get too salesy).
The nefarious WYSIWYG editor
Just a little background: an A/B test is when you create two versions of an element and divide visitors 50/50 between them. A simple example: a red "Submit" button versus a green one, and then see which one results in more submits.
Split URL testing = you create two versions of a page where the goal page is the same. For example, your website is www.example.com and you have a new theme at www.example.com/new, so you split users 50/50 between your original site and the new theme. Then you measure which one leads to increased sales, which you track by when a customer reaches the thank-you page (let's assume www.example.com/checkout/thank-you.php). The best part about a split URL test? You don't need to go through a WYSIWYG editor.
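The mechanics described above can be sketched roughly as follows. This is a simplified stand-in for what a real tool does (the example.com URLs are the illustrative ones from this post): visitors are split 50/50 between the two URLs, and a conversion is counted when either group reaches the thank-you page.

```javascript
// Simplified split URL test bookkeeping (illustrative only; a real tool
// also persists each visitor's assignment and reports significance).
const results = {
  original: { visitors: 0, conversions: 0 }, // www.example.com
  variant: { visitors: 0, conversions: 0 },  // www.example.com/new
};

// Called once per new visitor: pick a side at random and record it.
function routeVisitor() {
  const group = Math.random() < 0.5 ? "original" : "variant";
  results[group].visitors += 1;
  return group;
}

// Called when a visitor reaches /checkout/thank-you.php.
function recordConversion(group) {
  results[group].conversions += 1;
}

function conversionRate(group) {
  const { visitors, conversions } = results[group];
  return visitors === 0 ? 0 : conversions / visitors;
}
```

At the end of the test you compare conversionRate("original") against conversionRate("variant") and keep the winner.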
About that, here's some detail: the WYSIWYG editor is there because most often it is marketers who handle A/B testing. However, we know that some people don't like them, and so we have the Advanced Code Editor. Using that, you can make changes in HTML, CSS and JS without any WYSIWYG irritation. Also consider that when you're doing a lot of testing for many clients, you'll appreciate not having to dive into the code every time a variation has to be created.
The deal with Google Search Rankings and A/B testing
Google cleared the air about this through their blog post http://googlewebmastercentral.blogspot.in/2012/08/website-testing-google-search.html
Most important bits from that post are:
Cloaking—showing one set of content to humans, and a different set to Googlebot—is against our Webmaster Guidelines, whether you’re running a test or not. Make sure that you’re not deciding whether to serve the test, or which content variant to serve, based on user-agent. An example of this would be always serving the original content when you see the user-agent “Googlebot.” Remember that infringing our Guidelines can get your site demoted or removed from Google search results—probably not the desired outcome of your test.
If you’re running an A/B test with multiple URLs, you can use the rel=“canonical” link attribute on all of your alternate URLs to indicate that the original URL is the preferred version. We recommend using rel=“canonical” rather than a noindex meta tag because it more closely matches your intent in this situation. Let’s say you were testing variations of your homepage; you don’t want search engines to not index your homepage, you just want them to understand that all the test URLs are close duplicates or variations on the original URL and should be grouped as such, with the original URL as the canonical. Using noindex rather than rel=“canonical” in such a situation can sometimes have unexpected effects (e.g., if for some reason we choose one of the variant URLs as the canonical, the “original” URL might also get dropped from the index since it would get treated as a duplicate).
Use 302s, not 301s.
Only run the experiment as long as necessary.
The amount of time required for a reliable test will vary depending on factors like your conversion rates, and how much traffic your website gets; a good testing tool should tell you when you’ve gathered enough data to draw a reliable conclusion. Once you’ve concluded the test, you should update your site with the desired content variation(s) and remove all elements of the test as soon as possible, such as alternate URLs or testing scripts and markup. If we discover a site running an experiment for an unnecessarily long time, we may interpret this as an attempt to deceive search engines and take action accordingly. This is especially true if you’re serving one content variant to a large percentage of your users.
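The two mechanical recommendations in that post (rel="canonical" on every variant URL, and 302 rather than 301 for the redirect) boil down to something like the sketch below. The URLs are the example ones from earlier in the thread, and the functions are hypothetical helpers showing the shape of the output, not any framework's API.

```javascript
// Hypothetical helpers illustrating Google's two recommendations for
// multi-URL tests. Not a real framework API.

// 1) Redirect test traffic with a 302 (temporary), never a 301
// (permanent), so search engines keep the original URL indexed.
function testRedirect(variantUrl) {
  return { statusCode: 302, headers: { Location: variantUrl } };
}

// 2) Every variant page declares the original as canonical in its
// <head>, so variants are grouped as duplicates of the original.
function canonicalLink(originalUrl) {
  return `<link rel="canonical" href="${originalUrl}">`;
}

const redirect = testRedirect("http://www.example.com/new");
const link = canonicalLink("http://www.example.com/");
```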
Tools of the Trade
Free tools are:
- Google Content Experiments - This is Google's free solution. Currently it only allows split URL testing. See http://support.google.com/analytics/bin/answer.py?hl=en&answer=1745216 for how to start a new experiment
- MaxA/B - I noticed that you do a lot of work using WP, so I think this should work really well for you. It's a free WP plugin for Split URL testing (even though it calls itself MaxA/B). Here's the link http://wordpress.org/extend/plugins/maxab/
- A/Bingo - A/Bingo is a free RoR solution most famously used by Khan Academy. Link: http://www.bingocardcreator.com/abingo
- Genetify - Free JS based A/B testing. Project home page is https://github.com/gregdingle/genetify/wiki
Paid tools abound. WhichMVT.com has a good comparison.
Why pay so much?
Because this stuff will make your clients bucketloads of money. No, seriously: A/B testing tools increase the rate at which websites convert on their business goals by a huge margin. Take a look at the Visual Website Optimizer case studies to see how simple changes have a huge impact.
Also, if you'll care to, please have a look at an article I recently wrote for Webdesigntuts+ titled "A Web Designer’s Introduction to A/B Testing". It can be found at http://webdesign.tutsplus.com/articles/general/a-web-designers-introduction-to-ab-testing/
I hope this helps. Feel free to ask any questions. I'll happily answer them without any of the sales mumbo jumbo people in my line of work usually resort to.
Of course! What you're talking about is a split testing platform which, when done right, is far more than just dynamic serving, as it also maintains a record of performance in order to gauge the effective lift of each option while allowing for deployment from a single, web-based location. There are many tools that do this, but if you already use Google Analytics the most logical is Google Website Optimizer, which can be found under the Content section as "Experiments".
Note: you can also run more advanced multivariate tests, but don't get ahead of yourself too fast. Most gains can be seen with A/B or A/B/C/... methodology. The trick is to test with rigor, wait for statistical significance [5 sign-ups does not a trend make] and then try to understand why.
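The "wait for statistical significance" point can be made concrete with a standard two-proportion z-test. This is a textbook formula sketched with made-up numbers; real testing tools compute it for you.

```javascript
// Two-proportion z-test: is B's conversion rate really different from
// A's, or is the gap just noise? (Standard formula; numbers are made up.)
function zScore(convA, visitsA, convB, visitsB) {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  const pooled = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se;
}

// |z| > 1.96 is the usual ~95% confidence threshold.
// 5% vs 7% conversion on 1000 visitors each looks like a clear win,
// but z comes out around 1.9 - still just short of significance.
const z = zScore(50, 1000, 70, 1000);
```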
Loads of them actually. Free ones are Google Content Experiments, MaxA/B, Genetify, A/Bingo. For paid tools, http://whichmvt.com has a good comparison.
Thanks for the great reply. I must say that you've certainly covered a lot more topics than I would have intended. A/B testing appears to be very important. Thanks for clarifying the Google issue with having duplicate content.
Is this only used on landing pages, or would it apply to a traditional homepage?
Is having two variations of your website or landing page not confusing? Also, can you have more than two variations, or would that go against typical A/B split testing?
I guess it's not just about SEO after all; now we've opened up a whole new area of marketing. :)
Thanks Stevie, your explanation did settle my initial thoughts on what A/B testing is. Pretty simple, but very powerful when used correctly.
A/B testing can be used on any step of the buying funnel. Not only landing pages, but also homepages, product features, different sales promos, pricing structures, checkout pages, trust badges, form lengths, colors, website copy.....basically, almost anything that affects visitor behavior on the web can be A/B tested.
And you can certainly have more than one variation, that's what we like to refer to as an A/B/n test. For example, I'm running a test with 4 variations of copy for the primary call to action button on the Visual Website Optimizer homepage.
We've used both Visual Website Optimizer (VWO) and the Content Experiments supplied within Google Analytics (GACE) to test whether a trust badge has an effect on product sales, newsletter signups and general enquiries. I've also used Unbounce a bit.
VWO is good software. It is relatively easy to use and the WYSIWYG is good if you have a simply coded website. I haven't used it lately on our eCommerce sites as BigCommerce withdrew direct integration with VWO (for whatever reason) and kept Google Analytics Content Experiments (GACE). The reporting is also good and offers quite a bit of depth. I would use VWO again if running a reasonably simple website. VWO staff might have something to comment on that.
Unbounce is very good for smoke testing (Google "smoke testing") concepts. For example, if you had product XYZ to sell and you wanted to see what percentage of visitors from Bing Advertising would sign up for it, and how much that would cost you ... you would use Unbounce. It is quick and easy to use. A designer can easily customise the templates so they look better. And it's great for running 10 or so iterations of a page just using keyword insertion to create the differences. I would only use Unbounce for single landing pages.
But! At the moment we are purely using GACE for split testing. It's super simple to understand: we simply create two separate versions of a page, or just add one new trust badge to a page, and then run the tests. The reporting is all recorded in Google Analytics, which means one less login. So even though we're technically capable of using more complex products, we end up using GACE because it's super easy.
As to the concept of A/B split testing I highly recommend it to anyone. In fact, I think absolutely everyone should be testing even if only for simple things. For example, adding a Trusted Website badge has lifted online enquiries for Alsco dot com dot au from 3% to 20% on some days (happy to post proof screenshots). Overall the increase has been around 6% to 8%. This was achieved simply by placing a Trusted Website badge next to the Enquiry button.
We are also testing the same badge on an email newsletter subscription and adding it next to an 'Add to Cart' button. We're expecting to see similar increases in uptake.
GACE is great for these simple tests, but GA overall is good for testing at deeper levels. For example, on some of our larger sites we can actually test conversion rates based on which server we are using to distribute content. Subtle differences but magnified when the site has millions of unique views.
So ... I would go with the Content Experiments in Google Analytics to start with. Get your head wrapped around the simple changes you make to your website to see how they affect your 'action' button. You want more email subscriptions? Change the colour of the button for the first experiment. Leave it at that. Then add a 'Trusted Website' badge for the second experiment. See what happens.
Just keep it simple, but definitely do it.
I'm new to SEO. I have 2 queries:
1) Is A/B testing applicable only to e-commerce sites? If not, how do you perform it on a normal site?
2) For a website without a signup or registration form, what type of A/B testing is performed?
It depends what metrics you use to determine success.
If your website has no commercial function then your success criteria might be to reduce bounce rate, increase engagement, get comments, have visitors navigate sensibly around the site, get repeat visitors - that kind of thing. Whatever you consider "success" is what you need to measure in your A/B testing.