3 A/B Testing Tools Compared

By Jacco Blankenspoor

In my recent article that covered Google Analytics alternatives, someone asked a question in the comments about A/B testing. In response, two tools were mentioned: Optimizely and Visual Website Optimizer. In this post, I’m going to review these in addition to another one called Google Analytics Content Experiments, which offers simplified (but free) A/B testing functionality as part of the Analytics suite.

If you’re new to A/B testing in general, I recommend Kerry Butters’ articles Are Most Winning A/B Test Results Misleading? and The Designer’s Guide to A/B Testing, both of which include more basic info on what A/B testing is and why it’s important.

Now let’s go on to the reviews.

Optimizely
Let’s start by looking at Optimizely. They offer a decently priced entry plan with 2,000 visits per month, which is enough to run a few tests (or one large test). Optimizely requires you to insert a snippet of code in your header (installation is confirmed by email, which is a nice touch), after which you can use their dashboard. You just add a new experiment, and a guide takes you through all the necessary steps.

I prepared an experiment to test whether Google AdSense link units should be placed below the navigation links (the current placement) or above them. You can see the real page here to get an idea.

I can easily select the related code and make the switch. This is just a test, but if it were a real analysis I would also need to generate new AdSense code to measure the impact on earnings. But for now it’s the concept that matters.

Optimizely in action

If you run a very popular site, you can add one or more conditions to filter your traffic (like coming from a specific URL), as shown below:

Filter traffic with Optimizely

You can also set up goals to show up in your report, but I find the standard set very limited. You can compare on the basis of clicks, pageviews, or custom events that you have to set up yourself with JavaScript.
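
For reference, setting up such a custom event isn’t much code. The sketch below shows the queueing pattern Optimizely’s classic JavaScript API used at the time; the event name `sidebar-ad-click` is a hypothetical example, and the `window` object is stubbed so the snippet can run outside a browser:

```javascript
// Stand-in for the browser's global object, so this sketch runs anywhere.
// In a real page you would use the actual window object.
var window = {};

// Queue a custom event. The array form works even before Optimizely's
// snippet has finished loading; the snippet processes queued commands.
window.optimizely = window.optimizely || [];
window.optimizely.push(['trackEvent', 'sidebar-ad-click']); // hypothetical event name
```

You would then select this custom event as a goal in the experiment’s settings.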

In their sales pitch, Optimizely states that you don’t have to be a coder to get it working. But if you want to do some serious tracking, some coding is still necessary. I’d prefer a more visual approach; coding shouldn’t be necessary once you’ve placed their tracking script in your header.

Let’s say you want to compare different positions for a newsletter signup box, measuring the signup rate for each position. With Optimizely, you will have to insert some JavaScript code behind your form in addition to the tracking script. This seems redundant.
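
As a concrete sketch of what that extra code might look like (assuming the classic `trackEvent` command; the handler name, form id, and event name are all hypothetical), you’d attach a submit handler that records the conversion:

```javascript
// Stand-in for the browser's global object so this sketch runs anywhere.
var window = { optimizely: [] };

// Conversion handler for the signup form: records a custom event for
// whichever variation (signup-box position) the visitor happened to see.
function onNewsletterSignup() {
  window.optimizely.push(['trackEvent', 'newsletter-signup']); // hypothetical event name
}

// In a real page you would wire this up to the form, for example:
//   document.getElementById('newsletter-form')
//     .addEventListener('submit', onNewsletterSignup);
// Here we simulate one visitor signing up:
onNewsletterSignup();
```

The signup rate per variation then shows up as that custom event’s conversion count in the report.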

What about the results from my test? Well, this is the report I was shown.

An Optimizely Report

The results are still being collected, but this doesn’t really tell me anything, since the report doesn’t clearly define what these conversions are (probably clicks). This is just too limited.

Optimizely allows for a lot of integration with heatmap tools like CrazyEgg and ClickTale, and with various analytics tools (including GA). I think at these pricing levels at least some of that functionality should be included out of the box.

Optimizely looks impressive on paper and can certainly be a helpful tool if you don’t mind coding things. But I wish it were more click-and-play, so you could stay focused on the testing.

Optimizely offers a free 30-day trial.

Visual Website Optimizer

Now let’s look at Visual Website Optimizer (VWO), which claims you won’t need to code at all. Their pricing is quite steep, but there’s a free trial. VWO begins by taking you through the steps needed to set up your experiment. After that you are given the code to insert, but you can also use plugins for most popular systems like WordPress and Magento.

I made the same change to my AdSense link units as with the Optimizely test by editing the HTML. There are a few more ways to modify your page. The easiest way for my example would be using the “Rearrange” function, but that wasn’t working with the AdSense code.

Visual Website Optimizer in action

After selecting and changing the HTML, I needed to choose what I wanted to test. As you can see, I could be very specific about what to track when it comes to clicks on a link (if the test code allows for it). And there is a way to track signup forms. As with Optimizely, you can filter your traffic based on a whole range of conditions.

VWO filters

Choosing Current URL in VWO

A nice feature of VWO is that it comes with an integrated heatmap, though it’s not as advanced as a dedicated heatmap tool. Too bad you can’t compare both heatmaps in your A/B tests. And the maps aren’t very advanced either, since the coloring doesn’t really tell you much. But it’s a nice idea that could improve with a little more development.

VWO's Heatmap Feature

VWO comes with a summary report and a detailed report, though frankly these should be integrated with each other; the detailed report just adds some graphs and filters, nothing really in-depth. Still, the reports you do get are enough to help you analyze your results and act on them.

VWO's Summary Report

You get a little more information with VWO than with Optimizely. Engagement in this case is measured by overall clicks. Unfortunately, I wasn’t able to track the actual clicks on the AdSense ads in the end, because they’re served in an iframe.

Visual Website Optimizer gives the click-and-play experience I was missing with Optimizely, and gives you enough ways to measure how visitors respond to changes. The heatmap functionality is nice but not very useful in its current form. And even though the reporting is a bit more than basic, it still feels too simple.

Visual Website Optimizer offers a free 30-day trial.

Google Analytics Experiments

To conclude this tools comparison, I will have a look at Google Analytics Content Experiments (GACE), previously known as Google Website Optimizer. GACE is a free A/B testing tool that’s part of Google Analytics. It only allows for pure split-testing, and you have to make the different variations yourself (i.e. set up new pages).

Google Analytics Content Experiments

To use GACE, you start by setting up your experiment and defining your objectives (which can also be predefined goals in GA, which is very helpful). You can also run experiments on AdSense results. Then you just have to define two or more testing pages (one being the original), insert some code, and it’s ready. You can even use your sales funnels and start split-testing these.
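
Conceptually, what the inserted code does on the original page is simple: it assigns each visitor to one of the variation pages and redirects. Here’s a minimal, hypothetical sketch of that assignment logic (not Google’s actual snippet; the URLs are placeholders):

```javascript
// Hypothetical variation URLs; in GACE these are the separate pages
// you set up yourself (the first one being the original).
var variations = ['/page-original/', '/page-variation-b/'];

// Split traffic evenly; rand is a number in [0, 1), e.g. Math.random().
function chooseVariation(rand) {
  return variations[Math.floor(rand * variations.length)];
}

// In the browser, the experiment script would then redirect, roughly:
//   window.location.replace(chooseVariation(Math.random()));
```

Google’s real script also remembers the assignment per visitor, so returning visitors keep seeing the same variation.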

On one of my other sites, I have published a Liquid Web review. As you can see, there are prominent blue boxes displayed with a clear call to action. Changing colors requires you to copy the original page, and make the modifications yourself (like I did here). Also, you need to make sure your alternate page isn’t indexed by itself.
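
One common way to keep the copy out of search results (the URL below is a placeholder) is to point the variation page back at the original with a canonical link, or to exclude it from the index outright:

```html
<!-- In the <head> of the variation page; href is a placeholder URL -->
<link rel="canonical" href="http://example.com/original-page/">
<!-- or keep the copy out of search results entirely: -->
<meta name="robots" content="noindex">
```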

After you set up your experiment, you will immediately run into a major downside of GACE: you have to wait one to two days to see results, unlike the other two tools, which are real-time. This means you can’t instantly act on incoming results. With A/B testing, it certainly helps to test across several days, but if you are testing minor tweaks you’ll want to run a few variations per day so you can be sure which changes affected your conversions.

GACE doesn’t allow you to filter your traffic when setting up your experiment; you can only choose what percentage of your traffic should be included in the test.

Another (unnecessary) disadvantage of GACE is that it changes your URL to track the experiment:

GACE's tracking URL

While this doesn’t present major problems, it just isn’t very clean, and it’s inconvenient when someone wants to bookmark your link (as I do a lot in Evernote).

GACE does give you the best reporting in this comparison, because it allows you to compare based on different metrics. Even metrics you didn’t initially set up are shown, though a winner is determined by the metric you chose as your objective.

GACE's Report

The Content Experiments functionality is very well integrated into the whole GA suite, which offers you some benefits over the other tools. When it comes to the testing itself, it’s rather limited and more time-consuming to set up, and it requires you to be patient. But it gives you an easy way to start with A/B testing without spending any money. And the AdSense integration is very valuable if you run an ad-supported site.


Conclusion

In summary, even though these three products seem to offer the same functionality, the way they handle things is very different.

Optimizely gives you lots of possibilities if you don’t mind coding things yourself, while Visual Website Optimizer is more click-and-play. VWO comes with an extra tool for heatmap tracking, which is a nice feature but too limited. You can use both tools for both basic (changing elements) and advanced (filtering) experiments. But when it comes to reporting, there is still room for improvement. Documentation for both products is excellent, even if you just want to learn about the different concepts of A/B testing.

With both products, it still feels like I’m missing important information, though. Take visitor origin, for example. I can exclude certain segments of visitors, so I can tell where they are not coming from, but wouldn’t it be nice to see whether UK and US visitors respond the same way to a change, and see that in a chart? For now, that requires running two separate tests and using a separate analytics tool. Maybe I’m asking too much here, but I believe there’s a lot of potential in these tools, since they already know so much about your traffic.

Both offer a free 30-day trial, so I encourage you to run some tests yourself.

Google Analytics Content Experiments is fully integrated with the rest of the GA suite, but is rather limited in the way tests are performed. It also requires more manual actions to make the adjustments, and you need to be patient before seeing results. But once you’ve set up a nice set of goals or funnels, it’s a great (and free) tool to use.

If you’ve used any of these or know of another tool, please let us know in the comments.

  • Zach Edwards

    Great post, Jacco! Your experiences closely mirror mine, but based on the little I’ve seen, the new VWO that is being released this month is going to really blow away the competition: https://vwo.com/comingsoon/

  • Jacco, thanks for reviewing Visual Website Optimizer. Wanted to clarify a couple of things regarding the tool:

    – Regarding heatmaps you said: “Too bad you can’t compare both heatmaps in your A/B tests.” Actually, you can do that! If you select different versions from the dropdown in the heatmap window, you will see it’s available for all variations in a test. This enables you to compare versions visually (a nice addition to the hard data you get from reports).

    – Again regarding heatmaps, you wrote: “And they aren’t very advanced either, since the coloring doesn’t really tell you much”. Perhaps you missed it, but there’s an option in the interface called “clickmap” which will give you click statistics for different elements on the landing page. For example, you could see how many clicks a particular element got. In fact, Visual Website Optimizer has a unique functionality called consolidated heatmaps that allows you to consolidate clicks across pages and generate a combined heatmap for common website areas such as navigation, footer, etc. More information here: http://visualwebsiteoptimizer.com/split-testing-blog/consolidated-clickmaps-and-heatmaps-a-new-method-for-analyzing-visitor-activity/

    – “Let’s take visitor origin for example. I can exclude certain segments of visitors so I can tell where they are not coming from, but wouldn’t it be nice to see if UK and US visitors are responding the same way to a change, and see that in a chart?”

    This is a very relevant criticism of existing A/B testing tools, and something we fully realize. You will be happy to know that in April 2014 we’re launching the next generation of Visual Website Optimizer, called VWO, that will take A/B testing to a whole new level. While I can’t reveal specifics yet (since it’s not launched), I’m sure that when you try it, you will see that we’ve added quite a few features while still retaining the simplicity.

    If you want to check it out, we have a teaser page out already: https://vwo.com/comingsoon/

    Thanks again for a fairly detailed review.

    Paras Chopra
    Founder & CEO, Wingify, the makers of Visual Website Optimizer

  • crivion

    There is also a commercial plugin called “WP A/B Theme Conversion Testing”, available at http://codecanyon.net/item/wp-ab-theme-conversion-testing/7228726, which helps you test your themes in terms of lead generation.

  • Hi Paras,

    thank you for reaching out. After your comment, I logged in again to see if I missed something.

    What I meant with comparing heatmaps is that you can’t use them in an overlay, where you can see at a glance what’s different. Please correct me if I am wrong, but from your explanation and the dashboard it looks like this isn’t possible.

    On the topic of them not being very advanced: I have to mouse over each element myself to see if there are any clicks. The way CrazyEgg does this (with a colored +) is more convenient. I get what you are saying, but I still think it could be made clearer (since it is indeed useful then).

    I am not too big a fan of the consolidated heatmaps, since they don’t seem to work with my URL structure (no folders). I can only compare on the main domain, or specific pages. There is no shared part in the URL (like “blog” in your example).

    And I personally don’t see any benefit in combining non-related pages in an A/B testing tool. Or in any tool, since there are too many different factors to base conclusions on. That’s why I do see benefit in using heatmaps for A/B testing, since that is where you exclude most of these factors.

    Putting the heatmap functionality aside, you still have a great product (and lots of people will still enjoy the heatmaps being included). And looking at what you have coming in the new version, it will be even better. Especially if you can add in the mix-and-match filtering :-) I will give it a run after it launches.

    Thank you again for reaching out, and I hope you take my “criticism” as positively as it is intended, since I am merely looking at it from a (somewhat critical) customer point of view :-)


    • Yes, I’m always a fan of criticism! That’s how we improve.

      You’re right, you can’t (yet) overlay two heatmaps and compare what’s changed.

  • Agreed! I will keep an eye on it.

  • Hi, I can see its use when you have a new theme fully developed and want to try it out (like with the SitePoint redesign), because you can set limitations. Just testing random themes won’t work, I think, since each theme has its own customization (and it would take a ton of time customizing multiple themes). And you can only perform tests on conversion, not on engagement, for example. But I like the idea :-)

  • AG

    Thank you very much for this post! Does anyone have any experience with Maxymiser? http://www.maxymiser.com/us-home

  • Jude Aakjaer

    If you want to keep the A/B testing entirely on-site and are running a Rails application, the split gem is a really great alternative as well: https://github.com/andrew/split

    Of course, it means you need to be (or have access to) a dev to implement it, but it also means you can play around a lot more with exotic options, filtering based on IP, and different, not always obvious, success criteria.

  • Changeagain.me should also be added to this list – the easiest tool for A/B and split-testing.

    This service uses the Google Experiments API, so it’s fully integrated with your Google Analytics account and you don’t need to do any manual configuration. Thanks to this full integration, all statistics are gathered with maximum accuracy, so you won’t be misled by the results of your testing. You also don’t need to use many different services every day, because all the information is in one place.

    Changeagain.me also has an easy-to-use visual editor and a ‘democratic’ pricing model: you don’t pay for impressions of your test, only for the number of experiments and accounts.

  • Theresa

    Jacco, do you know of a HIPAA compliant heatmap software?