Google Analytics Content Experiments - A Guide To Creating A/B Tests

[Last Updated on November 2013]

In this article I discuss Content Experiments, a tool for creating A/B tests from inside Google Analytics. It has several advantages over the old Google Website Optimizer, especially if you are just starting your website testing journey: Content Experiments provides a quick way to test your main pages (landing pages, homepage, category pages) and requires very little code implementation.

Here is a quick overview of the most prominent features that will help marketers get up and running with testing:

  1. Only the original page needs the experiment script; the standard Google Analytics tracking code is used to measure goals and variations.
  2. Website goals defined in Google Analytics can be used as the experiment objective, including AdSense revenue.
  3. The Google Analytics segment builder can be used to segment results based on any segmentation criteria.
  4. Multi-armed bandit approach: yields results faster than classical testing, at lower cost, and with just as much statistical validity.
  5. Tests automatically expire after 3 months, which prevents leaving tests running when they are unlikely to produce a statistically significant winner.
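To give some intuition for point 4, here is a minimal simulation of the multi-armed-bandit idea: traffic gradually shifts toward the variation that is converting better, instead of staying at a fixed 50/50 split. The variation names and conversion rates below are invented, and this sketch is not Google's actual algorithm — just an illustration of the technique.

```javascript
// Deterministic pseudo-random generator so the simulation is reproducible.
function lcg(seed) {
  var s = seed;
  return function () {
    s = (s * 1664525 + 1013904223) % 4294967296;
    return s / 4294967296;
  };
}
var rand = lcg(42);

// Box-Muller transform: a standard normal sample from two uniforms.
function gaussian() {
  var u = Math.max(rand(), 1e-12), v = rand();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

// Thompson-style sampling: draw a plausible conversion rate for an arm
// from a normal approximation to its posterior.
function sampleRate(arm) {
  var n = arm.successes + arm.failures;
  var p = arm.successes / n;
  var sd = Math.sqrt(p * (1 - p) / n) || 0.1;
  return p + sd * gaussian();
}

var arms = [
  { name: 'Original',  trueRate: 0.05, successes: 1, failures: 1, pulls: 0 },
  { name: 'Variation', trueRate: 0.08, successes: 1, failures: 1, pulls: 0 }
];

for (var i = 0; i < 5000; i++) {
  // Send the visitor to the arm whose sampled rate is highest...
  var pick = sampleRate(arms[0]) > sampleRate(arms[1]) ? arms[0] : arms[1];
  pick.pulls++;
  // ...then observe a (simulated) conversion or non-conversion.
  if (rand() < pick.trueRate) { pick.successes++; } else { pick.failures++; }
}

// The better-converting variation ends up receiving most of the traffic.
console.log(arms[0].name, arms[0].pulls, arms[1].name, arms[1].pulls);
```

The practical upshot is that fewer visitors are "wasted" on a losing variation while the test is still running.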

Below is a step-by-step guide on how to use Content Experiments to create A/B tests.

Creating Content Experiments

In order to create a new experiment, navigate to the Behavior section and click the Experiments link on the sidebar. You will see a page listing all your existing experiments. Above this table you will find a Create experiment button. You will then be asked for the following information:

  1. Name for this experiment.
  2. Objective for this experiment: you can either choose an existing goal (including Ecommerce and AdSense metrics) or you can create a new goal.
  3. Percentage of traffic to experiment: the higher the percentage, the quicker you will get significant results.
  4. Email notification for important changes: highly recommended!
  5. Distribute traffic evenly across all variations: if you turn this on, you will not get the benefits of the multi-armed bandit approach mentioned above.
  6. Set a minimum time the experiment will run: defines a minimum period during which Google Analytics will not declare a winner. If your website has significant seasonality, or different behavior patterns on weekends and weekdays (for example), this is highly recommended; otherwise you might end up with a page optimized for only one of those segments.
  7. Set a confidence threshold: The higher the threshold, the more confident you can be in the result, but it also means the experiments will take longer to finish.
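To make the confidence threshold concrete, here is a rough sketch of the underlying statistics: a one-sided two-proportion z-test comparing a variation against the original. All the numbers are invented, and Content Experiments does this kind of math (and more, via the bandit engine) for you.

```javascript
// Standard normal CDF, via the Abramowitz & Stegun 7.1.26 polynomial
// approximation.
function normCdf(z) {
  var t = 1 / (1 + 0.2316419 * Math.abs(z));
  var d = 0.3989423 * Math.exp(-z * z / 2);
  var p = d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 +
          t * (-1.821256 + t * 1.330274))));
  return z > 0 ? 1 - p : p;
}

// Confidence that variation B truly converts better than variation A.
function confidence(convA, visitsA, convB, visitsB) {
  var pA = convA / visitsA, pB = convB / visitsB;
  var pooled = (convA + convB) / (visitsA + visitsB);
  var se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return normCdf((pB - pA) / se);
}

// 5% vs 7% conversion rate on 2,000 visits each:
var c = confidence(100, 2000, 140, 2000);
console.log((c * 100).toFixed(1) + '%'); // comfortably above a 95% threshold
```

A higher threshold simply means the gap between the lines has to be larger (or the sample bigger) before a winner is declared, which is why stricter thresholds make experiments run longer.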

Once you have defined all the information above, click Next and you will reach the following page.

Creating Content Experiment tests

On this page you add the URLs of your original page and of the variations you would like to test. You will see thumbnails of each page, which helps you make sure the URLs are correct.

Click Next.

Setting Up The Content Experiment Code

In this section you simply choose either to implement the necessary code yourself (in which case you should copy the code) or to send an email with the code to whoever will be implementing it.
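The generated snippet is specific to your experiment, so it is not reproduced here. As a point of reference, Google Analytics also offers an Experiments JavaScript API for choosing the variation client-side instead of redirecting; a rough, illustrative markup fragment is below (the experiment ID, element ID and headline texts are placeholders, not part of this guide's setup):

```html
<!-- Load the Experiments API for this experiment (ID is a placeholder) -->
<script src="//www.google-analytics.com/cx/api.js?experiment=EXPERIMENT_ID"></script>
<script>
  // Ask Google Analytics which variation this visitor should see
  // (0 is the original; 1, 2, ... are the variations).
  var chosen = cxApi.chooseVariation();

  // Swap content client-side instead of redirecting, e.g. a headline.
  var headlines = ['Sign up today', 'Start your free trial'];
  document.getElementById('headline').innerHTML = headlines[chosen];
</script>
```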

Click Next.

Validating And Confirming The Content Experiment Code

As mentioned above, you will need to implement a code snippet in order to use this tool. In this step your pages are verified; if the code is not found, you will see an error message.

Note that you can skip validation if you want: just click Start Experiment. If you do so, you will see a popup with the following message: "Experiment validation had errors or did not complete. Are you sure you want to start the experiment? If you are sure that your experiment is properly set up, you may continue." It is recommended, however, that you check the code to understand why you are getting an error and then try validating again.

Yay! You did it!

Content Experiments completed

Content Experiment Results

Once the Experiment is live, you will have the following options:

  • Conversion Rate: gives you the option to check the test results using alternative metrics.
  • Stop Experiment
  • Re-validate
  • Disable Variation
  • Segmentation: as mentioned above, this is an extremely valuable feature: it enables you to better understand how each variation performs for each segment of visitors on your website.

And below we see the results page of a test with a winning version, the Blue Sign-up (shown as the green line, yeah, that's confusing :-), with a lift of 52% in conversions compared to the original.

Content Experiments Winner

Reviewing All Experiments

Any time you want to review your experiments, just visit the Experiments page under the Behavior section of your reports.

What are you waiting for? Start testing!


Mark | June 2012

So no more element or multivariate testing?

Seems like another backwards move by Google.

Daniel Waisberg | June 2012

The message was that there will be no more multivariate testing for now. But I believe this is not final, time will tell...

Dennis vd Heijden | June 2012

Hi Daniel, I think it is a great move, not only because we launched and have MVT there; I think it also helps the whole industry. This is now accessible and easier than it was... even though it is simple split testing, it's great for the masses... agencies and experts... well, guess you have to look for a new tool.


Marian | June 2012

Hi, I have one question: if I have a dynamic website, how do I use Content Experiments?

Daniel Waisberg | July 2012

Marian, if you have dynamic pages I would recommend you go with this approach. It is a bit techy, but it looks robust.

Anonymous | July 2012

I am sorry Daniel, I have had my prime developer look at what you call a solution with virtual pageviews and it's not clear... bottom line, for now it's not useful for dynamically generated sites.

Daniel Waisberg | July 2012

You are right, even though there are hacks that can work, Google Analytics Content Experiments is currently not a good solution for dynamic pages. I hope this will change...

Rick B. | May 2014

In case somebody finds this post 2 years later [ :-) ], Google Content Experiments DOES support dynamic content. You WILL need some kind of development expertise, though. See this link:

Ophir Prusak | June 2012

Hi Daniel,

Thanks for the write up (saved me some time :)

Can you elaborate on:
"Conversion Rate: gives you the option to check the test results using alternative metrics."
Does this mean that I can change the metric I want to use as my conversion goal dynamically?


Daniel Waisberg | July 2012

Ophir, the answer is no. You will not be able to change the goal. To be honest, as of now you can't even define conversions that are not either a page goal or an event goal. What you can do is analyze page conversions by segment of traffic (using Advanced Segments).

Thomas Harvigsen | July 2012

Great post, thanks..

I have found this new tool to be close to useless because of two big fails.

Test 1
It fails because my URL parameters get removed. I have to use a URL that ends with ?opendocument, and Content Experiments removes this when adding the utm_expid parameter... This was no problem in Google Website Optimizer.

Test 2
Visits are distributed very unevenly

Page A 837 - Page B 1305 - Page C 252

Very strange why this is happening; with conversion rates at around 2-4%, it will take a while before version C in particular gets a chance to prove itself.

If anyone has any input on these two problems, help will be greatly appreciated :)

Daniel Waisberg | July 2012

Thomas, here is an answer to your first question from the good guys at Conversion Works:

Simple A/B with query string parameters
This experiment is slightly more fruity. Rather than redirect to a new page, I redirect to the same page with a decorated URI. I’m using javascript to handle the content changes (look at the menu – ‘How it Works’ vs. ‘Our Process’ – a very contrived test). This poses less SEO risk I guess but adds page complexity. This is almost like a single variation MVT. The content switch could be handled by server side code too and results in quite major test variation differences. Powerful but (again) limited in variations. Notice that the CE script has been modified to do the redirect conditionally to prevent infinite redirections:
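The modified snippet itself is not preserved in the quote above, so here is a minimal sketch of the idea as a pure function: decorate the current URL with the chosen variation, and skip the redirect when the page has already been decorated, which prevents an infinite redirect loop. The utm_expid parameter name is the one Content Experiments itself appends; the experiment ID format is illustrative.

```javascript
// Returns the decorated URL for a variation, or null when the page is
// already decorated (i.e. no further redirect should happen).
function variationUrl(href, experimentId, variation) {
  if (href.indexOf('utm_expid=') !== -1) {
    return null; // already decorated: do not redirect again
  }
  var sep = href.indexOf('?') === -1 ? '?' : '&';
  return href + sep + 'utm_expid=' + experimentId + '.' + variation;
}

// In the browser you would then do something like:
//   var target = variationUrl(location.href, '12345-6', chosen);
//   if (target) { location.replace(target); }
```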


Unfortunately, I have no good answers to your issue #2.

Thomas Harvigsen | July 2012

Thank you Daniel, it seems these guys are on to something. However, I have just tried to implement it; unfortunately Content Experiments won't accept the modified script. [It says: Possibly broken experiment code found here: line:5, column:1. We couldn't find valid experiment code, but we found something that looks like broken experiment code] Line 5 seems good and is the same as in the original script.

Content Experiments working or not, I am looking for a more advanced testing tool (A/B + multivariate). Any recommendations?

Daniel Waisberg | July 2012

Thomas, I would suggest you post this error on the original post (the one I shared); maybe they can help you troubleshoot, or maybe they will just suggest you skip validation...

As for other tools, I am currently using Visual Website Optimizer for Multivariate Testing. The pricing is fine and the tool seems robust.

Anonymous | July 2012

Does Content Experiments require asynchronous GATC?

Daniel Waisberg | July 2012


Mike Zipursky | July 2012

Having switched over from GWO to Content Experiments within Analytics we're finding the data is lacking accuracy.

Running an A/B test where visitors are purchasing a product. The conversion is when the user lands on the thank you page to download their purchase.

Content Experiments seems to be tracking not unique visitors to this page, but rather each time a user visits the thank you page (if there is some time between the visits).

This means that the data is (sometimes) showing 2 conversions even when only 1 occurred. This happens when a user goes to the thank you page after purchase. Leaves the page and then comes back to it hours or even a day later to download the product they purchased.

Before, with GWO, only 1 conversion would be tracked. But now sometimes 2 are showing, so the whole experiment isn't accurate with Content Experiments.

Has anyone else had this issue?

Any solutions or suggestions?


Rishad | December 2012

Hi Mike,

I am an amateur in web analytics and just launched my first Content Experiment (that's how I came to be reading this post).

Coming back to your problem, I think the issue is with what you are tracking. Instead of a URL goal on the thank-you page, in order to get more valid and reliable data, you might have to set up event tracking for the download button. That way, no matter how many times the customer visits the page, only one conversion will be recorded when he clicks the download button.

And, by the by, Great post. Very useful.
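With the classic (ga.js) tracker used throughout this thread, the download-click event Rishad describes might be wired up roughly like this; the category, action and label names are invented placeholders:

```javascript
var _gaq = _gaq || [];

// Push an event when the visitor clicks the download button. A goal
// based on this event converts at most once per session, regardless of
// how many times the thank-you page itself is reloaded.
function trackDownload() {
  _gaq.push(['_trackEvent', 'Purchase', 'Download', 'product-x']);
}
```

The function would then be attached to the button, e.g. `<a href="/files/product-x.zip" onclick="trackDownload()">Download</a>`.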

Marius Pop | September 2012

Hello there,

I'm having trouble with my goals, which are not being tracked anymore since launching the experiment. Everything seems to be fine in terms of the code setup, but since I launched the test, it simply doesn't track the goal I selected as a conversion.

Has anyone had the same issue?
Suggestion would be highly appreciated:)

Daniel Waisberg | September 2012

Hi Marius. Are there multiple subdomains on your website? That might be one of the causes...

Marius Pop | September 2012

Yes, we do have multiple subdomains, but we currently only use one or two of them.
Any suggestions starting from here?

Daniel Waisberg | September 2012

You will need to update your Content Experiments code on the original page. Here is an explanation from the Google Analytics help center: Run an Experiment Across Subdomains.
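In outline, the help-center fix is to declare the shared cookie domain before the experiment code and mirror it in the tracking code. A rough markup sketch, where the domain name and account ID are placeholders:

```html
<script>
  // Declare the cookie domain BEFORE the Content Experiments code.
  var _udn = 'example.com';
</script>
<!-- Content Experiments code goes here, immediately after -->
<script>
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-X']);
  _gaq.push(['_setDomainName', 'example.com']); // same domain as _udn
  _gaq.push(['_trackPageview']);
</script>
```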

Maria | September 2012

I tried to set up a test for two different websites already. I got this error each time:

Experiment code missing the cookie domain name declared in tracking code: line:297, column:14. Your page customizes the cookie domain name in the Google Analytics tracking code. The same customization should be present in the experiment code.

Any ideas why the referred customization doesn't appear in the initial code?

Daniel Waisberg | September 2012

Maria, it looks like you have the same issue as Marius Pop, an issue with tracking multiple subdomains on Google Analytics. Please check the link above in my reply to Marius. Let me know if it does not help.

Maria | September 2012

Partly yes. On one of the websites there are no subdomains. We managed to start the experiments but still no conversions shown.

On the site with subdomains we started the experiment, but only a tiny fraction of the conversions are being tracked. Any more ideas?

Hamish Blackall | September 2012

Maria, did you ever get it to work? I use many subdomains and don't know which one is coming in the referring URL. I set the root-level domain in the tag and set _udn before the experiment code, as Daniel suggested.
Also, because the ecommerce happens on a different root domain, I have the root domain tags set with _setDomainName none and _setAllowLinker true, and have links between root domains with the appropriate onclick events to attach the cookie data.
With the reporting delays it's so hard to tell if anything works, but it doesn't seem like it. Pity Visual Website Optimizer is so expensive for heavy traffic.

Anonymous | September 2012

I was running my experiments through CE. I have doubts about a few things.
a) Why is there a 2-week minimum (recommended) time before we see a clear winner?
b) Isn't the winner simply the variation with the highest probability of converting, higher than that of the original version? Isn't that just a simple frequency rule, i.e. if 4 out of 5 conversions are higher than the original, then we choose that variation? It's not such a magic metric; anyone with business acumen can decide the winner. And again, it has the flaw that it doesn't give more weight to the most recent conversions.
c) How does CE sample the visitors it diverts to each variation of the page?


chris | September 2012

I'm stumped. I'm stuck at Step 4: Review experiment where it shows the two thumbnails. The variation shows correctly but the ORIGINAL is blank. The link below the image works to take me to the original page. These are not pages that require logging in. And the previous code validation worked correctly. I've refreshed and gone back and forth between experiment steps. I'm stumped.

Angela | October 2012

Hi Daniel, thanks for the great post. I'm trying to get this working on a wordpress site, but it keeps giving me a validation error for the original page:

Experiment code found after the Google Analytics tracking code: line:36, column:69, line:91, column:1.
Make sure the experiment code is immediately after the opening <head> tag, so that it is before the Google Analytics tracking code.

I don't know if it's finding the re-marketing code and calling it broken? Any suggestions?

Lenin | October 2012

Hi Daniel, I'm with the rest: this is a great article, and thank you for your input about Content Experiments. This is the first time I'm running an experiment and I'm not sure if I'm doing something wrong or if I just have to wait. In my case I turned the experiment on with my original page and two variations on a subdomain (I do have my Analytics code set up for the subdomain sites). On the first day everything looked good, with rotation and visits flowing to all variations, but the next day I saw no rotation; the experiment seems to work, but the majority of my traffic goes to the original variation. I'm going crazy trying to figure out what went south. I can say that 90% goes to my original variation and 5% to each of the other two. I selected to have 100% of traffic included in the experiment, and I can't find a solution or an explanation of what could be happening. Do you or anyone else have any idea of what might be going on?
Thank you in advance and like I said great article.

Dan | November 2012

No matter what I do, I can't get Google to validate my code. What's strange is that it says there's code (it validates the code, that is) on the variation page, though I placed no code there. For me this has been a complete and total bust: I have not been able to get any experiments to run. I'll be submitting a support ticket via my AdWords account, but this has been a huge step backward.

Mr Paul | November 2012

Wonderful guidelines, Dan, but I have got a little issue. This is the first time I'm running an experiment and I'm not sure if I'm doing something wrong: on my first day the rotation was literally okay, but on the second day it just wasn't rotating anymore. Thanks for your reply.

Sandeep | December 2012

Hi Daniel, how can I add the Content Experiments code to my Google Sites site? Google suggests putting the experiment code immediately after the opening <head> tag, but Google Sites doesn't allow editing that part of the site.

Heather | January 2013

Do you know if there is a way to make the minimum experiment time longer than 2 weeks? I have a client who would like the experiment to run 3 months.

Kathy | January 2013


Very nice article, thanks for sharing.

I want to know, is there any way to do this experiment for large dynamic sites?

For example, I want to experiment on article pages.

What I want is /article_1.html should redirect to /new/article_1.html

and similarly /article_2.html should redirect to /new/article_2.html

and so on.

Is there anyway to do this?

Thanks in advance.
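Content Experiments itself only redirects between the fixed URLs you enter, so a whole-template pattern like this needs a hand-rolled variation chooser. A hypothetical client-side sketch of the mapping described above (paths as in the comment):

```javascript
// Map an original article path to its candidate variation path;
// returns null for pages that are not part of the test.
function variantPath(path) {
  var m = path.match(/^\/(article_\d+\.html)$/);
  return m ? '/new/' + m[1] : null;
}

// In the browser, one could then redirect a share of visitors:
//   var target = variantPath(location.pathname);
//   if (target && Math.random() < 0.5) { location.replace(target); }
// Note: with a DIY split like this you lose the bandit's automatic
// traffic allocation, so results would have to be compared manually.
```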

Jennifer | March 2014

I would like to know the answer to that too!! But in my case it's a little different:
for example /article12.htm to ... /article12.asp

Anyone knows the answer?

Kathy J. Lowrey | January 2013


Very nice article. It helped me a lot in understanding Content Experiments.

But I have a question.

My goal type is currently URL Destination.
We push conversions manually to Analytics after phone-call matching; usually it takes one day.

We use this code for pushing conversions.

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'XX-XXXXXXXX-X']);
_gaq.push(['_setAllowAnchor', true]);
_gaq.push(['_trackPageview', '/call/conv']);

What I want to know is how do I push a conversion manually for a specific variation.

Thanks in advance for help.

Josh Fialkoff | February 2013

Nice post! Do you have any recommendations on techniques for determining which pages (and page elements) to test?

Dancho | July 2013

I am a bit new to A/B testing, so I have a question about how conversions are calculated in Content Experiments. I completed an A/B test where I had 2 versions of a page (original and challenger).

Let's say the report shows I have 10,000 visits and 100 bookings. Does this mean that the conversions were made in the same session in which the experiment page was visited?

For instance, Visitor A visits the XXX experiment page, then goes back to the homepage, searches for a trip in the booking guide and converts = one conversion registered.
Visitor B visits the XXX experiment page, then closes the browser, visits the site again a couple of hours later (but does not visit the XXX experiment page), enters the booking guide and converts = no conversion registered.

Is this correct?

presta | July 2013

Thanks for the detailed post. Can I have 4 variation pages running at the same time? Also, the default experiment time says 2 weeks; is it OK to run for 3 days on a site that has a low number of visitors per day?

Daniel Waisberg | August 2013

Presta, you can have 4 variations running at the same time, but you will have to wait for two weeks to start getting your results. Basically the more often that people visit a page or complete a goal, the less time it takes to gather data, but two weeks is the minimum.

Ferris | August 2013

I have one question regarding the variation page in terms of SEO. Apparently you don't want your variation page to be indexed, right?
So did you also add NOINDEX, NOFOLLOW to the variation page?

Thank you in advance!

Daniel Waisberg | August 2013

Thank you Ferris. Here is the official response to that, I hope it helps:

Sandeep | September 2013

I am currently learning how Content Experiments works. I am running a sample test for a SharePoint web application with one variation, using a destination goal.
SharePoint pages are built from web parts, and web parts have no <head> tag, so I am wondering where I should put the experiment code. Can anyone help me with this, please?

Anonymous | September 2013

Is it possible to use GA Experiments if the website uses a GTM container and the GA tag is defined in the container? The concern is that GTM tags load asynchronously, and including A/B testing tags in GTM would produce an undesired user experience and undesired GA tracking.
Is there a workaround for this? How can one leverage GA Experiments while still using a GTM container on the pages for tagging purposes? Would creating a rule in GTM to exclude a certain page (the page we want to test) from firing the GTM code, and instead including the GA code inline on the control page, work? And if so, is this the only option at this point?

Dendy | September 2013

We have created 2 (two) different homepages for our e-commerce site: the original, and variation 1, on which we created some different links with different banners.

But when the A/B test is running and our customers click any link in variation 1, all links redirect to the original homepage.

All the links in variation 1 are correct. I don't know why this happens.

Adriano | November 2013

Hi, Daniel

You mention :
Tests results will not appear for at least 2 weeks, a mechanism to encourage statistical significance.

Does this mean that I will see no conversions on the experiments? Because I see conversions on the goals (when I look at the goal itself), but I don't see them on the experiment (that uses the goal).
I read in the Google forum that a lot of people are having the same issue.

Pedro Pereira | November 2013

Awesome post - This is exactly what we need at the moment!


jakob | December 2013

Hi Daniel,

Thanks for the post, it was a great help!
I'm quite new to GA and was wondering if there is any way to run content experiments on only one segment of users, for example only US traffic, or only new users, and so forth...

Thanks a lot!

Minyak Ikan | January 2014

Hi Dan, thanks for posting this article. I had no idea that GA could be so useful for testing. I'd like to implement this knowledge. Thank you.

Jake | January 2014

Dan, thanks for the article. Two questions... I have 2 funnels I want to test against each other. One has GA fully implemented, the other does not. These are 2 different store sites selling the same products, so we want the final order to be the primary KPI of the test. But this means we need the same step (order submit / confirmation page load) to be represented as the goal on 2 different pages (because each store's order confirmation page is different, since they are 2 different store sites).

Any suggestions on how to pull this off with CE on GA? Can we use regular expressions to check if the same string exists on the order pages? Can you have 2 different events be rolled up into a single goal?
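If both stores send data to the same Google Analytics property, one approach worth sketching is a single destination goal with the "Regular expression" match type that matches either confirmation URL. The URL patterns below are invented placeholders, not Jake's actual pages:

```javascript
// One regex goal pattern that matches the order-confirmation page of
// either store (hypothetical paths):
var goalPattern = /\/(store-a|store-b)\/order-confirmation/;

console.log(goalPattern.test('/store-a/order-confirmation?id=123')); // true
console.log(goalPattern.test('/store-b/order-confirmation'));        // true
console.log(goalPattern.test('/store-a/cart'));                      // false
```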


Barak D | March 2014

I wish to test a few landing pages, but the traffic arrives from different campaigns, and each one of them gets ?cid=### with a unique campaign id (cid). Will the experiment hurt my measurement? Will the cid be lost?

Nishadha | December 2014

I'm also running a test in GA. However, the problem is that the total goal conversions (shown in the goals) and the goal conversions in the experiment are way off. This isn't an issue with the number of sessions, because the total number of sessions in the experiment (e.g. version 1: 750, version 2: 730) is very similar to the starting step of the goal (1500). Is there a valid reason for this?

Online Behavior © 2012