The Power of A/B Testing in Marketing

If A/B testing is old news to you and you already implement it across your digital media campaigns, then great work! But keep reading, because we are certain we can still offer you some useful tips.

If you have never encountered the term ‘A/B Testing’ before, then let us introduce you to a whole new world of exciting possibilities.

A/B What?

So you have lots of digital media related to your institution and courses: ad creatives, web pages, email templates and content. These are often adapted and evolved over time, but how do you know if your shiny new web page design performs better than your existing one? How do you know if your AdWords text ad is performing as well as it possibly can? Could you convert more of your website visitors into applications or enquiries? As you’ve probably guessed by now, A/B testing holds the answer.

A/B testing is pretty much exactly what the name implies: pitting Version A against Version B to determine which performs better, kind of like boxing! Imagine you have recently redesigned the course pages on your website. An ‘extreme experiment’ would be to pit your existing course page design (A) against your new design (B). But how do you decide which one is better? The answer: goals.

Goals

To determine whether Version A is better than Version B you must identify a clear goal to test for, so it is vital that you decide what the goal of the creative or content should be. For a course information page on your website the goal may be to generate an enquiry, meaning Version A goes head to head with Version B to see which produces the most enquiries.

Once you have determined the goal, you can continue to evolve the creative or content to further improve the conversion rate.
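
To make the mechanics concrete, here is a minimal Python sketch of what a testing platform does for you behind the scenes: bucket each visitor into one of two versions at random, then count goal completions per version. The names and structure here are purely illustrative, not part of any real platform.

    import random

    # Hypothetical in-memory counters for one experiment. In practice your
    # testing platform stores and reports these for you.
    visits = {"A": 0, "B": 0}       # visitors shown each version
    enquiries = {"A": 0, "B": 0}    # goal completions per version

    def assign_version():
        """Randomly bucket a visitor into version A or B (50/50 split)."""
        return random.choice(["A", "B"])

    def record_visit(version, made_enquiry):
        """Log a visit and, if the goal was met, a conversion."""
        visits[version] += 1
        if made_enquiry:
            enquiries[version] += 1

    def conversion_rate(version):
        """Goal completions divided by visits for one version."""
        return enquiries[version] / visits[version] if visits[version] else 0.0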

“What did you mean by an ‘extreme experiment’?”

A/B tests are often referred to as experiments, and earlier we described testing a completely new design against the old one as ‘extreme’. So why is it extreme?

One of the benefits of A/B testing is that it allows you to test incremental changes. For example, you can change the wording or position of a call to action (CTA), and the success (or failure) of this change can be read directly from the A/B test results. Testing small incremental changes provides you with empirical data on what works and what doesn’t. Ultimately, A/B testing allows you to optimise for goal conversions, which means more enquiries and more applications.

Results from previous A/B tests should be used to inform any new design decisions. If you have no prior data and you are testing a completely new design against the existing one, then you are not actually testing any one element of the page but the page as a whole, which is why we consider it an ‘extreme’ approach to A/B testing. In that case you should at least confirm that your new version performs as well as the existing one; otherwise you could roll out a version that significantly reduces your conversion rate. But remember: continue to test and optimise, as there is always room for further improvement.

A Real Example

At DEM we are always working to improve the conversion rate of our products; higher conversion rates equate to more quality traffic and leads for you.

We recently ran an ‘extreme experiment’ of our own, pitting a completely new subject directory design against our existing one. So why did we do this rather than incrementally improving the page by A/B testing smaller changes?

We identified that the previous design was starting to look a little tired, and that the useful content we offer our visitors was presented in a way that made it difficult to read, especially given the growing number of mobile users visiting our site. So we put our heads together and came up with a plan to start afresh with a new responsive design that we could continue to optimise over the coming months.

As previously discussed, we didn’t want to launch a design that converted significantly worse than the previous one, so we began A/B testing. On just one of our subject directories (Engineering) we presented 50% of visitors with the existing version and 50% with the new design.

Here’s what happened…

The Results

The results are in, and the news is good. Our A/B test ran for a total of 10 days and included 1,281 visits. The results were close, with the original version converting visits to a search at a rate of 36.89% against 37.83% for the new version. This may look like a small improvement, but when you consider the amount of traffic this page attracts in a year, the number of additional searches is significant.

[Image: A/B test results for the Engineering subject directory]
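
If you want to sanity-check a result like this yourself, rather than relying solely on your testing platform’s report, the standard tool is a two-proportion z-test. Below is a hedged Python sketch of that test; it is not the calculation Google Analytics performs internally, which uses its own statistical model.

    from math import erf, sqrt

    def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
        """Two-sided p-value for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)             # pooled rate
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se                                 # test statistic
        return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # normal CDF tail

    # e.g. roughly 640 visits per version, converting at 36.89% and 37.83%:
    # two_proportion_p_value(236, 640, 242, 640)

A low p-value (conventionally below 0.05) suggests the difference is unlikely to be down to chance alone.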

Not only that, but the A/B test reassures us that we are not rolling out a design that converts worse than the previous version, and it gives us a responsive design we can optimise through future A/B experiments to push that conversion rate even higher. Don’t forget: there is always room for further improvement.

Multivariate Tests

Although this type of conversion optimisation is often referred to as A/B testing, it is possible to test multiple variations against the existing version in a single experiment; this is known as multivariate (or A/B/n) testing. It means that if you were testing the position of a call to action, you could trial 2, 3, 4 or even 5 different positions in the same experiment.
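
Mechanically, going from two versions to many is a small step. One detail worth knowing: testing platforms usually make the assignment ‘sticky’, so a returning visitor always sees the same variation. A common way to achieve that is to hash a stable visitor ID; the Python sketch below is illustrative, with made-up variant names.

    import hashlib

    VARIANTS = ["original", "cta_top", "cta_middle", "cta_bottom", "cta_sidebar"]

    def assign_variant(visitor_id, variants=VARIANTS):
        """Deterministically bucket a visitor into one of N variants.

        Hashing a stable ID (e.g. a cookie value) gives an even split across
        any number of variants, and the same visitor always lands in the
        same bucket on every visit.
        """
        digest = hashlib.md5(visitor_id.encode()).hexdigest()
        return variants[int(digest, 16) % len(variants)]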

How do I create an A/B test?

We use Google Analytics to perform our website experiments. This can be found in the Behaviour > Experiments section of Google Analytics. You can read more on Google’s Experiments documentation pages.

Typically you will create your variation at another URL and inform your testing platform (Google Analytics in our case) of the original URL and the URL of the variation to test. Follow the instructions and you’re all set.

Remember that you need to allow the experiment a sufficient length of time to run. Google Analytics suggests a minimum of 2 weeks; however, your experiment may require longer if you will not receive enough traffic or impressions in that time to produce a conclusive result.
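
How long is ‘long enough’ depends on your traffic and on how small an improvement you want to be able to detect. You can estimate the sample size you need up front; the sketch below uses the standard two-proportion power calculation with fixed z-scores (roughly 95% confidence and 80% power), so treat it as back-of-the-envelope planning rather than precise statistics.

    from math import ceil

    def visits_needed_per_variant(baseline, lift):
        """Approximate visits per variant to detect an absolute `lift`
        over `baseline` at ~95% confidence and ~80% power."""
        z_alpha, z_beta = 1.96, 0.84          # fixed z-scores for 95% / 80%
        p1, p2 = baseline, baseline + lift
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return ceil(((z_alpha + z_beta) ** 2) * variance / lift ** 2)

    # e.g. to detect an improvement from a 37% to a 40% conversion rate:
    # visits_needed_per_variant(0.37, 0.03)  -> roughly 4,100 visits per variant

Divide that figure by your page’s typical daily traffic to estimate how many days the experiment should run.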

What do I do with the Results?

When you have the results from your A/B tests you will be better informed about the performance of your new design, creative or content than ever before. Use this data to determine what does and doesn’t work, share it with your colleagues, and preach the importance of A/B testing. It should now be apparent that continuous conversion optimisation can improve your ROI significantly.

How DEM can help you A/B test creatives

We like to help you get the most out of your activity with us. That’s why we offer the option of A/B testing to all of our clients.

If you have multiple versions of a text ad creative and want to determine which one results in the most clicks, feel free to send us both and we will A/B test them for you. We can work with you to A/B test different text ad titles and body copy that you can then use as part of your AdWords activity, and likewise for image-based ads. Just get in touch with your account manager.

Remember Our Top Tips

  • Have a well-defined goal for your A/B test.
  • Test your calls to action (CTAs): wording, positioning, colours.
  • Test every stage of your conversion funnel.
  • Give sufficient time to the experiment.
  • Test incremental changes.

If you have any questions or would like to know more about A/B testing feel free to email us at contact@demltd.com.

Happy optimising!

– The DEM Team