If you’ve been hanging around the 60 Second Marketer for long, you know we’re big fans of testing our way to success. By that, I mean we’re big fans of running A/B split tests so we can track the results of our campaigns, improve them, and generate better results.

Let’s kick this post off with a definition of an A/B split test so that we’re all on the same page. This one is offered up by our friends at Visual Website Optimizer:

A/B testing (sometimes called split testing) is comparing two versions of a web page to see which one performs better. You compare two web pages by showing the two variants (let’s call them A and B) to similar visitors at the same time. The one that gives a better conversion rate, wins!
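If it helps to see the mechanics spelled out, here’s a minimal Python sketch of what split-testing tools do behind the scenes: bucket each visitor into A or B, then compare conversion rates. The visitor IDs and tallies below are made up purely for illustration — this is a sketch of the idea, not the code Icegram or Visual Website Optimizer actually runs.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Bucket a visitor into A or B; hashing keeps the assignment stable per visitor."""
    digest = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16)
    return "A" if digest % 2 == 0 else "B"

def conversion_rate(conversions: int, impressions: int) -> float:
    """Sign-ups divided by the number of times the pop-up was shown."""
    return conversions / impressions if impressions else 0.0

# Hypothetical tallies, just to show the comparison step.
impressions = {"A": 1000, "B": 1000}
conversions = {"A": 40, "B": 25}

for variant in ("A", "B"):
    rate = conversion_rate(conversions[variant], impressions[variant])
    print(f"Version {variant}: {rate:.1%} conversion rate")
```

Whichever version converts a higher share of the visitors who see it is your winner.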


The beauty of digital marketing is that everything you do can be tracked, measured and optimized.

With that in mind, we decided to run an A/B split test on the exit message that pops up as new guests are leaving the site.

We wanted to test whether Version A performed better than Version B.

Here’s Version A:

[Screenshot: Version A pop-up]

You’ll notice that Version A was pretty straightforward — the offer was an e-book I wrote with descriptions of, and links to, the 101 top digital tools I’ve analyzed and reviewed over time.

(If you’re interested in receiving your own copy of the e-book, click here and we’ll email you a copy.)

As much as we liked Version A, we felt as though we had a better offer — one that highlighted all of the benefits of being a member of the 60 Second Marketer community.

This offer included things like a free chapter from one of my books, a mobile marketing research report, the 101 top digital tools e-book, as well as an invitation to join our MasterMind group.

With all that in mind, we came up with a second version of our pop-up message.

Here’s Version B:

[Screenshot: Version B pop-up]

As you can see, Version B not only offered more bells and whistles, but had a more colorful, engaging design.

Take a look at both versions above and formulate an opinion about which one you think would perform better.

By looking at the options above and formulating your own opinion, you’re doing what many marketers do when they create campaigns. That is, you’re making a judgment based on one data point — your opinion.

In some cases, you have to make a decision based just on your opinion, but for clients of 60 Second Communications, we encourage a more quantitative and scientific approach to marketing, so we run tests whenever possible.

So, now that you’ve had a moment to review the two versions of the pop-up above, which one do you think won the test?

I don’t know about you, but I thought that Version B would be the winner. After all, Version B provides four different benefits — the chapter, the research report, the e-book and the MasterMind group.

That’s four — count ’em four — different benefits versus just one benefit when you click on Version A.

Can you tell where I’m going with this?

I figured people were going to click the pop-up with four benefits more often than they would click the pop-up with just one benefit, right?

Four is better than one, you know what I mean?

To my great surprise, I was completely wrong about which version of the pop-up would win the test.

Instead of people clicking through on the offer with four benefits, they clicked through on the offer with just one benefit — the 101 digital tools e-book.

Huh?

Here’s how the data played out.

[Screenshot: A/B test results]

When you run an A/B split test, your goal is to run each version of the marketing message about 50% of the time. You can see in the data below that Version A ran 1,266 times and Version B ran 1,305 times — that’s basically a 50/50 split, which is good enough for our purposes.

[Screenshot: impression counts for Version A and Version B]

To my great surprise, Version A outperformed Version B by more than 2 to 1!

[Screenshot: conversion rates for Version A and Version B]
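If you want to sanity-check a result like this yourself, a two-proportion z-test is the standard tool. The sketch below plugs in the impression counts from our test (1,266 and 1,305) but uses placeholder conversion counts, since the exact click totals aren’t listed here; swap in your own numbers.

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal approximation
    return p_a, p_b, z, p_value

# Impressions match the test above; conversion counts are placeholders for illustration.
p_a, p_b, z, p = two_proportion_z_test(conv_a=50, n_a=1266, conv_b=24, n_b=1305)
print(f"Version A: {p_a:.2%}  Version B: {p_b:.2%}  z = {z:.2f}  p = {p:.4f}")
```

A p-value below 0.05 is the usual cutoff for calling the winner real rather than noise.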

It took me a long time to realize how important it is to make marketing decisions based on science rather than on guesswork. With the advent of digital marketing, testing your way to success is easier than ever.

Action Steps for You.

Here are some action steps you can take coming out of the learning experience we just had.

  1. Don’t Trust Your Assumptions: As tempting as it is to trust your gut instincts, put those instincts aside and test your campaigns. Quantitative research beats gut instincts every time.
  2. Use A/B Testing Software: We use Icegram for our pop-ups, which has A/B split testing built in. I also hear good things about Visual Website Optimizer, which you can use to test different versions of landing pages on your site.
  3. Keep Optimizing: You’ll be amazed at how much you can keep improving your campaigns by continuously optimizing them. Don’t stop at one test — keep testing for as long as you have the capabilities.

That’s all for now. I hope this was helpful. We certainly learned a lot running this test — hopefully, you did, too.

About the Author: Jamie Turner is the CEO of the 60 Second Marketer and 60 Second Communications, a marketing optimization firm that helps businesses improve the impact of their marketing by 10% or more. He is the co-author of “How to Make Money with Social Media” and “Go Mobile” and is a popular marketing speaker at events, trade shows and corporations around the globe.
