Many reasonable folks out there spend a lot of time (and frustration) searching for the ‘perfect’ call-to-action. The good news is that with A/B testing, you can choose two suitable one-liners, then let the crowd decide which is their favorite!

If you’re curious about the difference one line can make to the responsiveness of your emails, here’s a quick example. We launched a survey amongst a group of active customers to help us prioritize improvements to our app this year. As we were keen to receive as many responses as possible, we ran an A/B test using two versions of our survey email. Each version contained a different text link to the survey, so we determined the winner by which link received the most clicks.

Here are the two different emails we ran:

Version A

[Image: Version A survey email]

Version B

[Image: Version B survey email]

The difference seems fairly subtle, right? Not once you see the results:

[Image: A/B test results]

The winner (Version A) achieved an estimated 51% increase in clicks over Version B! This equated to around 2,230 more customers clicking through to our short survey. We certainly didn’t expect such a dramatic lift; however, it does go to show that a simple call-to-action (“Tell us what we can do better”) is often the most effective. So, if you’ve got two strong CTAs in mind, why not give both a shot and let your readers choose? You may be surprised by the difference one line can make.
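If you’d like to run the same arithmetic on your own campaigns, relative lift is simply the winner’s click rate divided by the loser’s, minus one. Here’s a minimal Python sketch; the counts used (84 vs. 61 clicks, 2,604 recipients per cell) are the in-test figures quoted in the comments below, so treat the output as illustrative:

    # Click rates and relative lift for a simple A/B split test.
    # The counts (84 vs. 61 clicks, 2,604 recipients per cell) are the
    # in-test figures quoted in the comment thread, used for illustration.
    def click_rate(clicks, recipients):
        """Fraction of recipients who clicked the call-to-action."""
        return clicks / recipients

    def relative_lift(rate_winner, rate_loser):
        """Percentage improvement of the winner over the loser."""
        return (rate_winner / rate_loser - 1) * 100

    rate_a = click_rate(84, 2604)  # ~3.2%
    rate_b = click_rate(61, 2604)  # ~2.3%
    print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  lift: {relative_lift(rate_a, rate_b):.1f}%")
    # -> A: 3.2%  B: 2.3%  lift: 37.7% within the test cells

Note that the in-test lift works out to around 37.7%; the 51% figure above is, as Tim Watson suggests in the comments, most likely an estimate over the full list once the winner was sent to the remaining subscribers.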

Do you have an A/B testing success story, or have you learnt something curious from your results? Let us know in the comments below!

  • Filip Salomonsson

    How do you calculate these estimations?

    What motivates an estimated 30% increase in opens if the only difference was in the body of the email? And how can 84/61 clicks (with a statistical significance of less than 95%, by my calculations) lead to an estimated 51% increase?

    I am definitely, definitely not a statistician, so I may be way off here. But the estimations don’t look right. Can you clarify?

    Statistics is hard – let’s go emailing!

  • Rex Dixon

    Would be great if you could come by our A/B Tests web site and share/upload this data and information.

  • Bart

    A little off-topic… The results are what I would expect. The first question limits feedback to CM’s features, while the latter is very broad and requires deeper thinking from the customer. Suppose someone enters a restaurant and the waiter asks “What would you like to eat?” versus “Tell us what you would like from the menu.” People would have a harder time coming up with ideas for meals themselves than picking something from a list.

  • Manfred

    Based on the total click rate, version A had a conversion rate of ~86%, and version B had a conversion rate of ~44%. By my measures, this results in 99.9% statistical confidence – i.e., pretty good!

  • Tim Watson

    Testing is vital, and thanks to Campaign Monitor for publishing this result.

    The result is statistically significant to 90%. From the figures reported, the test was conducted over 2,604 people in each of the A and B cells.

    This makes the A click rate 3.2% and the B click rate 2.3%. The difference, over a sample size of 2,604, gives 90% statistical significance.

    That’s a 37.7% improvement. The article puts the click improvement at 51% – I read this as the estimated improvement over the full list, with A being sent to the remainder of the list, rather than over just the initial 2,604-person test cells.

    The testing tells the what but not the why. If we want to learn from testing and be able to work out a better subject line next time too, then we need to work out why one did better than the other. Bart has hit this one in his comment. The ‘give us your ideas’ is asking for much more work from the customer, implying some original thought is needed. Whereas ‘what can we do better’ is asking someone to reflect on their experiences and things they would like to see improved.

    What would be interesting is the answers from the two different groups. The value of the answers to the question ‘give us your ideas’ may in fact be higher than ‘what can we do better’. Less could be more. Ultimately in testing you need to test against your ultimate objective, which won’t be clicks. Agreed that often clicks is the closest measurable item.

    There is a good whitepaper on testing here http://www.smartfocus.com/LinkClick.aspx?fileticket=0L7hgxyGQrA=&tabid=178&language=en-GB

  • Filip Salomonsson

    Tim: ah, if the estimations are based on the full list that definitely makes much more sense (though it would be sensible for CM to make the UI more clear about the uncertainty of those numbers). Thanks.

  • Eric

    Awesome, thanks for sharing the results!

  • Amanda

    I’m trying to figure out the stats as well. Is there a difference between clicks and specific clicks, and are the 84 and 61 figures percentages?
    Thank you.

  • Ros Hodgekiss

    Hi Amanda, specific clicks are clicks on the CTA only (i.e. ‘Tell us what we can do better’). This is separate from total clicks, as there were two links in this campaign. The 84 and 61 aren’t percentage figures.

    Wow, there’s been a big ol’ response to this post – the good news is that over the next few weeks, we’ll be adding posts to the blog on how we determine our A/B test statistics, as we made the mistake of not doing so this time around. So everyone hang tight – we’re really sorry we created a bit of confusion by simply throwing the ‘magic numbers’ out there, but we’ll be addressing all your questions in coming posts. Thanks, folks!

  • Nathaniel Stott

    An interesting comparison and test would be a text-only mailing.

  • Scott Petrovic

    I love these A/B comparison posts. There might be underlying variables that aren’t accounted for like some people mention – but I still find this helpful. Good job!

  • Rex Dixon

    Any way you can share some of this data with A/B Tests so people can see the importance of testing?

  • fabiano

    I really like this comparison…

  • Alex

    Very good!
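
For anyone who wants to check the significance figures discussed in the thread above (Filip’s “less than 95%” and Tim’s “90%”), a standard two-proportion z-test will do the job. Here’s a minimal Python sketch; the inputs (84 vs. 61 clicks, 2,604 recipients per cell) are the figures quoted in the comments, not published campaign data:

    # Two-proportion z-test for the difference between two click rates.
    # Inputs (84 vs. 61 clicks, 2,604 recipients per cell) are the figures
    # quoted in the comment thread, so the result is illustrative only.
    from math import sqrt, erf

    def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
        """Return (z, two-sided p-value) for the difference in click rates."""
        p_a, p_b = clicks_a / n_a, clicks_b / n_b
        pooled = (clicks_a + clicks_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        # Standard normal CDF: Phi(z) = (1 + erf(z / sqrt(2))) / 2
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p_value

    z, p = two_proportion_z_test(84, 2604, 61, 2604)
    print(f"z = {z:.2f}, two-sided p = {p:.3f}")
    # -> z = 1.94, p = 0.053: significant at the 90% level but just short
    #    of 95%, consistent with both Filip's and Tim's readings.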
