Do you A/B test your email campaigns?

If so, you’ve probably had some good gains in the past. We certainly have.

But you’ll also know that not every test you run produces a big increase in opens or click-throughs.

And that’s ok, because even if there isn’t a significant increase in conversions, you still learn something that will make your email marketing better in the future.

In this post, we’ll share with you the story of how we tested taking a more personal approach in our email campaigns and got a 0% increase in conversions, and why we still considered it a win.

The inspiration and hypothesis for the test

Whenever we write and publish a post on the Campaign Monitor blog, we send an email to our blog subscribers letting them know a new post is published.

The email looks a bit like this:


Over time we’ve done a lot of A/B testing on this email (including the template, copy, buttons, and more), so it’s well-optimized for conversions.

However, recently we’ve noticed a trend amongst other bloggers and content marketers towards taking a more personal approach to their blog notification emails.

This email campaign from Groove is a great example:


While we send a summary of the post with a big call to action button, Groove CEO Alex Turnbull sends a personal email to his subscribers telling them a new post is out and he’d love them to check it out.

Groove’s email contains very little information about the post and instead focuses on creating a more personal dialogue between Alex and the subscribers.

This got us thinking: were we taking the wrong approach to our email? Were we missing out on a significant number of conversions by continuing to use the ‘summary’ approach to our blog notification email instead of the more personal approach?

The thing is, even when you’re doing a lot of A/B testing and optimizing your campaigns, it’s easy to hit a local maximum.


If you’re continually making small changes to your campaigns (like button copy, colors, etc.) but still using the same template and methodology, you’ll likely reach a point where you’ve gotten the best results possible from that approach (the local maximum).

But there is always a possibility that by taking a radically different approach, you can find yourself on another path that could lead to results much greater than the previous one (the true maximum).
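If you like to think in code, here’s a toy sketch of that idea. Everything in it is made up for illustration (the conversion-rate curve, the starting points, the step size); it simply shows how greedy, small-step tweaking climbs whichever peak it starts near, while starting from a radically different approach can land on a higher one:

```python
import math

def conversion_rate(x):
    """Hypothetical conversion-rate curve with two peaks:
    a local maximum near x = 2 and a higher, true maximum near x = 8."""
    return 3.0 * math.exp(-(x - 2) ** 2) + 5.0 * math.exp(-(x - 8) ** 2)

def hill_climb(x, step=0.1, iterations=200):
    """Greedy optimization: keep a small tweak only if it improves the result."""
    for _ in range(iterations):
        for candidate in (x + step, x - step):
            if conversion_rate(candidate) > conversion_rate(x):
                x = candidate
    return x

small_tweaks = hill_climb(x=1.0)    # iterating on the existing approach
radical_change = hill_climb(x=7.0)  # starting over with a very different approach

print(f"Small tweaks settle at a rate of {conversion_rate(small_tweaks):.2f}%")
print(f"A radical change settles at a rate of {conversion_rate(radical_change):.2f}%")
```

The first starting point gets stuck on the smaller peak no matter how many small tweaks it makes; only the very different starting point reaches the higher one.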

Afraid that we might have optimized ourselves to a local maximum, we decided to test the personal approach against our summary approach to see if it would help us find a new maximum.

The two emails we tested

After publishing our recent post on the 5 elements of an effective marketing offer email, we created two different versions of the notification email to our blog subscribers.

The summary approach

For version A of the campaign, we crafted the email in the same way we always do. We used the title of the post as the main heading, included a summary of the post as the body copy and added a benefit-focused call to action button that links through to the post.

It looked a bit like this:


The personal approach

For the B version of the campaign, we adopted Groove’s method of writing emails and crafted a personal email to our subscribers. We spoke to them in a friendly tone, telling them about the new post we’d just published and asking them to click through and have a read.

It looked a bit like this:


The result

As mentioned earlier, the two versions of the campaign performed almost identically, and there was no noticeable uplift from the personal approach compared to our normal summary approach.
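As an aside, if you’re wondering how to tell whether a small difference between two variants is real or just noise, one common check is a two-proportion z-test on each variant’s sends and clicks. The sketch below uses only Python’s standard library, and the numbers in it are hypothetical placeholders rather than our actual campaign figures:

```python
import math

def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-sided z-test for a difference between two click-through rates."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled click-through rate under the null hypothesis of no difference
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical numbers for illustration only, not our actual campaign results
p_a, p_b, z, p_value = two_proportion_z_test(
    clicks_a=480, sends_a=10_000,   # version A: summary approach
    clicks_b=495, sends_b=10_000,   # version B: personal approach
)
print(f"CTR A: {p_a:.2%}, CTR B: {p_b:.2%}, z = {z:.2f}, p = {p_value:.3f}")
# A p-value well above 0.05 means the observed difference could easily be
# random noise, i.e. no clear winner.
```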

Why we still considered this test a win

Given that there was no increase in click-throughs from this test, you’re probably wondering why we’re taking the time to tell you about it.

The thing with A/B testing is that not every test you run will have a clear winner. Sometimes the version you were testing will lose, and other times there will be no change whatsoever.

But what you do get from it is insight into what works for your unique audience.

We learned that even though the personal approach works for Groove (and many others), our audience is completely different to theirs and obviously responds in a different way.

And with the knowledge that the personal approach doesn’t work any better for our audience, we can continue using our existing summary approach to our blog notification emails, confident that we’re not stuck at a local maximum below what the personal approach could deliver and that we’re getting the best results possible from our campaigns.

And for us, gaining that knowledge is a win.

In conclusion

Even though we didn’t get a noticeable increase in conversions from this A/B test, we learned that the summary approach to our blog notification email isn’t necessarily the wrong approach, and that knowledge helps us run more educated, more successful A/B tests in the future.

So when running A/B tests on your email campaigns in the future, don’t be afraid to occasionally test something a bit more radical than just your subject lines or button copy.

By testing completely different templates or approaches, you could not only get an instant increase in conversions but also find yourself on a new path that leads to a sustained increase in conversions over time.

Your turn: What are some of the different A/B tests that you’ve run on your campaigns? And what insights have they given you about your audience? Share your story with our readers in the comments below!


  • Thomas

    I have run a huge number of A/B tests this year. Here are a few tests I’ve run: long vs. short email, GIF vs. no GIF, the ‘from’ email address, and most recently product descriptions vs. product reviews. I love split testing because you can never learn everything about your audience no matter how many tests you run; there’s always something more to learn. I suppose that’s why I love doing what I do :)

  • Eyal B

    I’ve run two significant A/B tests: 1. An email with or without preheader text. 2. Responsive vs. non-responsive email.

    1. had, amazingly, a 0% uplift. My later suggestion was to use preheader text that makes the reader take an action rather than summarizing the email as a second subject line.

    2. That, of course, was the big feather-in-the-cap kind of email. Open rates were obviously the same, but the click rate was 40% higher on mobile.


  • Aaron Beashel

    Hey Eyal

    Thanks so much for sharing your story.

    That’s really interesting about the preheader A/B test. I agree it’s easy just to summarise the email, but including a call to action to get them to open the email is great advice.

    Keep up the good work with the testing!

  • Fred Testard

    Thanks a lot for sharing!
    What’s great with testing is that, win or lose… you still learn something. Well, do you?
    One question I always have for marketers running tests is:
    OK, you’ve noticed an uplift in opens or clicks… but from whom?
    Did you get more opens from your prospects or from your customers?
    And in the process, did you get more or less engagement from your high-value customers?
    Testing is a double-edged weapon:
    – Very powerful when used properly, and definitely a quick win
    – Very dangerous if you draw the wrong conclusion from it

  • Fio

    Can I ask why you did not read the results the other way around?

    If the personalised approach worked just as well as the non-personalised one, arguably you could have tried to optimise the personalised email further to see if you would discover a ‘truer’ maximum in a follow-up test. In fact, I don’t think you can infer that “the personalised approach doesn’t work” for your audience from this result, and I wonder how/why you made this assumption.

  • Aaron Beashel

    Hey Fio

    You make a great point that we could have tried to optimize the personal approach email to find a higher maximum.

    The reason we didn’t really comes down to our brand. When people subscribe to the blog, they subscribe to hear from Campaign Monitor rather than ‘Aaron from Campaign Monitor’, so we’d actually rather take the summary approach than the personal one.

    The purpose of the test was to see whether we were missing *significant* click-throughs by not taking the personal approach. Given that we found we weren’t, we’ll continue using the summary approach as we feel that’s the best method for our wider brand and marketing goals.

    Hope that answers your insightful question.


  • vladmalik

    Are you still A/B testing? We’d love to see what you’ve tested. Would you be willing to share anything you’ve been testing recently? The more tests people log, the more we can start to see patterns among different tests. Please consider sharing anything that lost or had no effect, too. We don’t just want winning tests.
