
Admit it – when timelines get short or budgets are tight, the first thing that goes out the window is often testing your email campaigns. This time, we’re not talking about the repetitive testing that goes into making a design look consistent across Hotmail and Gmail; we’re talking about optimizing the results of your sends, such as maximizing your open rate with a proven subject line, or selecting a solid call to action.

After reading Mark Brownlow’s insightful post on what parts of an email should be tested, we thought it was time to look at 3 simple tests and how you can get them to work for you in Campaign Monitor.

Test #1: Picking an Effective Subject Line

Subject lines are by far the most popular element of an email to test, for two reasons: a) designers are an indecisive lot, and b) it’s so very easy to do. But perhaps more importantly, the subject line plays a big part in determining whether the email gets opened at all – and with more opens, you consequently get more clicks, too.

For example, for last month’s newsletter we had two subject lines in mind:

[Image: the two subject lines, Version A and Version B]

By creating an A/B split campaign, it’s possible to test both subject lines by sending each version of the email to a small portion of subscribers. Once the test is complete, the winning version is then automatically sent to the remaining subscribers. So, can you guess which of our subject lines won?

[Image: A/B test results]

According to the estimate in our A/B testing reports, Version A resulted in an additional 1,640 opens, or a 6% increase over Version B. A modest improvement, but not bad for 30 seconds of work. Perhaps most importantly, we can look at this result alongside previous test results, or use it as a starting point for future tests to determine what kinds of subject lines are most effective.
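
If you’re curious about the mechanics, here’s a minimal sketch in Python of the split-and-pick-winner logic described above, including the lift calculation. The send_campaign helper is a hypothetical stand-in, not Campaign Monitor’s actual API – when you create an A/B split campaign, all of this is handled for you.

    import random

    def run_ab_split(subscribers, version_a, version_b, send_campaign,
                     test_fraction=0.2):
        # Shuffle so the test groups are a random sample (mutates the list).
        random.shuffle(subscribers)
        test_size = int(len(subscribers) * test_fraction)
        group_a = subscribers[:test_size // 2]
        group_b = subscribers[test_size // 2:test_size]
        remainder = subscribers[test_size:]

        # send_campaign(version, recipients) is assumed to send the email
        # and return the number of opens it produced.
        opens_a = send_campaign(version_a, group_a)
        opens_b = send_campaign(version_b, group_b)

        # Compare open *rates*, in case the two groups differ in size.
        rate_a = opens_a / len(group_a)
        rate_b = opens_b / len(group_b)
        winner = version_a if rate_a >= rate_b else version_b

        # Relative lift of the winner over the loser, e.g. the ~6% above.
        low, high = sorted([rate_a, rate_b])
        lift = (high - low) / low if low else float("inf")
        print(f"Winner: {winner!r} with an estimated {lift:.0%} lift in opens")

        # The winning version goes to everyone who wasn't in the test.
        send_campaign(winner, remainder)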

Test #2: A Cool Call to Action

“So a client *almost* didn’t want to run an A/B split test… But they did and will get 400%+ in clickthroughs. Yay!”
@lonnietapia

Although it’s a little more legwork than running an A/B split on a subject line, testing email content such as a call to action or button design is a swell way to generate more clicks, and maybe even learn a bit about which elements work best in your email newsletter. Plus, the results can be compelling. As mentioned in Mark’s post: ‘I recommend (that) new email marketers test creative elements like adding call to action preheaders, call to action buttons and language, as these often provide big, immediate wins.’

As we’re going to stick with the easy stuff in this post, let’s look at an example from earlier this year – an A/B split campaign that used two different calls to action (CTAs). Can you guess which one won?

[Image: the two CTAs, Version A and Version B]

Both seemed like strong candidates, but the test results really set them apart:
[Image: A/B results]

The reports estimated that sending Version A resulted in a 51% increase in clicks on the CTA. That’s the difference a line can make!

Test #3: Which Image Wins?

Another key suggestion from the post was to ‘…test (product) images: these can produce a big lift and are easy to do.’ Like the A/B split campaign above, this also requires two versions of your email content, but as our friend Anna Yeaman at Style Campaign observed from her A/B tests of animated vs. non-animated email newsletters, the results can be dramatic.

In her example, she created two versions of the same email, but one included an animated gif, while the other displayed a static frame instead:

[Image: Version A]

[Image: Version B]

After running an A/B split on email content, the winner (Version B) was sent and consequently racked up an estimated 26% increase in click-throughs – all thanks to the small animation above.
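
One caveat before celebrating: with small test groups, a gap like that 26% can be plain luck. A quick two-proportion z-test is one common way to sanity-check a winner before trusting it – the counts below are invented for illustration, not Anna’s actual data.

    from math import erf, sqrt

    def two_proportion_p_value(clicks_a, sent_a, clicks_b, sent_b):
        # Two-sided p-value for the difference between two click rates.
        p_a, p_b = clicks_a / sent_a, clicks_b / sent_b
        pooled = (clicks_a + clicks_b) / (sent_a + sent_b)
        se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
        z = (p_a - p_b) / se
        # Convert the z-score to a p-value via the normal CDF.
        return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

    # Invented numbers: Version B gets ~26% more clicks than Version A.
    p = two_proportion_p_value(clicks_a=240, sent_a=4000,
                               clicks_b=302, sent_b=4000)
    print(f"p-value: {p:.3f}")  # ~0.006 here; under 0.05 suggests a real difference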

We’ve Only Just Begun…

We’ve featured some fairly simple examples here, but as the folks at Performable point out, A/B testing doesn’t necessarily have to be about small changes – you can also use A/B split tests to determine which email layouts work best, how the tone of your copy affects response, and more. By testing two very different email designs, you can, for instance, determine whether a redesigned email newsletter is on the right track as a whole… though of course, that’s quite different from determining which single CTA works best!

Also, it’s a good idea not to look at results from your tests in isolation: set controls, look for trends and take the bigger picture into account. For example, variables such as the time of day or time of year can affect campaign results – especially now, during the holiday period – so learnings from June’s campaigns may not be as applicable when sending in December.

Finally, as we’ve seen above, testing doesn’t have to be a time-consuming process, and ultimately it should be seen as an ongoing, recordable feature of your campaigns. Even if you’re simply making small changes to your subject lines, at least you’re making improvements… and learning a little something more each time you send.
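
To make that record-keeping concrete, even something as simple as appending each outcome to a CSV gives you a trend line over time. The fields here are just a suggestion:

    import csv
    from datetime import date

    def log_test_result(path, element, winner, loser, lift):
        # Append one A/B test outcome to a running log for trend-spotting.
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow(
                [date.today().isoformat(), element, winner, loser, f"{lift:.0%}"]
            )

    log_test_result("ab_tests.csv", "subject line",
                    winner="Version A", loser="Version B", lift=0.06)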

So now, it’s over to you. We’d love to hear which tests you regularly perform on your campaigns – best, worst, or weirdest – or what you’re planning to try, so fill us in via the comments below.

  • Chris

    You conclude from the first test that the better open rate is due to a more detailed subject line. Why is the reason not that your readers love reading about REST services? Or that the recipients like longer subjects? Seems like an arbitrary conclusion to draw and is a good example of how while A/B tests can be powerful tools, you need to be careful about jumping to conclusions about what the real difference between them is.

  • Jarrod Taylor

    Hi Chris, you’re correct that it’s unwise to look at results in isolation and jump to conclusions so soon. I should have really stated that the result should be viewed in the context of similar tests, or used as a starting point (as touched on later in the post), so I’ll give it a tweak now. Thanks for your feedback! :)

  • Eric Overklift Vaupel Kleijn

    Very nice indeed. Testing is still key for improving your results. Every case is different so you can only find out by testing this stuff.

  • Julia

    Any chance we can get a whitelabel version of this to send to clients? Or are you happy for us to use this content and not reference back to Campaign Monitor?

    If not, totally cool – but I just thought I’d ask :)

  • Ros Hodgekiss

    Hi Julia, by all means use the content for your clients – we’d be honoured :) Have an awesome holiday season ahead!

  • Lonnie Tapia

    Thanks for including my quote from Twitter. We’ve seen higher click-through rates even when it’s just a subject line A/B split test.

    Sometimes the payoff is given away in the subject line, so people don’t want to click to “learn more.” You want to entice first, then pay off – not the other way around.

    Thanks again.

  • Paul Stokes

    Hi Ros, great tips. It seems I’m so busy running in circles that I forget the payoff of better open rates and actions. My goal for 2011 is to spend an extra hour on each campaign doing A/B testing – as your blog shows, the results are worth the effort.

    Best to you in 2011. Happy Sailing … _/) Paul

  • Nicole

    Emails don’t get opened immediately, so how much time do you guys set aside between the A/B testing (of subject line) and determining which works better, thus sending to the full database?

  • David Greiner

    Nicole, we let you set that time yourself when setting up the test. It might be one hour or three weeks – it’s completely up to you. You can also make the call after the test has been running for a little while and choose the winner yourself based on the results so far.
