Admit it – when timelines get short or budgets are tight, the first thing to go out the window is often testing your email campaigns. This time, we’re not talking about the repetitive testing that goes into making a design look consistent across Hotmail and Gmail; we’re talking about optimizing the results of your sends, such as maximizing your open rate with a proven subject line, or selecting a solid call to action.

After reading Mark Brownlow’s insightful post on what parts of an email should be tested, we thought it was time to look at 3 simple tests and how you can get them to work for you in Campaign Monitor.

Test #1: Picking an Effective Subject Line

Subject lines are by far the most popular element of an email to test, for two reasons: a) designers are an indecisive lot, and b) it’s so very easy to do. But perhaps more importantly, the subject line plays a big part in determining whether the email gets opened at all – and with more opens, you consequently get more clicks, too.

For example, in last month’s newsletter, we had two subject lines in mind:

[Image: Subject lines]

By creating an A/B split campaign, it’s possible to test both subject lines by sending two versions of the email to a small portion of your subscribers. Once the test is complete, the most popular version is then automatically sent to the majority of subscribers. So, can you guess which of our subject lines won?

[Image: A/B test]

According to the estimate in our A/B testing reports, Version A resulted in an additional 1,640 opens, or a 6% increase over Version B. A modest improvement, but not bad for 30 seconds of work. Perhaps most importantly, we can look at this result alongside previous tests, or use it as a starting point for future ones, to determine what kinds of subject lines are most effective.
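If you’re curious about the mechanics, here’s a rough sketch of that send-test-then-roll-out flow in Python. The subscriber list, test-group size and open counts below are all invented for illustration – Campaign Monitor does the real thing for you automatically once the test completes.

```python
import random

# Illustrative only: made-up subscribers and open counts, not real campaign data.
subscribers = [f"user{i}@example.com" for i in range(10_000)]
random.shuffle(subscribers)

# Send each subject line to a small test slice (here, 10% per version).
test_size = int(len(subscribers) * 0.10)
group_a = subscribers[:test_size]
group_b = subscribers[test_size:2 * test_size]
remainder = subscribers[2 * test_size:]

# Opens per test group would come from your campaign reports; hard-coded here.
opens = {"A": 312, "B": 268}
open_rate = {version: count / test_size for version, count in opens.items()}

# The version with the higher open rate goes out to everyone else.
winner = max(open_rate, key=open_rate.get)
print(f"Version {winner} wins with a {open_rate[winner]:.1%} open rate; "
      f"sending it to the remaining {len(remainder):,} subscribers.")
```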

Test #2: A Cool Call to Action

“So a client *almost* didn’t want to run an A/B split test… But they did and will get 400%+ in clickthroughs. Yay!”
@lonnietapia

Although it takes a little more legwork than running an A/B split on a subject line, testing email content such as a call to action or button design is a swell way to generate more clicks and maybe even learn a bit about which elements work best in your email newsletter. Plus, the results can be compelling. As mentioned in Mark’s post, ‘I recommend (that) new email marketers test creative elements like adding call to action preheaders, call to action buttons and language, as these often provide big, immediate wins.’

As we’re going to stick with the easy stuff in this post, let’s look at an example from earlier this year – an A/B split campaign that used two different calls to action (CTAs). Can you guess which one won?

[Image: CTAs - Version A & B]

Both seemed like strong candidates, but the test results really set them apart:
[Image: A/B results]

The reports estimated that Version A received a 51% increase in clicks on the CTA compared to Version B. That’s the difference a line can make!
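For anyone wondering how a figure like that 51% is worked out, it’s simply the relative lift between the two versions’ click-through rates. Here’s the back-of-the-envelope arithmetic in Python, using hypothetical click counts rather than our actual report data:

```python
# Hypothetical click counts for two CTA versions, each sent to an
# equal-sized test group (illustrative numbers, not our actual results).
clicks_a, clicks_b = 302, 200
recipients_per_group = 5_000

ctr_a = clicks_a / recipients_per_group   # 6.0% click-through rate
ctr_b = clicks_b / recipients_per_group   # 4.0% click-through rate

# Relative lift of Version A over Version B.
lift = (ctr_a - ctr_b) / ctr_b
print(f"Version A: {ctr_a:.1%}, Version B: {ctr_b:.1%}, lift: {lift:.0%}")
# -> Version A: 6.0%, Version B: 4.0%, lift: 51%
```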

Test #3: Which Image Wins?

Another key suggestion from the post was to “…test (product) images: these can produce a big lift and are easy to do.” Like the A/B split campaigns above, this also requires two versions of your email content, but as our friend Anna Yeaman at Style Campaign observed from her A/B tests on animated vs. non-animated email newsletters, the results can be dramatic.

In her example, she created two versions of the same email – one included an animated gif, while the other displayed a static frame instead:

[Image: Version A]

[Image: Version B]

After running an A/B split on email content, the winner (Version B) was sent and consequently racked up an estimated 26% increase in click-throughs – all thanks to the small animation above.

We’ve Only Just Begun…

So, we’ve featured some fairly simple examples here, but as the folks at Performable point out, A/B testing doesn’t necessarily have to be about small changes – you can also use A/B split tests to determine which email layouts work best, how the tone of your copy affects response, and more. By testing two very different email designs, it’s possible, for instance, to determine whether a redesigned email newsletter is on the right track as a whole… But of course, this is quite different from determining which single CTA works best!

Also, it’s a good idea not to look at results from your tests in isolation, but to set controls, look for trends and take the bigger picture into account. For example, variables such as the time of day or time of year can affect campaign results – especially now, during the holiday period – so learnings from June’s campaigns may not be as applicable when sending in December.

Finally, as we’ve seen above, testing doesn’t have to be a time-consuming process and, ultimately, should be seen as an ongoing, well-recorded part of your campaigns. Even if you’re simply making small changes to your subject lines, at least you’re making improvements… And learning a little something more each time you send.

So now, it’s over to you. We’d love to hear about what tests you regularly perform on your campaigns – best, worst, or weirdest – or what you’re planning to try, so fill us in via the comments below.
