Testing, Testing, 1 2 3 (or A and B)


You’re sending out emails, but how do you know what’s working? 

To optimize engagement and understand what drives results, make email testing part of any strategic plan.

The most common approach, A/B email testing, compares two versions of an email to determine which performs best. 

The process, although a little time-consuming at first, is easy to do, and the results can help you shape your communications and work smarter.


Decide what to test 

What are your objectives and goals? While you might have a lot of great test ideas, remember to test one thing at a time.

Need to boost your open rates and get more eyes on your email content? Try subject line testing. Here are a few ideas to get you started:

  • First name personalization
  • Length
  • Time of day
  • Tone
  • Special characters

Looking to drive more clicks? Test your email’s CTA (call to action), the layout, or the styling:

  • CTA button placement
  • Header creative
  • CTA button style
  • CTA copy
  • Number of CTAs


Split your audience 

You’ll need a test and a control group. You don’t have to split your audience into two equal groups, but neither group should be too small, and the split needs to be randomized. Many email service providers will create the random split for you. Otherwise, pick an unbiased user variable, like last names starting with A-J, to split your audience. This helps keep your results statistically valid.
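
If you manage your own list outside an email service provider, the randomization itself is simple. Here is a minimal sketch in Python, assuming your subscribers live in a plain list of email addresses (the addresses below are placeholders):

```python
# Randomly split a subscriber list into two roughly equal groups.
# Placeholder data; swap in your own list export.
import random

subscribers = [
    "ada@example.com", "grace@example.com", "alan@example.com",
    "katherine@example.com", "edsger@example.com", "barbara@example.com",
]

random.seed(42)                 # fixed seed so the split is reproducible
shuffled = subscribers[:]       # copy so the original list stays intact
random.shuffle(shuffled)

midpoint = len(shuffled) // 2
version_a = shuffled[:midpoint]   # gets email version A
version_b = shuffled[midpoint:]   # gets email version B

print(f"A group: {len(version_a)} recipients, B group: {len(version_b)} recipients")
```

Because every subscriber has the same chance of landing in either group, this kind of split avoids the bias you might get from splitting by something like sign-up date.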


Let the test run

After you send your first email, allow for enough time to collect a good sample size of data — at least several days. 


Analyze your results 

Determine whether the numbers you’re seeing are statistically significant. If math isn’t your thing, plug your results into an online calculator like https://neilpatel.com/ab-testing-calculator/. Only pay attention to the metrics that apply to your test. For instance, compare open rates for subject line tests, and look at click-to-open rates when you test your CTA.

  • Open rate = (number of email opens) / (number of emails delivered)
  • Click-to-open rate = (number of clicks) / (number of email opens)
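
If you would rather run the numbers yourself than use the calculator above, the sketch below computes both rates and applies a standard two-proportion z-test, one common way to check whether a difference in open rates is statistically significant. Every count in it is a made-up placeholder, not a real campaign result:

```python
# Compare open rates from an A/B subject line test and check significance
# with a two-proportion z-test. All numbers here are placeholders.
from math import sqrt, erf

def open_rate(opens, delivered):
    return opens / delivered

def click_to_open_rate(clicks, opens):
    return clicks / opens

def two_proportion_p_value(successes_a, total_a, successes_b, total_b):
    """Two-sided p-value for the difference between two proportions."""
    p_a = successes_a / total_a
    p_b = successes_b / total_b
    pooled = (successes_a + successes_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    # Convert the z-score to a two-sided p-value via the normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical results for subject lines A and B
opens_a, delivered_a = 412, 2500
opens_b, delivered_b = 478, 2500

print(f"Open rate A: {open_rate(opens_a, delivered_a):.1%}")
print(f"Open rate B: {open_rate(opens_b, delivered_b):.1%}")
print(f"Click-to-open rate A: {click_to_open_rate(96, opens_a):.1%}")
print(f"Click-to-open rate B: {click_to_open_rate(121, opens_b):.1%}")

p_value = two_proportion_p_value(opens_a, delivered_a, opens_b, delivered_b)
print(f"p-value for the open-rate difference: {p_value:.3f}")
print("A p-value below 0.05 is commonly treated as statistically significant.")
```

The same test works for click-based metrics; just swap in clicks and opens as the success and total counts.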


Do something with your results

Apply what you learn from your testing to upcoming emails. If you don’t get statistically significant results, try testing again or changing a test variable.

Don’t be afraid to retest old results. User behavior is always changing, so what held true two years ago might not hold today.


Reach out to us

If you have questions, would like help setting up an A/B email test, or would like to see Marketing Communications’ past test results, contact Britt Olsen, Marketing Communications’ digital marketing specialist, at britt.olsen@ku.edu.