Email & SMS

How to A/B Test Email Campaigns: Ideas & Best Practices

By Tinuiti Team

In today’s fast-paced world of digital marketing, A/B testing has become a mainstay for brands looking to optimize their email campaigns. While many are using this technique, not all are harnessing its full potential. In this blog, we’ll dive into how A/B testing, once a niche approach, has now become a cornerstone of email marketing. We’ll uncover the common pitfalls that limit its effectiveness and offer insights into maximizing the overall benefits of A/B testing. 

It’s clear that A/B testing is the best way to continually refine email campaign strategies over time and improve performance through data-driven decision making, but how can we perfect this strategy as marketers? Let’s jump into the basics…


What’s A/B Testing in Email Marketing?


A/B testing is also called split testing or variable testing. When you conduct an A/B test, you compare two variables to see which one performs better. Email A/B testing uses this process to test variations of emails or email campaigns against each other to see which performs better for a specific metric, such as open, click-through or conversion rates.

Companies typically employ this technique by segmenting their email list into two groups, one receiving version A and the other version B, to test different variations of an email. Often the split is even, or the test can use a 10/10 split, with the remaining 80% of the list receiving the winning version. More advanced approaches include using holdout groups, where emails are tested on a subset (e.g., 10% or 20%) of the regular email list for that specific segment.

After a designated time frame, typically 1 or 2 hours, sufficient data is collected to determine which version performs better. The winning version is then sent to the remaining members of that segment, helping email marketers refine their campaigns for maximum effectiveness.
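The split-and-send flow described above can be sketched in a few lines of Python. This is an illustrative sketch only: the function names and the 10/10/80 proportions are assumptions for the example, not part of any specific email platform's API.

```python
import random

def split_for_ab_test(subscribers, test_fraction=0.10, seed=42):
    """Split a list into two 10% test groups (A and B) and an 80%
    remainder that will later receive the winning version."""
    rng = random.Random(seed)
    shuffled = subscribers[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    group_a = shuffled[:n_test]
    group_b = shuffled[n_test:2 * n_test]
    remainder = shuffled[2 * n_test:]
    return group_a, group_b, remainder

def pick_winner(opens_a, sends_a, opens_b, sends_b):
    """Compare open rates and return the better-performing version."""
    rate_a = opens_a / sends_a
    rate_b = opens_b / sends_b
    return "A" if rate_a >= rate_b else "B"
```

For a 1,000-subscriber segment this yields two test groups of 100 each; after the test window, the winner goes to the remaining 800.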


You can test simple or complex variations. Here are a few examples of A/B email testing:





Benefits of A/B Testing Emails


A/B testing for email campaigns doesn’t have to be difficult, especially when you use the right software and tools. These tests are highly effective, and more than 50% of marketers already use A/B testing to boost conversions. But conversion isn’t the only benefit of A/B testing…


A/B testing has other benefits, including:






Of course, there are several things to consider when A/B testing. You can only test one variable or element at a time; otherwise, you won’t know which element is responsible for any improved performance. It’s also important to keep in mind that privacy features like Apple’s Mail Privacy Protection, which will likely spread to other providers, make it difficult to track open rates consistently. This means you should focus on KPIs like clicks and conversions.


Try A/B Testing These 8 Email Elements


If you want to put email A/B testing to work for your organization, we’ve got eight variables you may want to test. Remember, choose one variable at a time when you set up your tests — otherwise, you muddy your data and won’t get any actionable insight.


1. Subject Lines


The subject line is one of the most important elements of any email because it’s a major factor in whether someone opens the email or not. This element typically shows up in bold right under the sender name or in another prominent location in the inbox.

Subject lines are a common variable of A/B testing because they’re so powerful and because they’re easy to test. You simply send the same email with different subject lines.


Here are some ideas to try when A/B testing email subject lines:





You can also test including emojis, symbols, or punctuation, or phrasing the subject line as a question.


2. Preview Text


The preview text, also commonly referred to as preheader text, is a snippet, summary, or sneak peek of the email’s contents. It shows up under the subject line on some devices. It’s not as powerful as the subject line itself, but you will only know whether it matters to your audience if you test it.


During this phase consider testing:





3. Sender Name


Sender name is what shows up in the “From:” field in an email. Emails sent by your brand might show up as “From: ABC Brand,” for example. Or you might create emails that come from specific people: “From: Sue at ABC Brand.”

What sender name will help build a personal connection with your audience best? You can’t know that until you conduct some A/B email testing.


Consider testing options such as:





4. Send Time


Google the best time to send emails and you’re likely to run across multiple articles stating that Tuesday afternoons are the ticket. In reality, Tuesday afternoons work best for some businesses. That doesn’t mean they will work best for you.

The only way you can know what day and time is best for your audience is to split test by sending emails at different times and narrowing it down for yourself.

You also have to account for trigger emails, which can’t all be sent on Tuesday afternoons. For example, you may find that cart abandonment emails work best when sent 2 hours after the person puts an item in the cart and welcome emails work best 10 minutes after sign up. Note that these aren’t recommendations; they’re examples. Run the tests for yourself to find out what works for your audience.

When you’re running A/B testing on email send times, remember to segment by time zone if possible. That way, you can figure out what’s best for each subsection of your audience.
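As a rough sketch of what time-zone segmentation involves, the helper below converts a desired local send time into UTC for each subscriber's zone. This is a hypothetical utility written for illustration, not part of any email platform.

```python
from datetime import date, datetime, time
from zoneinfo import ZoneInfo

def utc_send_time(send_date, local_time, tz_name):
    """Return the UTC datetime at which to send so the email arrives
    at `local_time` in the subscriber's own time zone."""
    local_dt = datetime.combine(send_date, local_time, tzinfo=ZoneInfo(tz_name))
    return local_dt.astimezone(ZoneInfo("UTC"))
```

For example, a 10:00 a.m. local send on a June date maps to 14:00 UTC for a New York subscriber but 17:00 UTC for one in Los Angeles, so each segment gets its own dispatch time.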


Ideas for A/B testing email send times include:





5. Call to Action


The CTA tells the email reader what to do next, so it’s one of the most important elements. A/B testing helps you refine CTAs to boost click-through rates.


Consider testing:







6. Email Copy


Most marketers agree that short and sweet is best when it comes to email copy. In fact, it can be a good idea to ensure you concentrate on a single idea in an email marketing message. Your copy also needs to be engaging and grab the attention of the recipient.

Of course, “attention-grabbing” is a subjective description, and what captures the eye of one audience won’t engage another. A/B testing helps you determine what copy works best for your audience. 

Test factors such as the length of your copy, the words and style of writing you use, whether you include personalization and the tone. For example, does your audience respond better to formal or informal writing?


7. Email Design & Layout


It only takes a couple of seconds before someone decides whether to continue reading your email or not. Email readers definitely judge a book by its cover, so to speak, so your design and layout matter.

Test out design and layout variations such as whether you include plain text or HTML or send emails with simple designs or messages with many bells and whistles.

An email marketing design that resonates with your audience improves click-through and conversion rates. It can also increase brand awareness and create positive downstream effects on marketing efforts outside of email. 


8. Images


If you have a little experience with social media marketing, you know that images are powerful. Facebook and Instagram posts with images get much more engagement on average than text-only posts. The same can be true for email.

A/B testing can help you understand how many images, and what kinds, resonate best with your audience.


Some ideas for A/B testing images in email include:







How to Run Better A/B Tests for Email Campaigns


Now that you have plenty of ideas for A/B email testing, let’s look at a few tips for running the best split tests you can.


Determine Which Variable to Test


Start by thinking about the business goal you want to meet. For example, if you want to increase conversion rates, you might work on optimizing your CTAs, as they relate directly to conversions. Subject lines impact open rates, while images and email copy can improve engagement and positive brand affinity.

Once you start working on a specific variable, repeat the test across different emails. This lets you collect more data and normalize it. Otherwise, other factors could inadvertently impact your test. 

You should also test with the right type of email. If you’re trying to figure out when the best time for cart abandonment emails is, testing with your monthly subscriber newsletter is pretty useless.


Select a Random Sample of Users


Select users randomly for tests and avoid using the same people for every test. Most businesses can test with around 20% of their list. However, if you only have a few hundred subscribers, 20% of that number won’t lead to statistically significant conclusions. In these cases, test with about 80% of your list.
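The sampling rule above can be sketched as follows. The cutoff separating a "small" list from a large one is an illustrative assumption (here, 1,000 subscribers), since the article does not specify an exact threshold.

```python
import random

def select_test_sample(subscriber_ids, small_list_threshold=1000, seed=None):
    """Randomly sample ~20% of a large list, or ~80% of a small one,
    so small lists still yield enough data points to draw conclusions.
    The 1,000-subscriber threshold is an assumption for illustration."""
    fraction = 0.20 if len(subscriber_ids) >= small_list_threshold else 0.80
    sample_size = round(len(subscriber_ids) * fraction)
    rng = random.Random(seed)
    return rng.sample(subscriber_ids, sample_size)
```

Using a fresh random sample (and a different seed) for each test helps avoid repeatedly testing on the same subscribers.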


Run the Test


It may seem simple, but the final step is actually running the test. Surprisingly, between 30 and 50% of organizations never get to this step.

Be patient as you wait for results. If you’re looking at a metric like open rates, you may have a pretty good idea of performance in just a few hours, and you’ll typically know which variation won within a day.

But other metrics, such as click-through and conversion rate, take longer to measure. That’s because someone may open your email and decide to come back to it later or think about your offer. For these types of metrics, you may need to let the test run for a few days to ensure you have a good sampling. 
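One common way to check whether a difference in click-through rates is a real effect rather than noise is a two-proportion z-test. The sketch below implements that standard statistical method in plain Python; it is offered as an illustration, not as something built into any email tool.

```python
import math

def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test: is the difference in click-through rates
    between versions A and B statistically significant?
    Returns the z statistic and the two-sided p-value."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # pooled rate under the null hypothesis of no difference
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

A p-value below 0.05 is a conventional cutoff for declaring a winner; until you reach it, let the test keep running.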


Final Takeaway


By leveraging A/B testing, you can support email marketing campaigns that perform better. Looking to learn more about this process? Reach out to our Tinuiti Email & SMS experts today.

