9 A/B email tests that help you sell more

You know that email marketing has an unbelievable ROI — and you know you need to test and optimize your content to get the best results.

But knowing is the easy part.

The harder part? Actually doing it.

A/B testing can feel overwhelming because you can test anything, from subject lines to pitches to design. It’s why 39% of brands skip the testing phase of their broadcast or segmented email marketing.

In this article, we’ll dig into 9 A/B tests that’ll help you convert more subscribers — without wasting your time or effort.

How does A/B testing work?

(For the uninitiated)

If you’re already an A/B testing wizard, go ahead and skip to the next section.

But if you’re new to it, keep reading.

A/B testing, also known as “split testing,” is a great way to improve your content over time. You can use A/B testing for emails, landing pages, SMS messages, and tons of other content types. The idea is to test one specific element of your content to see if it outperforms the original version.

Most email marketing platforms come with a built-in A/B testing feature that makes it easy. Here’s how it typically goes:

  1. Create an email.

  2. Duplicate the email or click a simple “A/B test” button in your email platform.

  3. Change one thing about the duplicated email (for example, a different subject line).

  4. Send both emails at the same time and split your audience — so half of your subscribers receive the original version, while the other half receive the version with the new subject line.

  5. Track performance to find out which content got the best results. (If you’re testing a subject line, for example, you’d want to look at which version got the highest open rates.)

  6. Apply your learnings to future content.

And that’s it! Pretty simple. 
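
Curious what that split looks like under the hood? Here’s a minimal sketch in Python. It’s purely illustrative (your email platform normally handles this for you), and the subscriber list and send functions are hypothetical placeholders.

    import random

    def split_test(subscribers, send_version_a, send_version_b):
        # Hypothetical helper: randomly split a list 50/50 and send one version to each half
        shuffled = subscribers[:]      # copy so the original list stays untouched
        random.shuffle(shuffled)       # randomize to avoid biased halves
        midpoint = len(shuffled) // 2

        group_a = shuffled[:midpoint]  # receives the original email
        group_b = shuffled[midpoint:]  # receives the variant (e.g., a new subject line)

        for address in group_a:
            send_version_a(address)
        for address in group_b:
            send_version_b(address)

        # Keep the groups so you can compare open or click rates later
        return group_a, group_b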

Here are just a few benefits of using A/B testing in your email marketing:

  • Data-backed strategy improvements (no guesswork required)

  • Better understanding of your audience’s interests and psychological motivators

  • Affordability (you’re already sending emails, so why not get free insights from them too?)

  • Improvements to the stats that matter most (you choose which ones to target)

  • An edge over the competition (remember: 39% of brands skip testing entirely)

If you aren’t using A/B testing in your marketing content already, I highly recommend working it into your process. It’s the easiest way to see what your audience likes and responds to.

Now that we’ve covered the basics, let’s dig into the dirty deets.

Why so many A/B tests fail

In theory, A/B testing is simple. But in practice, it’s more complicated.

That’s why a lot of us launch A/B tests that teach us absolutely nothing. We understand the simple concepts behind how it works — but we forget the smaller details that make or break our results.

To make sure you get actual value from each test, here are a few mistakes to avoid:

  • Giving up too soon: The best learnings are backed by lots of data. The more patient you can be with each A/B test, the more data you’ll collect — and the more value you’ll walk away with.

  • Getting attached: As a copywriter, I completely get this one. Sometimes you set up both sides of your test… but you really want “A” to win over “B.” Don’t let this cloud your judgment; your customers are the ones who ultimately decide what works best.

  • Leaving out your hypothesis: While you don’t want to get attached, you do need to go into each test with an idea of what you think will work. It doesn’t matter if you end up being right or wrong. But it’s important to define your current ideas and approach before you test whether they’re correct.

  • Not collecting enough data: This is similar to the “Giving up too soon” point, but not quite the same. Whether your test is live for one week or two years, you want to collect enough data to reach statistical significance; the larger your sample size, the more confident you can be that your results didn’t happen by chance. (There’s a rough sketch of this calculation right after this list.)

  • Skipping your follow-up tests: If you learn something from your A/B test, you’ve mastered step one. But most people forget about step two: creating a second, similar test to see if your results stay the same.

  • Testing things that don’t make the impact you want: Say you want to increase your email click-through rate. Testing your subject lines won’t help you do that, because they only impact whether someone opens your email — not whether they click on the CTA inside.  

  • Missing the “Goldilocks” tests: If you’re testing something too small, like a slightly different font size, it won’t give you much insight. If you’re testing something too big, like the entire body copy of your email, it’ll be hard to know exactly what caused the difference in performance. Go for tests that are just right.

  • Not applying your learnings: It’s crazy how many marketers conduct A/B tests… then continue what they’re doing without changing a thing. If you’re going to take the time to test your performance, make sure you’re using your results to inform future campaigns.

  • Testing multiple things at once: Say email A has a different hook, design, and CTA than email B. If version A performs better, you won’t be able to tell if it’s because of the hook, design, or CTA. Limit your tests to just one variable so you know for sure what’s driving the change.
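
A quick aside on “statistical significance,” since it decides whether a test actually taught you anything. Here’s a rough sketch of a two-proportion z-test you could run on the open rates from two versions. The numbers are made up for illustration, and most email platforms will do this math for you.

    import math

    def significance(opens_a, sends_a, opens_b, sends_b):
        # Two-proportion z-test: is the gap between two open rates bigger than chance?
        p_a = opens_a / sends_a
        p_b = opens_b / sends_b
        pooled = (opens_a + opens_b) / (sends_a + sends_b)  # combined open rate
        se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
        z = (p_b - p_a) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
        return p_a, p_b, p_value

    # Hypothetical results: 500 sends per version
    p_a, p_b, p = significance(opens_a=110, sends_a=500, opens_b=135, sends_b=500)
    print(f"A: {p_a:.1%}, B: {p_b:.1%}, p-value: {p:.3f}")
    # Prints roughly: A: 22.0%, B: 27.0%, p-value: 0.066
    # Below the usual 0.05 cutoff? Not yet. Keep collecting data before you call a winner.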

 

Now for the nitty-gritty

Now that we’ve got the big-picture tips taken care of, let’s zoom in.

Here are 9 A/B tests that are actually worth your time if you’re looking to sell more through your email marketing.

 

1. Subject lines

This is probably the most common A/B test you’ll hear about in email marketing. The trick for a great subject line test is to define the reason behind it — and get super specific about your hypothesis.

It’s easy to write two subject lines, slap them into a test, and see which one gets more opens. But without a question and hypothesis, what does that mean?

Here are a few examples of good (and bad) subject line tests:

Questions vs. statements

Good:

  • A: “Are your emails boring?”

  • B: “How to tell if your emails are boring”

Bad:

  • A: “Are your emails boring?”

  • B: “If your emails don’t convert, do this”

    ^This example could muddy up the results by saying “convert” instead of “boring.” What if “convert” is just a stronger word?

Numbers vs. no numbers

Good:

  • A: “9 A/B tests to try this week”

  • B: “Try these A/B tests this week”

Bad: 

  • A: “9 A/B tests to try this week”

  • B: “Have you tried these A/B tests?”

^This example brings a question into the mix. You’d either want to use something with a format that matches version A (like the first example), or adjust version A to be a question as well (“Have you tried these 9 A/B tests?”). 


One emoji vs. another emoji

Good:

  • A: “Special offer inside! 😍”

  • B: “Special offer inside! 👉”

Bad: 

  • A: “Special offer inside! 😍”

  • B: “Special offer inside! 🎉”

^Both of these emojis convey excitement, so this test is unlikely to tell you much about your open rates. With the first example, you’d be testing excitement vs. direction, a learning you can easily apply to future emails.

Stat to track for subject line tests: Open rates

 

2. CTA format

There are only a few ways to format your email CTAs:

  • Hyperlinks

  • Buttons

  • Raw URLs

I think we can all agree that raw URLs aren’t ideal — they look a bit spammy and ugly, and don’t give you the chance to get creative about your CTA phrasing.

That leaves us with hyperlinks and buttons. 

I gave it a Google, and got two different answers:

  • Google’s AI overview says that buttons are clearer, more attention-grabbing, and more attractive to readers

  • A Reddit user posted about how hyperlinks outperformed buttons over the course of 40 split tests

Obviously, it’s worth seeing what clicks best with your own unique audience (pun fully intended).

The only catch: You want to make sure you keep your phrasing consistent in both your button and hyperlink version. For example…

  • Button: “Get your guide”

  • Hyperlink: If you’re ready to get started, get your guide here.

Stat to track for CTA formatting tests: Click-through rates

 

3. CTA phrasing

The way you phrase your CTA is just as important — if not more important — than the way you format it.

We already know that great CTAs should be clear, actionable, and urgent. But past that, it’s up to you to figure out what motivates your unique audience the most.

Just like your other A/B tests, you’ll want to make sure you keep both versions as similar as possible; the phrasing should be the only difference. This includes formatting.

Here are a few things you might want to test:

  • First-person vs. second-person voice:

    • “Get my guide” vs. “Get your guide”

  • Urgency vs. brevity:

    • “Get your guide now” vs. “Get your guide”

  • Outcome vs. clear instructions:

    • “Start selling more” vs. “Get your sales guide”


Stat to track for CTA phrasing tests: Click-through rates

 

4. Style + approach

Does your audience resonate with stories?

Stats and research?

Straightforward pitches?

Find out!

With this test, make sure you keep everything other than your body copy exactly the same between both versions — including your subject line, hook, design, CTA, copy length, etc.

That way, you can attribute any performance changes to the writing angle you’re testing (rather than other aspects of the email).

Stat to track for style + approach tests: Click-through rates

 

5. Length

How much is your audience willing to read?

Generally, not a lot — but it’s still worth testing out.

To test your ideal email length, keep the core message the same between both versions. Then, in one version, add more detail, stats, storytelling, or another type of content that supports your pitch and provides more value to your readers.

This is the kind of test you can repeat over time until you find the *perfect* length for your readers. For example…

  • Test 1: 50 words vs. 250 words (250 words wins)

  • Test 2: 250 words vs. 500 words (250 words wins)

  • Test 3: 250 words vs. 100 words (100 words wins)

  • Test 4: 100 words vs. 200 words (100 words wins)

  • Test 5: 100 words vs. 150 words (150 words wins)

  • Result: 100-150 words is the ideal email length for your audience.

Tip: If possible, try to conduct these tests around the same type of email content (or the exact same email, if you can make it work). It can be a newsletter, a sales pitch, a welcome message, a nurture email… as long as it’s the same between all your tests, you’re doing it right.

Stat to track for email length tests: Click-through rates

 

6. Hooks

What grabs your audience’s attention the most?

This test is extra-important — because if you can’t hook your readers in the first sentence, they aren’t going to read the rest of your message. And they won’t make it to your CTA.

Make sure the hook is the only thing you change between both versions of your email. Here are a few hook ideas to test:

  • Questions

  • Storytelling

  • Stats and research

  • Jumping straight to your pitch

  • Teasing what they’ll get if they keep reading

Stat to track for email hook tests: Click-through rates

 

7. Send times 

When does your audience respond best to your messages?

If your list mostly consists of adults with 9-5 jobs, it might be evenings and weekends. If it’s college-aged shoppers, it might be late mornings or afternoons. If it’s entrepreneurs, it might be mornings at the end of the work week.

But no matter who you’re emailing — testing your send times is the only way to know for sure.

For this test, both email versions should be exactly the same. The only difference is when it goes out to each half of your list. 

Bonus: Once you nail down your perfect send times, you can start looking into your ideal sending frequency. For example, you can split your lead nurturing funnel to send the exact same 5 emails — but version A sends a daily email over 5 days, while version B sends a weekly email over 5 weeks.

Stat to track for send time + frequency tests: Open rates

 

8. Formatting + design

Email builders have come a long way in design and formatting options. Some brands do better with those fancy designs… but for others, a straightforward plain-text email gets more people to read and engage.

Try sending two versions of the same email: One with design, and one without.

Once you choose a winner, you can move on to test formats:

  • Does your audience like to see more images?

  • Bullet points and numbered steps?

  • Different brand colors?

The possibilities are endless.

Stat to track for formatting tests: Click-through rates

 

9. Emotions

Tapping into your readers’ emotions is a core part of copywriting — and with this test, you can figure out which emotions drive the most action.

This one can be tricky, because it’s more nuanced than simply changing a subject line or CTA copy. You’ll want to keep the overall message, structure, and content the same between both versions, while making small tweaks that cater to different emotions.

For example, here’s how you can change one sentence to appeal to different emotions (without changing its overall structure and message):

  • Fear:

    “Investing in email marketing feels daunting — you have limited time and resources, and you don’t want to waste them on something that doesn’t work.”

  • Inspiration:

    “Investing in email marketing feels exciting — you have something great to offer, and email is a free channel with untapped potential.”

  • Greed:

    “Investing in email marketing feels like a no-brainer — you don’t have to spend a dime, and you can expect a better ROI from your sales than any other channel.”

  • Anger:

    “Investing in email marketing feels relieving — instead of wasting ad dollars and burning out from social media, you can finally get results that match your efforts.”

Stat to track for emotion tests: Click-through rates

 

You’ve got the ideas. But where do you start?

Now that you know what to test, you can use a little thing called the “ICE Framework” to decide what to do first.

Coined by Sean Ellis, the ICE Framework helps you prioritize the order of your A/B tests (or any task, for that matter) in terms of three factors:

  1. Impact: How big of a difference will this make for your bottom line?

  2. Confidence: How much certainty do you have about the impact it’ll make?

  3. Ease: How easy will it be to get done? (The less work required, the higher the score.)

To use this prioritization method, you’ll:

  1. Pick one test from above.

  2. Assign a score from 1 to 10 for each of the three factors.

  3. Add those three scores together to determine its priority (3 being the lowest priority, 30 being the highest).

  4. Repeat for each type of test.

  5. Compare the scores and start with the highest one. 
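
If a spreadsheet feels like overkill, the scoring takes just a few lines of code. Here’s a quick sketch; the tests and scores below are hypothetical.

    # Hypothetical ICE scores (1-10 for each factor) for a few of the tests above
    tests = {
        "Subject lines": {"impact": 6, "confidence": 8, "ease": 9},
        "CTA phrasing":  {"impact": 7, "confidence": 6, "ease": 8},
        "Send times":    {"impact": 5, "confidence": 4, "ease": 9},
        "Emotions":      {"impact": 8, "confidence": 5, "ease": 4},
    }

    # Add up each test's three scores (3 = lowest priority, 30 = highest) and rank them
    ranked = sorted(tests.items(), key=lambda item: sum(item[1].values()), reverse=True)

    for name, scores in ranked:
        print(f"{name}: {sum(scores.values())}")
    # Subject lines (23) comes out on top, so that's the test to run first.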

From there, you can tackle one A/B test at a time — and get one step closer to better email performance, increased conversion rates, and more sales.


👉 Need any help? I specialize in email marketing for businesses like yours. If you want professional feedback — or need someone to handle the whole dang process for you — here’s how I can help.
