A/B Test

When it comes to engagement, even small variations in your automation emails can have a big impact. A/B testing gives you a way to determine what your participants respond to positively, giving you better insights to optimise your automation emails.

What Is A/B Testing?

A/B testing is a way to compare two versions of a single variable (in this case, the subject line of an email) by testing subscribers' responses to variable A against variable B.

In layman's terms, this means you create two subject lines for an email (Version A and Version B) and ReferralHero sends a random version to each subscriber. This ensures the test is randomised, so the results can hold statistical significance.
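The random split described above can be sketched in a few lines of Python. This is purely illustrative (the function name and the even 50/50 split are assumptions, not ReferralHero's actual implementation):

```python
import random

def assign_versions(subscribers):
    """Assign each subscriber a random subject-line version, A or B.

    Illustrative sketch only: each subscriber independently gets A or B
    with equal probability, so the split is randomised rather than
    alternating or manual.
    """
    return {email: random.choice(["A", "B"]) for email in subscribers}

groups = assign_versions(["ann@example.com", "bob@example.com", "cam@example.com"])
```

Because each assignment is independent and random, neither version is systematically sent to more engaged subscribers, which is what makes the open-rate comparison meaningful.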

How to run an A/B test

ReferralHero allows you to A/B test the Subject Line of your Automation emails. To enable A/B testing for an Automation email:

  • Go to your campaign dashboard > Automations > the email you want to use for the A/B test (e.g. the Welcome email)

  • Switch on the A/B the Subject Line toggle

  • A second field for Version B of your subject line will appear

  • Enter an alternate subject line

  • Save your changes

How to stop an A/B test

To stop an A/B test, simply switch off the A/B the Subject Line toggle and save your changes.

IMPORTANT: When you stop a running A/B test, all the stats about the test will be lost. Before you cancel it, make sure to check the report (see the section below).

How to determine a winner

After you start an A/B test for an email, ReferralHero automatically starts collecting stats about the test. You can view the report at any time by clicking the Report button in the top right corner.

Since we are only A/B testing the subject line, the number we are interested in is the open rate (shown in brackets). In this case, Version A has a 30.9% open rate versus Version B's 57%. Version B is the winner: you should stop the A/B test and use Version B's subject line as your main subject line.

When should you be confident enough to pick a winner? It's ultimately up to you to decide when an A/B test has produced significant results, but we think you can confidently pick a winner once you have sent at least 1K emails for each version and the difference between the open rates is at least 10%.
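The rule of thumb above can be expressed as a small helper. This is a sketch, not a ReferralHero feature: the send and open counts below are invented for illustration, and the code interprets "a difference of at least 10%" as a gap of 10 percentage points between the two open rates:

```python
def pick_winner(sent_a, opens_a, sent_b, opens_b,
                min_sends=1000, min_gap=10.0):
    """Apply the rule of thumb: at least 1,000 sends per version and an
    open-rate gap of at least `min_gap` percentage points.

    Returns "A" or "B" once both thresholds are met, or None while the
    test should keep running.
    """
    rate_a = 100 * opens_a / sent_a
    rate_b = 100 * opens_b / sent_b
    if min(sent_a, sent_b) < min_sends or abs(rate_a - rate_b) < min_gap:
        return None  # not enough data yet, or the gap is too small
    return "A" if rate_a > rate_b else "B"

# Hypothetical counts matching the report example above
# (371/1200 ≈ 30.9% for A, 684/1200 = 57% for B):
winner = pick_winner(1200, 371, 1200, 684)  # → "B"
```

With the example's open rates the gap is roughly 26 percentage points on 1,200 sends per version, so the helper declares Version B the winner; with smaller samples or a narrower gap it returns None, meaning you should let the test keep running.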