A/B Testing Automations
When it comes to engagement, even small variations in your automation emails can have a big impact. A/B testing gives you a way to determine what your participants respond to best, providing better insights to optimise your automation emails.
What Is A/B Testing?
A/B testing is a way to compare two versions of a single variable (in this case, the subject line of an email) by testing Subscribers' responses to Version A against Version B.
In layman's terms: you create two subject lines for an email (Version A and Version B), and ReferralHero sends a randomly chosen version to each subscriber. Because the assignment is random, the comparison between the two versions is fair, and with enough sends the results become statistically meaningful.
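ReferralHero handles this random split for you, but to illustrate the idea, here is a minimal sketch of one common way to assign versions (the function name and the hash-based approach are assumptions for illustration, not ReferralHero's actual implementation):

```python
import hashlib

def assign_variant(subscriber_email: str) -> str:
    # Hash the subscriber's email so the same subscriber always sees
    # the same version, while the audience splits roughly 50/50.
    digest = hashlib.sha256(subscriber_email.encode("utf-8")).digest()
    return "A" if digest[0] % 2 == 0 else "B"

print(assign_variant("jane@example.com"))  # -> "A" or "B", stable per subscriber
```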
How to run an A/B test
ReferralHero allows you to A/B test the Subject Line of your Automation emails. To enable A/B testing for an Automation email:
1. Go to your campaign dashboard > Edit Campaign > Automations and open the email you want to A/B test (e.g. the Welcome email)
2. Switch on the A/B Test Subject Line toggle; a second field for Version B of your subject line will appear
3. Enter an alternative subject line in the Version B field
4. Save your changes

How to stop an A/B test
To stop an A/B test, switch off the A/B Test Subject Line toggle and save your changes.
IMPORTANT: When you stop a running A/B test, all stats about the test are lost. Before you cancel it, make sure to check the report in your account under Analytics > Automations.
How to determine a winner
After you start an A/B test for an email, ReferralHero will automatically begin collecting stats. You can view the report at any time by navigating to Analytics > Automations.

Since we are only A/B testing the subject line, the number we are interested in is the open rate. Suppose, for example, that Version A has a 77.8% open rate versus a 60% open rate for Version B: Version A is the winning version, so you should stop the A/B test and use Version A's subject line as your main subject line.
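To make the comparison concrete, here is a small sketch of the arithmetic behind that decision (the send and open counts are hypothetical figures chosen to match the percentages above):

```python
def open_rate(opens: int, sends: int) -> float:
    # Open rate as a percentage of emails sent.
    return 100.0 * opens / sends if sends else 0.0

# Hypothetical counts matching the example above.
rate_a = open_rate(opens=778, sends=1000)  # 77.8%
rate_b = open_rate(opens=600, sends=1000)  # 60.0%
winner = "A" if rate_a > rate_b else "B"
print(f"Version A: {rate_a:.1f}% vs Version B: {rate_b:.1f}% -> Version {winner} wins")
```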
When should you be confident enough to pick a winner?
It's ultimately up to you to decide when an A/B test has produced significant results, but as a rule of thumb, you can confidently pick a winner once you have sent at least 1,000 emails for each version and the open rates differ by at least 10 percentage points.
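If you'd like to apply that rule of thumb programmatically, here is a minimal sketch that encodes it (the function name and thresholds are illustrative; this is a heuristic, not a formal significance test):

```python
def confident_winner(sends_a: int, opens_a: int,
                     sends_b: int, opens_b: int,
                     min_sends: int = 1000, min_gap: float = 10.0):
    # Returns "A", "B", or None (keep the test running) based on the
    # rule of thumb: at least 1,000 sends per version and an open-rate
    # gap of at least 10 percentage points.
    if sends_a < min_sends or sends_b < min_sends:
        return None  # not enough data yet
    rate_a = 100.0 * opens_a / sends_a
    rate_b = 100.0 * opens_b / sends_b
    if abs(rate_a - rate_b) < min_gap:
        return None  # difference too small to call confidently
    return "A" if rate_a > rate_b else "B"

print(confident_winner(1000, 778, 1000, 600))  # -> "A" (77.8% vs 60.0%)
```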