A/B testing your triggered emails with Customer.io

A/B testing lets you test changes to your emails and measure whether a change performs better than the original. You can test subject lines against one another, try out a new design, or experiment with the text of your call to action. The sky is the limit.

You can test multiple emails in every triggered campaign. We recommend making just one change per variation. This gives you an accurate picture of how that change measures up against the original, without other variables muddying the results.

How do I set up an A/B test?

First things first, you’ll want to head to the campaign where you want to run your test. In this example we’ll be testing with a campaign called “My Segment Triggered Campaign”.

  1. Click on the “Workflow” tab at the top of the screen.
  2. Click on the email you want to add a test to.
  3. Press the “Add A/B Test” button.

Add a variation

By default, the A/B test is inactive as indicated by the small “Inactive” label, so 100% of the traffic goes to the original, and 0% to the variation. Leave this as-is while you make your change(s).

A/B Test inactive

Make your changes to the variation. You can change almost anything about the email: the from address, the subject, the body, even the sending mode (“Queue Draft” vs. “Send automatically”). The delay and time window are shared across both versions, though.

If you have a low volume of emails being sent or are just starting with A/B testing, try testing a subject line change and measuring opens.
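
If you’re wondering what counts as “low volume”, here’s a rough, back-of-the-envelope sketch of how many recipients each variant needs before an open-rate difference becomes detectable. This is standard two-proportion sample-size math, not anything Customer.io-specific, and the 20% and 25% open rates are made-up examples:

    # Rough sketch (standard stats, not Customer.io's internals): how many
    # recipients each variant needs before a subject-line test can detect
    # a given lift in open rate.
    from statistics import NormalDist

    def sample_size_per_variant(p_base, p_variant, alpha=0.05, power=0.8):
        """Two-proportion sample size, using the normal approximation."""
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided 95% test
        z_beta = z.inv_cdf(power)           # 80% power
        p_bar = (p_base + p_variant) / 2
        numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                     + z_beta * (p_base * (1 - p_base)
                                 + p_variant * (1 - p_variant)) ** 0.5) ** 2
        return numerator / (p_base - p_variant) ** 2

    # Detecting a 20% open rate rising to 25%:
    print(round(sample_size_per_variant(0.20, 0.25)))  # about 1094 per variant

The takeaway: small differences need a lot of sends to verify, which is why measuring opens (your most frequent email event) is a good place to start.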

When your changes are ready, give the email a 50/50 split (or 80/20, 30/70…it’s up to you) by clicking the A/B test bar.

Start A/B test
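
For the curious: a percentage split like this is commonly implemented by bucketing each recipient deterministically, so the same person always gets the same version. The sketch below shows one generic way to do that with hashing; it’s purely illustrative, not Customer.io’s actual implementation:

    # Generic illustration of honoring a split like 50/50 or 80/20 by
    # hashing each recipient into a stable bucket (NOT Customer.io's code).
    import hashlib

    def assign_variant(recipient_id, test_id, original_pct=50):
        """Deterministically map a recipient to 'original' or 'variation'."""
        digest = hashlib.sha256(f"{test_id}:{recipient_id}".encode()).hexdigest()
        bucket = int(digest, 16) % 100  # stable value in 0-99
        return "original" if bucket < original_pct else "variation"

    print(assign_variant("cust_42", "subject-line-test"))  # same answer every run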

You’ll now see an “A/B Test” tab appear next to the “Goal” tab. That’s where you can go to see the results of the test and pick a winner.

A/B Tests tab

When you have statistically significant results, or if you want to end the test before that, pick one of the options as the winner by clicking the “Select winner” button. We’ll remove the losing option, and that particular A/B test will disappear from the screen.

Note:

After ending the test, you will no longer be able to see your non-winning content or your A/B test results.

Understanding your test results

Want more information about what statistical significance means, or how we calculate the numbers behind your results? Check out Understanding Your A/B Test Results.
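
To give a flavor of the kind of check behind “statistically significant”, here’s a hedged sketch of a two-proportion z-test comparing open rates. The counts are invented, and this is textbook math for illustration, not Customer.io’s exact calculation:

    # Illustrative two-proportion z-test on open rates (made-up counts;
    # not Customer.io's exact method).
    from statistics import NormalDist

    def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
        """Return (z, two-sided p-value) for the difference in open rates."""
        p_a, p_b = opens_a / sent_a, opens_b / sent_b
        p_pool = (opens_a + opens_b) / (sent_a + sent_b)
        se = (p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b)) ** 0.5
        z = (p_b - p_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    z, p = two_proportion_z_test(opens_a=200, sent_a=1000, opens_b=250, sent_b=1000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 suggests a real difference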