What this article covers
A/B testing in ReferralCandy lets you compare two versions of a reward, friend offer, or welcome email side-by-side, then pick the version that drives more referrals based on real data.
A/B testing is available to both Shopify merchants (in the Shopify-embedded ReferralCandy app) and merchants on other platforms (via the standalone dashboard at my.referralcandy.com). The setup steps and behavior described below apply to both — only the entry path to the dashboard differs.
Two types of A/B testing in ReferralCandy
A/B testing in ReferralCandy comes in two distinct flavors, each accessed from a different button on the campaign's A/B Testing page.
1. Manual reward and friend offer test (Create test button) — You configure two variations of an advocate reward and/or a friend offer, run the test until you have enough data, then manually pick the winner. Use this when you want to compare specific reward values, types, tier structures, or friend-offer coupons.
2. Automated welcome email test (Run automated test button) — AI generates copy variations of your existing welcome email, splits sends 50/50, and picks a winner every 14 days before generating a new variant to test against. Use this when you want continuous, hands-off optimization of welcome email copy.
Both test types share the same 50/50 split. They can also run together — an automated welcome email test can run concurrently with a manual reward test on the same campaign.
A/B testing rewards and friend offers (manual test)
Manually compare reward variations or friend offers using the Create test button on the A/B Testing page. You can only run one manual test per campaign at a time.
What you can test
Advocate reward (or Affiliate reward for affiliate campaigns) — test across all reward types: cash (percentage or fixed amount), coupons, custom gifts, store credit, and Buy X Get Y.
Reward structures — test a multi-tier reward against a single-tier reward, or two multi-tier rewards with different numbers of tiers on each side.
Friend offer — test fixed-amount vs. percentage coupons. If your friend offer is set to Nothing, you'll be prompted to change it to a coupon before you can set up the test.
Only the reward type, value, and description are tested and compared. All other reward settings — expiry dates, minimum order amounts, discount-combination rules — stay the same for both variations.
Set up the test
From your ReferralCandy dashboard, open the campaign. For referral campaigns, scroll to Promote campaign and click A/B Testing. For affiliate campaigns, open Other settings > Advanced settings > A/B Testing.
Click Create test.
Enter a Test name.
In the A Control column, your campaign's current reward and friend offer are auto-populated. In the B Variation column, click Edit reward or Edit offer to set the variation you want to test.
Click Start test. The test begins immediately, and the test status changes to Active.
Note: While a manual test is running, you can't edit your campaign's current reward setup or change language settings. End the test to make those changes.
Manage and end the test
All your tests are listed on the A/B Testing page across three tabs: Active (live tests), Ended (completed tests), and All.
Manual tests don't end on their own — end them once you've gathered enough data. Click an active test to view the side-by-side metrics:
First referral revenue — Total revenue from referred friends' first purchases.
First referred purchases — Total number of first purchases by referred friends.
Advocate (or Affiliate) shares — Total referral-link shares.
Advocate (or Affiliate) views — Total page and email views.
Advocate (or Affiliate) share rate — Percentage of unique advocates who shared the referral link. This is the primary winner-determination metric.
Store visits — Number of visits to your store.
Friend views — Number of friends who saw the friend offer.
Referral store visit rate — Percentage of friends who visited the store after clicking a referral link.
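The rate metrics above are derived from the raw counts. As a worked example, here is how the advocate share rate could be computed from hypothetical share and view counts (the field names are illustrative, not ReferralCandy API fields; the shares-over-views formula follows the definition used for winner determination):

```python
# Hypothetical per-variant counts; names are illustrative only.
variant_a = {"shares": 120, "views": 800}

def share_rate(shares: int, views: int) -> float:
    """Advocate share rate as a percentage: shares / views * 100."""
    return round(shares / views * 100, 1) if views else 0.0

print(share_rate(variant_a["shares"], variant_a["views"]))  # → 15.0
```

A variant with 120 shares out of 800 views has a 15% share rate; you'd compare that figure against the other variant's before ending the test.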
When you're ready, click Review and end test, choose the variation to apply to your campaign, then click End test. The chosen variation's settings are immediately applied.
Important: Ended tests can't be restarted or undone.
A/B testing the welcome email (automated test)
The automated welcome email test is AI-driven. You turn it on with one click; the system handles the rest. Your only customization point is editing the existing welcome email — AI generates the variations from there.
How it works
The system creates a smart variation of your existing welcome email.
50% of your contacts receive the existing email (Variant A); 50% receive the AI-generated variation (Variant B).
Results are evaluated every 14 days. The winner becomes the new control, and the system generates a fresh variation to test against it. If there's no clear winner, the current test continues.
The test runs continuously until you stop it.
Only the text of the welcome email is tested — subject line, body copy, and CTAs. Branding and visuals (logo, colors, fonts) stay the same across both variants.
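Conceptually, each 14-day evaluation behaves like the sketch below — a simplified illustration of the promote-the-winner loop, not ReferralCandy's actual implementation (the dict keys and the click-rate comparison are assumptions):

```python
def evaluate_cycle(control: dict, challenger: dict) -> dict:
    """Pick the better-performing email by click rate.
    The winner becomes the new control; a fresh AI variant is then
    generated to test against it. Keys are illustrative only."""
    def click_rate(v: dict) -> float:
        return v["clicks"] / v["sends"] if v["sends"] else 0.0

    if click_rate(challenger) > click_rate(control):
        return challenger  # challenger is promoted to control
    return control         # no clear winner: the current test continues

winner = evaluate_cycle({"clicks": 40, "sends": 1000},
                        {"clicks": 55, "sends": 1000})
```

Here the challenger's 5.5% click rate beats the control's 4.0%, so it would become the new control for the next cycle.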
Start the automation
From your ReferralCandy dashboard, open the campaign and go to A/B Testing (under Promote campaign for referral campaigns, or under Other settings > Advanced settings for affiliate campaigns).
Click Run automated test.
If you'd like to update the existing welcome email before the test starts, click Edit welcome email. Otherwise, the existing email is used as Variant A.
Click Start test. AI generates the variation in the background and the test begins.
Stop the automation
Go back to the A/B Testing page and open the active welcome email test.
Review the side-by-side stats — emails sent, click rate, and clicks — for both variants.
Click Stop test. Choose between Variant A (Control) and Variant B (AI-generated), then click End test. Your chosen version becomes the default welcome email.
Once stopped, you can start a new automated test whenever you're ready.
What advocates and visitors see during a test
A/B testing splits your audience 50/50 between Variant A and Variant B. The split is automatic — you don't pick which contacts go where.
For manual reward and friend offer tests
When an anonymous visitor lands on your sign-up page, they're randomly assigned to either the A bucket or the B bucket. Each visitor sees the reward and friend offer copy of their assigned variant — and only that variant. The assignment is sticky via cookie, so a returning visitor sees the same variant on subsequent visits.
When a visitor signs up, the variant they were shown is locked to their contact record. They stay in that bucket for the duration of the test, and you can see which variant a contact was enrolled in from the contact's activity log.
Klaviyo users: When a contact enrolls in a test, their assigned variant is exposed as a custom property in Klaviyo. You can use this property to trigger different Klaviyo flows for each variant.
Edge case: if a visitor's browser doesn't store the cookie (incognito/private browsing, cookies cleared), they may see a different variant on a subsequent visit before they sign up. Once signed up, the variant is locked.
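The sticky-cookie bucketing described above can be sketched as follows — a minimal illustration of the general technique, assuming a hypothetical cookie name (ReferralCandy's actual cookie and mechanism aren't documented here):

```python
import random

COOKIE_NAME = "ab_variant"  # hypothetical cookie name, for illustration

def assign_variant(cookies: dict) -> tuple[str, dict]:
    """Return the visitor's variant and the cookie jar to persist.
    A returning visitor with the cookie keeps the same bucket (sticky);
    a new visitor is randomly assigned 50/50 to 'A' or 'B'."""
    variant = cookies.get(COOKIE_NAME)
    if variant not in ("A", "B"):
        variant = random.choice(["A", "B"])      # 50/50 random split
        cookies = {**cookies, COOKIE_NAME: variant}
    return variant, cookies

# A returning visitor sees the same variant on subsequent visits:
first_visit, jar = assign_variant({})
second_visit, _ = assign_variant(jar)
assert first_visit == second_visit
```

This also shows why the incognito edge case exists: with no stored cookie, `assign_variant` re-rolls the 50/50 split on the next visit.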
For automated welcome email tests
When a new advocate enrolls during an active welcome email test, they're randomly assigned to receive either Variant A or Variant B. The split is also 50/50 across all enrolled contacts.
Constraints
A/B testing has a few prerequisites and limits to know about:
Live campaign: A/B tests only run on Live campaigns. If you stop or pause a campaign, any active tests pause too — they resume when you reactivate the campaign.
English-only language: A/B testing is only available for English-only campaigns. On multi-language campaigns you'll see an "A/B testing unavailable" banner. Learn more about language settings.
One manual test per campaign: You can only run one manual test (rewards and/or friend offer) at a time on a given campaign. End the active test before starting another.
Friend offer set to Nothing: If you want to A/B test the friend offer, change it from Nothing to a coupon first. The setup will block until you do.
A/B testing FAQ
Why can't I A/B test my campaign?
A/B testing requires the campaign to be Live and to use English as the only language. Common reasons it isn't available:
Your campaign is paused or stopped. Learn about campaign states.
Your campaign uses multiple languages or a non-English language.
You already have a manual test running. End the active test before starting a new one.
For friend-offer testing: your friend offer is set to Nothing.
Can I A/B test branding elements in the welcome email?
No. The automated welcome email test only varies the email's text — subject line, body, CTAs. Branding elements (logo, fonts, colors, banner images) stay the same across Variant A and Variant B.
Can I run a manual reward test and an automated welcome email test at the same time?
Yes. The two test types are independent. You can have an automated welcome email test running while a manual reward or friend offer test is also active on the same campaign.
How is the winner determined?
For manual reward and friend offer tests, the primary winning metric is advocate share rate (advocate shares ÷ advocate views) — the variant with the highest share rate is the winner. The system also factors in statistical significance and a minimum sample size before declaring confidence.
For automated welcome email tests, the AI evaluates engagement metrics (click rate, clicks) every 14 days and picks the better-performing variant.
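The article doesn't specify which significance test is used for manual tests. One standard approach to comparing two share rates is a two-proportion z-test, sketched here with hypothetical counts (the function and thresholds are illustrative, not ReferralCandy's method):

```python
from math import sqrt

def two_proportion_z(shares_a: int, views_a: int,
                     shares_b: int, views_b: int) -> float:
    """z-statistic for the difference between two share rates
    (shares / views). |z| above ~1.96 corresponds to 95% confidence."""
    p_a, p_b = shares_a / views_a, shares_b / views_b
    pooled = (shares_a + shares_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Variant A: 9% share rate; Variant B: 13% share rate, 1000 views each.
z = two_proportion_z(90, 1000, 130, 1000)  # z ≈ 2.86 > 1.96: significant
```

With a large enough sample, a 9% vs. 13% share-rate gap clears the 95% confidence threshold; with only a handful of views, the same gap would not, which is why a minimum sample size matters before ending a test.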
