At a glance: Use AppsFlyer to A/B test media sources, campaigns, creatives, targeting, and ad placement.
What is A/B testing?
A/B testing is a method used by marketers to test different elements of their ads and campaigns. Each ad contains elements (targeting, visuals, text) that may affect the conversion rates of potential users. Finding the best-performing variations of your ad elements can have a significant impact on the bottom line of your mobile marketing efforts.
This article explains how advertisers can perform A/B tests using AppsFlyer, to find the best performing ad elements.
How to A/B test mobile campaigns
To A/B test an element, split your mobile traffic 50/50 between two sources showing an identical offer. Next, create two variations of the element that you want to test. After sufficient traffic has flowed through, compare the results of your existing version (the control) with the new version (the variation).
The following steps apply to an A/B test you run using AppsFlyer.
Setup:
- Select the element to test.
In this article, we explain A/B testing creatives, ad placements, media sources, and campaigns.
- Create two identical campaigns (or ads), and then vary a single element within the ad. For example, the call-to-action button copy.
- To test the impact of the variation on install conversion rates, split the traffic equally between the two campaigns. If full impression data is available for both sources, divide installs by impressions to calculate the install conversion rates.
- To test the impact of the variation on the quality of users post-install, you can send different amounts of traffic to the two campaigns.
- Drive sufficient traffic to each campaign to reach statistical significance; that means at least 50 installs per campaign. Use the A/B testing significance calculator.
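The significance check above can be sketched in code. The following is a minimal illustration (not AppsFlyer tooling, and not a substitute for the significance calculator): a two-proportion z-test on the install conversion rates of the control and variation campaigns. All function names are illustrative.

```javascript
// Install conversion rate: installs divided by impressions.
function installCvr(installs, impressions) {
  return installs / impressions;
}

// z-score for the difference between two install conversion rates,
// using the pooled proportion for the standard error.
function zScore(installsA, impsA, installsB, impsB) {
  const pA = installCvr(installsA, impsA);
  const pB = installCvr(installsB, impsB);
  const pPool = (installsA + installsB) / (impsA + impsB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / impsA + 1 / impsB));
  return (pA - pB) / se;
}

// |z| > 1.96 corresponds to roughly 95% confidence in a two-sided test.
function isSignificant(installsA, impsA, installsB, impsB) {
  return Math.abs(zScore(installsA, impsA, installsB, impsB)) > 1.96;
}
```

For example, 500 installs from 10,000 impressions versus 400 installs from 10,000 impressions is a significant difference at the 95% level, while 55 versus 50 installs from 1,000 impressions each is not; small samples are why the minimum-installs guideline matters.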
Analysis:
- Use AppsFlyer's dashboard and data to compare the performance of the key KPIs between the two campaigns.
- Select the winning variation and use it in your live campaigns.
A/B testing creatives and targeting
Creatives contain many elements with varying impact on user conversion rates, including copy, images, colors, fonts, buttons, etc. Targeting parameters, like country, age, or gender, may also have a major impact on conversion rates. And for iOS 15+, you can also have up to 35 custom product pages. A/B testing these elements can help you find which factors have the biggest impact on improving user conversion rates.
Below we explain two methods of A/B testing using:
- A single media source
- Multiple media sources
Method 1: A/B testing with a single media source
Setup:
- Select a trusted media source with sufficient traffic.
- Create two campaigns or Smart Banners in AppsFlyer by using one of the following methods:
- Vary the campaign name, that is, the "c" parameter in the attribution link.
- Alternatively, you can create two ad sets ("af_adset=") within a campaign, or two ads ("af_ad=") within an ad set. Note: with Smart Banners, you can also A/B test by setting different parameters for each banner within the same banner group.
- For iOS 15+, you can also test the efficacy of different custom product pages, by sending different OneLink URLs to different product pages. Learn more
- Use the same creative in both campaigns, except for the tested variation.
- Send the same amount of traffic to both campaigns.
Analysis:
- Compare KPIs (for example, install conversion rate, ROI, ARPU, loyal user rate) of the campaigns in the Overview dashboard page.
Method 2: A/B testing across multiple media sources
Setup:
- Select a subscriber parameter to hold the variation value. For example, to test 2 CTA text variations you can use af_sub1="install now" or af_sub1="please install". More about subscriber parameters and a creative test example here.
- Add the subscriber parameter with the matching value to every live attribution link.
- It's recommended to use the same creatives in all campaigns, except for the tested variation.
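The tagging step above can be automated. This is a minimal sketch, assuming a JavaScript environment with the standard URL API; the helper name and base link are illustrative, and you would substitute your real attribution links.

```javascript
// Append (or overwrite) the chosen subscriber parameter on an
// attribution link, preserving all existing query parameters.
function tagLinkWithVariation(attributionLink, variation) {
  const url = new URL(attributionLink);
  url.searchParams.set("af_sub1", variation);
  return url.toString();
}

// Illustrative usage: one link per variation, same campaign otherwise.
const controlLink = tagLinkWithVariation(
  "https://example.com/app?pid=somenetwork_int&c=summer_sale",
  "install now"
);
const variationLink = tagLinkWithVariation(
  "https://example.com/app?pid=somenetwork_int&c=summer_sale",
  "please install"
);
```

Setting the value through the URL API (rather than string concatenation) keeps special characters, such as the space in "install now", correctly encoded.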
Analysis:
- Download the install or in-app events raw data reports. Use your selected subscriber parameter column to filter and sort your data.
- Compare the performance of the creatives associated with each of the two stored values.
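The comparison above amounts to grouping raw-data rows by the subscriber parameter. A minimal sketch follows; the row shape is illustrative (real raw data reports have many more columns), and the function name is an assumption.

```javascript
// Count rows (e.g. installs) per variation value found in the
// chosen subscriber parameter column of a raw data report.
function countByVariation(rows, subParam = "af_sub1") {
  const counts = {};
  for (const row of rows) {
    const value = row[subParam] || "(none)";
    counts[value] = (counts[value] || 0) + 1;
  }
  return counts;
}

// Illustrative usage with parsed report rows:
const installs = [
  { af_sub1: "install now" },
  { af_sub1: "please install" },
  { af_sub1: "install now" },
];
const counts = countByVariation(installs);
```

The same grouping applied to an in-app events report lets you compare post-install quality, not just install counts, between the two variations.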
A/B testing ad placement on mobile web
The location of banners for your apps on mobile websites may significantly impact user conversion rates. In most cases, advertisers use their own mobile websites to convert visitors into mobile users.
With the AppsFlyer Smart Banners feature, you can set up mobile web banners quickly and easily. In addition, the feature also enables you to easily run A/B tests to determine the best placement for converting website visitors into mobile users.
Setup:
- Implement Smart Banners on your mobile website.
- For each new visitor, determine whether to show the banner at the top or bottom of the page. Make sure to split the traffic 50/50 between the two. Developer tip: a simple method for splitting your website traffic is to use a visitor counter, and then check whether the counter value is odd (option A) or even (option B) for each new visitor.
- Set the banner location and attribution link according to the determined location using the Smart Banners SDK API showBanner.
Example: showBanner({ bannerContainerQuery: "#document.body", additionalParams: { af_sub5: "top" } })
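The developer tip above can be sketched as follows. The counter storage key and container selectors are illustrative assumptions; showBanner is the Smart Banners API named above, passed in as a parameter here so the sketch stays self-contained.

```javascript
// Odd visitors get option A (top), even visitors get option B (bottom).
function choosePlacement(visitorNumber) {
  return visitorNumber % 2 === 1 ? "top" : "bottom";
}

// Illustrative wiring: increment a per-browser visitor counter, pick a
// placement, and pass it to the Smart Banners showBanner API, recording
// the placement in af_sub5 for later analysis.
function showPlacementTestBanner(showBanner) {
  const n = Number(localStorage.getItem("visitorCount") || "0") + 1;
  localStorage.setItem("visitorCount", String(n)); // storage key is an assumption
  const placement = choosePlacement(n);
  showBanner({
    bannerContainerQuery: placement === "top" ? "#top-container" : "#bottom-container",
    additionalParams: { af_sub5: placement },
  });
}
```

Because the placement is written into af_sub5, the raw data reports can later be filtered by that column to attribute each install to a placement.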
Analysis:
- Download the raw data reports for installs or in-app events. Use your selected subscriber parameter (e.g. af_sub5) column to filter and sort your data.
- Compare the performance of the two ad placements, that is, which placement drove more installs or post-install events.
A/B testing media sources
Helping advertisers identify the best (and worst) performing media sources for their apps is what AppsFlyer was built for. However, if advertisers fail to compare media sources correctly, they risk reaching the wrong conclusions about traffic quality.
Setup:
- Create a single ad creative for BOTH media sources.
- Define an ad for each media source, with exactly the same audience targeting parameters.
- You can send different amounts of traffic to the 2 media sources.
Analysis:
- If the ads being tested are the only ones running on both media sources, use AppsFlyer's dashboard and data to easily compare your key KPI between the two sources.
Otherwise:
- Go to AppsFlyer's Retention Report, Cohort Report, Master API, or Pivot Table.
- Group by Media Source -> Campaign -> Ad set -> Ad.
- Compare KPIs to identify the top-performing media source.
Example
- 1 and 2 are different ads on the same media source. Compare a single KPI for them, for example, the day 4 retention rate, to A/B test creatives, targeting, or ad placement.
- 2 and 3 are ads on different media sources. If the creatives and targeting of the ads are identical, comparing a single KPI enables A/B testing the media source.
Using audiences split for easy A/B testing of media sources
If you set the same targeting parameters for each media source, the test still includes each network's targeting abilities as a variable. When A/B testing, it's best to test a single variation at a time. As a result, it's advisable to eliminate the differences in targeting abilities when A/B testing other factors in the performance of networks.
To achieve this, we recommend splitting an audience randomly and targeting both audiences with the same creative, on each tested network. AppsFlyer offers an easy method for defining, splitting, and automatically distributing audiences between different media sources. Read about it here.