AppsFlyer A/B testing guide

What is A/B testing?

A/B testing is a method used by marketers to test different elements of their ads and campaigns. Each ad contains elements (targeting, visual, texts), which may affect the conversion rates of potential users. Finding the best performing variations of your ad elements may have a significant impact on the bottom line of your mobile marketing efforts.

This article explains how advertisers can perform A/B tests using AppsFlyer, to find the best performing ad elements.

How to A/B test mobile campaigns

To A/B test an element, split your mobile traffic 50/50 between two sources showing an identical offer. Next, create two variations of the element that you want to test. After sufficient traffic has flowed through, compare the results of your existing version (the control) to the new version (the variation).

The following steps apply to an A/B test you run using AppsFlyer.

Setup:

  1. Select the element to test.
    This article covers A/B testing creatives, ad placements, media sources, and campaigns.
  2. Create two identical campaigns (or ads), and then vary a single element within the ad. For example, the call-to-action button copy.
    • To test the impact of the variation on install conversion rates, split the traffic equally between the two campaigns. 
      If full impression data is available for both sources, divide installations by impressions to calculate the install conversion rates.
    • To test the impact of the variation on the quality of users, post-installation, you can send different amounts of traffic to the two campaigns.
  3. Drive sufficient traffic to each campaign to reach statistical significance, which means at least 50 installs per campaign. Use the A/B testing significance calculator.
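
The significance check in step 3 can be sketched with a standard two-proportion z-test. This is a minimal illustration of the underlying statistics, not the method the AppsFlyer significance calculator necessarily uses; the install and impression figures are made up.

```javascript
// Install conversion rate, as described above: installs divided by impressions.
function installConversionRate(installs, impressions) {
  return installs / impressions;
}

// z-score for the difference between two install conversion rates
// (pooled two-proportion z-test).
function twoProportionZ(installsA, impressionsA, installsB, impressionsB) {
  const pA = installsA / impressionsA;
  const pB = installsB / impressionsB;
  const pPool = (installsA + installsB) / (impressionsA + impressionsB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / impressionsA + 1 / impressionsB));
  return (pA - pB) / se;
}

// |z| > 1.96 corresponds to roughly 95% confidence (two-tailed).
function isSignificant(z, threshold = 1.96) {
  return Math.abs(z) > threshold;
}

// Illustrative numbers: 60 installs from 2,000 impressions (control)
// vs. 90 installs from 2,000 impressions (variation).
const z = twoProportionZ(60, 2000, 90, 2000);
console.log(z.toFixed(2), isSignificant(z)); // -2.50 true
```

With these sample numbers the variation wins with roughly 95% confidence; with smaller gaps or less traffic, the same rates may not reach significance.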

Analysis:

  1. Use AppsFlyer's dashboard and data to compare the performance of the key KPIs between the two campaigns.
  2. Select the winning variation and use it in your live campaigns.

A/B testing creatives and targeting

Creatives contain many elements with varying impact on user conversion rates, including copy, images, colors, fonts, buttons, etc. Targeting parameters, like country, age, or gender, may also have a major impact on conversion rates. A/B testing these elements can help you find which factors have the biggest impact on improving user conversion rates.

Below we explain two methods of A/B testing using:

  • A single media source
  • Multiple media sources
Method: A/B testing with a single media source
  Pros:
    • Quick setup
    • Easy to test install conversion rate
  Cons:
    • Results may only apply to users of the specific media source
    • May take longer to test

Method: A/B testing across multiple media sources
  Pros:
    • Fast results
    • Results apply to users from all media sources
  Cons:
    • Need to update attribution links in multiple places

Method 1: A/B testing with a single media source

Setup:

  1. Select a trusted media source with sufficient traffic.
  2. Create two campaigns in AppsFlyer by using one of the following methods:
    • Vary the campaign name, that is, the "c" parameter in the attribution link.
    • Alternatively, create two ad sets ("af_adset=") within a single campaign, or two ads ("af_ad=") within a single ad set.
  3. Use the same creative in both campaigns, except for the tested variation.
  4. Send the same amount of traffic to both campaigns.
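
The setup above boils down to two attribution links that are identical except for the campaign name. A minimal sketch, where the base link, app ID, and media source name are placeholders rather than real values:

```javascript
// Build an attribution link from a base URL plus query parameters.
function buildAttributionLink(baseUrl, params) {
  const url = new URL(baseUrl);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}

const base = "https://app.appsflyer.com/com.example.app"; // placeholder app ID
const shared = { pid: "examplenetwork_int" };             // placeholder media source

// The two links differ only in the "c" (campaign) parameter.
const controlLink = buildAttributionLink(base, { ...shared, c: "cta_test_control" });
const variationLink = buildAttributionLink(base, { ...shared, c: "cta_test_variation" });
console.log(controlLink);
console.log(variationLink);
```

Everything except the "c" value stays identical, so any KPI difference between the two campaigns can be attributed to the tested element.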

Analysis:

  • Compare KPIs (for example, install conversion rate, ROI, ARPU, loyal user rate) of the campaigns on AppsFlyer's dashboard overview page.

Method 2: A/B testing across multiple media sources

Setup:

  1. Select a subscriber parameter to hold the variation value.
    For example, to test two CTA text variations, you can use af_sub1="install now" or af_sub1="please install". More about subscriber parameters and a creative test example here.
  2. Add the subscriber parameter with the matching value to every live attribution link.
  3. We recommend using the same creatives in all campaigns, except for the tested variation.
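
Step 2 means tagging every live attribution link with the chosen subscriber parameter. A sketch, assuming af_sub1 as the parameter and illustrative link values:

```javascript
// Append (or overwrite) a subscriber parameter on an existing attribution link.
function tagLink(link, subParam, value) {
  const url = new URL(link);
  url.searchParams.set(subParam, value);
  return url.toString();
}

// Illustrative live links across two media sources.
const liveLinks = [
  "https://app.appsflyer.com/com.example.app?pid=network_a_int&c=summer",
  "https://app.appsflyer.com/com.example.app?pid=network_b_int&c=summer",
];

// Tag all links with variation A for this test flight.
const tagged = liveLinks.map((link) => tagLink(link, "af_sub1", "install now"));
tagged.forEach((link) => console.log(link));
```

The same helper is run again with the second value ("please install") on the links serving the other variation.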

Analysis:

  1. Download the installation or in-app events raw data reports. Use your selected subscriber parameter column to filter and sort your data.
  2. Compare the performance of the creatives carrying each of the two stored values.
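
The raw-data analysis amounts to grouping report rows by the subscriber-parameter column and counting. A sketch, where the rows stand in for parsed lines of an installs raw-data report:

```javascript
// Count rows (e.g. installs) per subscriber-parameter value.
function countByVariation(rows, subParam) {
  const counts = {};
  for (const row of rows) {
    const value = row[subParam] ?? "(none)";
    counts[value] = (counts[value] || 0) + 1;
  }
  return counts;
}

// Illustrative rows from an installs raw-data report.
const rows = [
  { af_sub1: "install now" },
  { af_sub1: "please install" },
  { af_sub1: "install now" },
];
console.log(countByVariation(rows, "af_sub1")); // { 'install now': 2, 'please install': 1 }
```

The same grouping works on in-app-events raw data to compare post-install quality between the two variations.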

A/B testing ad placement on mobile web

The location of banners for your apps on mobile web sites may significantly impact user conversion rates. In most cases, advertisers use their own mobile web sites to convert visitors into mobile users.

With AppsFlyer's Smart Banners feature, you can set up mobile web banners quickly and easily. In addition, the feature also enables you to easily run A/B tests to determine the best placement for converting web site visitors into mobile users.

Setup:

  1. Implement Smart Banners in your mobile web site.

  2. For each new visitor, determine whether to show the banner at the top or bottom of the page. Make sure to split the traffic 50/50 between the two.
    Developer tip: a simple method for splitting your web site traffic is to use a visitor counter, and then check whether the counter value is odd (option A) or even (option B) for each new visitor.

  3. Set the banner location and attribution link according to the determined location using the Smart Banners SDK API showBanner.
    Example:
    showBanner({ bannerContainerQuery: "#document.body", additionalParams:{ af_sub5: "top"}})
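
Steps 2 and 3 can be combined into one snippet. The odd/even counter implements the developer tip above; the showBanner call mirrors the article's example, while the surrounding counter logic is illustrative glue, not part of the SDK:

```javascript
// Visitor counter for a simple 50/50 split: odd → option A (top),
// even → option B (bottom).
let visitorCounter = 0;

function nextPlacement() {
  visitorCounter += 1;
  return visitorCounter % 2 === 1 ? "top" : "bottom";
}

// Called once per new visitor; showBanner is the Smart Banners SDK call,
// passed in here so the split logic stays testable.
function onNewVisitor(showBanner) {
  const placement = nextPlacement();
  showBanner({
    bannerContainerQuery: "#document.body", // as in the article's example
    additionalParams: { af_sub5: placement },
  });
  return placement;
}
```

Each visitor's placement is stored in af_sub5, so installs can later be attributed to the top or bottom placement in the raw-data reports.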

Analysis:

  1. Download the raw data reports for installs or in-app events. Use your selected subscriber parameter (e.g. af_sub5) column to filter and sort your data.
  2. Compare the performance of the two ad placements, that is, which placement drove more installs or post-install events.

A/B testing media sources

Helping advertisers identify the best (and worst) performing media sources for their apps is what AppsFlyer was built for. However, if advertisers fail to compare media sources correctly, they risk reaching the wrong conclusions about traffic quality.

Setup:

  1. Create a single ad creative for both media sources.
  2. Define an ad for each media source with the exact same audience targeting parameters.
  3. You can send different amounts of traffic to the two media sources.

Analysis:

  1. If the ads being tested are the only ones running on both media sources, use AppsFlyer's dashboard and data to easily compare your key KPI between the two sources.

    Otherwise:
  2. Go to AppsFlyer's Retention Report, Cohort Report, Master API or Pivot Table.
  3. Group by Media Source -> Campaign -> Ad set -> Ad.
  4. Compare KPIs to identify the top performing media source.

Example

  • 1 and 2 are different ads on the same media source. Compare a single KPI for them, for example, the day-4 retention rate, to A/B test creatives, targeting, or ad placement.
  • 2 and 3 are ads on different media sources. If the creatives and targeting of the ads are identical, comparing a single KPI enables A/B testing of the media sources.

Using audiences split for easy A/B testing of media sources

If you set the same targeting parameters for each media source, the test also includes the targeting abilities of the networks. When A/B testing, it's best to test a single variation at a time. It is therefore advisable to eliminate differences in targeting abilities when A/B testing other factors in network performance.

To achieve this, we recommend splitting an audience randomly and targeting both audiences with the same creative, on each tested network. AppsFlyer offers an easy method for defining, splitting and automatically distributing audiences between different media sources. Read about it here.
