Incrementality guide

At a glance: Measure the incremental lift generated by retargeting campaigns.

Related reading: Running an incrementality experiment | Audiences | The 2021 incrementality testing guide for marketers

What is Incrementality?

Probably the best way to understand incrementality analysis is through an example:

Say you recently ran a remarketing campaign. For the sake of simplicity, we'll say it was with only one ad network. The campaign's conversion rate was 5%.

Now you want to evaluate the success of the campaign, and you need to answer some important (and, of course, difficult) questions:

  • Is 5% a good conversion rate for this campaign?
  • How do we know that the campaign truly drove additional conversions?
  • Was the campaign worth the cost of running it?

In order to answer these questions accurately and without bias, we must consider more than just measured conversions. We must take into account the number of users who would have converted organically, even if they had never engaged with a remarketing campaign.

This difference between total conversions and those that would have happened anyway is known as incremental lift (or sometimes simply "lift"), and it is critical to determining the true value of a campaign. It can be quite difficult to measure, but that is precisely what AppsFlyer Incrementality is designed to do.
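As a sketch of the arithmetic (with made-up numbers, since the organic baseline must in practice come from a control group):

```python
def incremental_lift(test_conversions, test_users, baseline_rate):
    """Conversions beyond what the audience would have produced organically."""
    expected_organic = test_users * baseline_rate
    return test_conversions - expected_organic

# 10,000 targeted users converted at 5% (500 conversions); suppose 3% of them
# would have converted organically anyway (300 conversions).
lift = incremental_lift(test_conversions=500, test_users=10_000, baseline_rate=0.03)
print(round(lift))  # → 200 incremental conversions
```

Only those 200 conversions, not the full 500, represent value the campaign actually created.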

Incrementality experiments

Incrementality overcomes the challenge of measuring incremental lift by allowing you to create scientifically valid experiments to compare the behavior of similar groups. These groups vary only in that some groups (the "test groups") are exposed to your campaign, and one group (the "control group") is not.

Learn more about creating and running Incrementality experiments

Calculating lift 

AppsFlyer Incrementality employs 2 alternative methods for calculating lift:

  • Intent-to-treat
  • Reach-based

While reach-based methodology is preferable for purposes of evaluating campaign performance, its calculation requires the measurement of data points not provided by all ad networks. For these networks, intent-to-treat methodology is used instead.


Prior to December 16, 2021, lift was calculated using solely the intent-to-treat method. However, data for earlier dates has been updated to utilize reach-based lift where applicable. Therefore, you can review historical experiments to analyze their results in light of the new methodology. 

ITT-based lift

Intent-to-treat (ITT) experimental methodology has its foundations in medical science. Using ITT, researchers randomly split a given population between a control group who are not treated and a test group whom researchers intend to treat. Researchers measure the efficacy of a given treatment by comparing the results of the groups. Whether or not a test group member actually received treatment is not taken into account.

Similarly, in remarketing, the lift metric measures campaign efficacy. It doesn't matter if a user in the test group actually engaged with a campaign (received treatment) because it is the intent that is relevant in calculating incrementality. In other words, results are calculated as if every member of the test group actually engaged with the campaign.
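A minimal sketch of the ITT calculation, using hypothetical group sizes and conversion counts; note that the denominator is the entire test group, reached or not:

```python
def itt_lift(test_conversions, test_size, control_conversions, control_size):
    """Intent-to-treat lift: every test group member counts, whether or not
    they were actually served an ad."""
    test_cvr = test_conversions / test_size
    control_cvr = control_conversions / control_size
    return (test_cvr - control_cvr) / control_cvr

# Hypothetical experiment: 10,000-user test group, 2,000-user control group.
lift = itt_lift(500, 10_000, 80, 2_000)
print(f"{lift:.0%}")  # → 25%
```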

When is it used?

AppsFlyer Incrementality uses ITT-based lift calculation for SRNs (self-reporting networks).

Although it is not the preferred method for measuring lift in our context, ITT remains a scientifically sound experimental method, and it provides valid results when reach-based calculation is not possible.

A caution icon is displayed on the Incrementality dashboard to indicate networks for which ITT methodology is used in calculating the displayed results.

Interpreting results

The underlying assumption that all members of the test group were actually reached by a campaign often results in low conversion rates and, as a consequence, low incremental lift.


Reach-based lift

As its name implies, reach-based lift calculation takes into account whether or not a member of the test group was actually reached (served a campaign ad) prior to converting. In addition to requiring additional data, reach-based lift calculation is also more complex in that it requires certain mathematical adjustments to the control group in order to be statistically valid. These adjustments are cumulatively known as the creation of a "counterfactual" control group.
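The core idea can be sketched as follows. This simplified version restricts the test side to reached users but compares against the raw control group; the counterfactual adjustments AppsFlyer applies to the control group are not detailed here and are omitted. All counts are hypothetical:

```python
def reach_based_lift(reached_conversions, reached_users,
                     control_conversions, control_size):
    """Simplified reach-based lift: only test users actually served an ad
    count. NOTE: a statistically valid version adjusts the control group
    into a 'counterfactual' control; that adjustment is omitted here."""
    reached_cvr = reached_conversions / reached_users
    control_cvr = control_conversions / control_size
    return (reached_cvr - control_cvr) / control_cvr

# Of a 10,000-user test group, only 4,000 were reached, and 450 of the
# conversions came from reached users; the 2,000-user control converted at 4%.
lift = reach_based_lift(450, 4_000, 80, 2_000)
print(f"{lift:.1%}")
```

Because the denominator shrinks to reached users only, the same experiment shows a far larger lift than the ITT view would, which is the amplification effect described under "Interpreting results" below.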

When is it used?

AppsFlyer Incrementality uses reach-based lift calculation for non-SRNs (attribution-link networks).

While it is the preferred method for calculating lift, reach-based calculation requires that ad networks report the following data to AppsFlyer:

  • Impression data
  • Campaign ID (from the attribution link)

If this data is not reported, incrementality measurement is not available (and displays as N/A on the Incrementality dashboard).

Interpreting results

Since reach-based lift calculation takes into account only test group members that were actually reached by a campaign, networks that are measured using reach-based incrementality will usually show amplified results (positive lift will become much higher, and negative lift will become much lower).


Best practices

Because the 2 different lift calculation methodologies can yield significantly different results, the following best practices are recommended in order to make interpretation more meaningful:

  • When creating your experiments, select audiences that are split only to attribution-link networks.
  • In your initial experiments, consider using an audience split only between a single network and a control group. This makes it less likely that comparisons will be made among 2 or more networks for which different lift calculations are used.

Statistical significance (p-value)

When evaluating the validity of incrementality experiments, sound scientific methodology requires that we consider statistical significance (the level of certainty that the lift result isn't due to random events or chance). 

The science behind significance

Statistical significance is measured using a p-value: the probability of observing a lift result at least as extreme as the one measured if the campaign actually had no effect. An experiment with a p-value of 0.1 or less is considered statistically significant, indicating less than a 10% probability that the results are due to chance.
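As an illustration, a standard two-proportion z-test can produce a p-value for a conversion-rate gap. This is a generic statistical test, not necessarily the exact calculation AppsFlyer uses, and the counts are hypothetical:

```python
import math

def lift_p_value(test_conv, test_n, control_conv, control_n):
    """Two-sided two-proportion z-test: the probability of a CVR gap at
    least this large arising by chance if the campaign had no effect."""
    p1, p2 = test_conv / test_n, control_conv / control_n
    pooled = (test_conv + control_conv) / (test_n + control_n)
    se = math.sqrt(pooled * (1 - pooled) * (1 / test_n + 1 / control_n))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

p = lift_p_value(500, 10_000, 80, 2_000)
print(p < 0.1)  # → True: significant at the 0.1 threshold
```

Note how the test rewards scale: with the same conversion rates but ten times the users, the p-value shrinks dramatically, which is why larger test populations yield higher significance.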

The Incrementality dashboard

To access the Incrementality dashboard:

  1. In AppsFlyer, go to Dashboard > Incrementality.
  2. Set filters to display the required metrics.
  3. Use display controls to:
    • Select different metrics
    • Change how results display, as described in the component tables that follow


Component Description
Audience Audience tested by the experiment (from AppsFlyer Audiences). The same audience can be used to target more than one app.
First seen dates Date on which a user first matched the audience rules and was then added to the audience
Targeted app The app advertised by the remarketing campaign
Media sources Ad networks displaying ads to test group members


Metrics are calculated according to active filters.

Component Description
Selected metric See the metric definitions below.
Metric Description
Conversion rate
  • Test: Converting users/Test group users
  • Control: Converting users/Control group users
  • Lift: (Test CVR–Control CVR)/Control CVR
  • Difference: Test–Control
Revenue per user for a given event
  • Test: Revenue of converted test users/Converting test users
  • Control: Revenue of converted control users/Converting control users
  • Lift: (Test revenue per user–Control revenue per user)/Control revenue per user
  • Difference: Test–Control
  • Media sources: Per-media-source metrics are calculated similarly
Events per user for a given event
  • Test: Number of events of test group users/Test group users = (a)
  • Control: Number of events of control group users/Control group users = (b)
  • Lift: (a–b)/b
  • Difference: a–b
Group size
  • Test: Test group users
  • Control: Control group users
  • Targeted media sources:
    • Distribution of users in %
    • Allocation displays on hover
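The lift and difference columns above follow the same pattern for every metric; a small helper makes that explicit (the example values are hypothetical):

```python
def metric_stats(test_value, control_value):
    """Lift and difference as defined in the dashboard metrics table.
    Works for any per-group metric: CVR, revenue per user, events per user."""
    return {
        "lift": (test_value - control_value) / control_value,
        "difference": test_value - control_value,
    }

# Events per user: test group averaged 1.2 events/user, control averaged 1.0.
stats = metric_stats(1.2, 1.0)
print(f"lift {stats['lift']:.0%}, difference {stats['difference']:.1f}")
```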


Component Description
Statistical significance (p-value)
  • Measures the probability that the experiment result (lift) was random.
  • AppsFlyer considers a p-value of 0.1 or less to be significant.
  • In general, the larger the test population, the higher the statistical significance.


Component Description
Trend display (in chart/table format) Change the chart/table display with these controls:
  • Group by
  • View type: Chart or Table
  • Calculated as: Cumulative or On day
  • Export: Download the table to a CSV file

Raw data reports

Use Incrementality raw data reports to analyze the interaction of users with remarketing campaigns:

  • Contents: Row-level data of users included in Incrementality experiments. 
  • Availability:
    • Via Data Locker
    • Data freshness: daily 19:00-23:00 UTC

Download Incrementality raw data example files

Available reports

Category Report name
First-seen users
  • First-seen users
In-app events
  • Organic in-app events
  • In-app events non-organic
  • In-app events re-attributions
Sessions
  • Organic sessions
  • Sessions non-organic
  • Sessions re-attributions
Uninstalls
  • (not currently populated)

Report logic

The basic logic underlying report structure is as follows:

  • Users participating in the experiment—First-seen report
    • In Audiences, rules are set characterizing users to include in the experiment.
    • When a given user is identified as matching the rules, the event is recorded in the first_seen report. 
      • Users are allocated randomly to a test or control group indicated by the is_control_group field.
      • Users in the test group are allocated to a media source (pid_destination) for retargeting.
  • User engagement within the app: User engagement with the app during the experiment is recorded in context-specific reports: 
    • Engagement type: Session or in-app event
    • User attribution status when the user is first seen: organic, non-organic, or re-attribution. For example, if a user previously installed the app and was attributed as organic, their attribution status during the experiment is organic.
  • Uninstalls: Users uninstalling the app during the experiment. Uninstall measurement must be active. 
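The steps above can be sketched against a first_seen report. The rows below are invented, but the is_control_group, pid_destination, and audience_id fields match the report fields documented in the next section:

```python
import csv
import io

# Hypothetical first_seen rows; real reports are delivered via Data Locker.
raw = """is_control_group,pid_destination,audience_id
false,networkA,aud_1
true,,aud_1
false,networkB,aud_1
"""

test_group, control_group = [], []
for row in csv.DictReader(io.StringIO(raw)):
    # is_control_group splits users into groups; pid_destination holds the
    # media source allocation and is populated for test group users only.
    if row["is_control_group"] == "true":
        control_group.append(row)
    else:
        test_group.append(row)

print(len(test_group), len(control_group))  # → 2 1
```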

Data characteristics and fields

Field availability varies according to report type as indicated.

Fields unique to Incrementality (per report type)
Field Description First seen In-app events Sessions Uninstalls
is_control_group If true, the user is part of the control group Y Y Y Y
pid_destination The media source the user is sent to  Y Y Y Y
audience_id Unique identifier Y Y Y Y
joined_audience_date Date user first joined the audience Y Y Y Y
audience_name Audience name (not unique)  Y Y Y Y
tm Hour of day   Y Y Y
timestamp Event time stamp YYYY-MM-DD HH:MM   Y Y  
app_ids App ids associated with the audience rules Y      


Other Incrementality report fields (per report type)
Field Display name* First seen In-app events Sessions Uninstalls
advertising_id Advertising ID (GAID) Y Y Y Y
android_id Android ID   Y Y  
app_id App ID Y Y Y Y
app_name App name   Y Y  
app_version App version   Y Y  
appsflyer_id Appsflyer ID   Y Y Y
revenue_alt App-specific currency   Y    
bundle_id Bundle ID   Y Y  
country Country code   Y Y Y
currency Currency code   Y Y Y
customer_user_id Customer user ID   Y Y  
brand Device brand   Y Y  
device_category Device category   Y Y  
model Device model   Y Y  
device_model Device model   Y Y  
device_type Device type   Y Y  
event_name Event name   Y Y Y
event_revenue Event revenue   Y    
event_revenue_currency Event revenue currency   Y    
event_revenue_usd Event revenue USD   Y    
event_time Event time   Y Y Y
event_value Event value   Y Y  
idfa IDFA Y Y Y Y
idfv IDFV   Y Y  
imei IMEI   Y Y  
is_purchase_validated Is receipt validated   Y    
os_version OS version   Y Y  
platform Platform   Y Y Y
sdk_version SDK version   Y Y  
* According to raw data specification

Traits and limitations

Trait Remarks 
Ad network access Not available
Agency access Not available
Agency transparency N/A
Time zone UTC
Currency  USD
Organic data Yes
Non-organic data Yes
Data freshness

Dashboard: daily at 18:00 UTC for the previous day

Raw data reports in Data Locker: daily at 19:00-23:00 UTC for the previous day

Historical data


Team member access Yes (according to account permissions)