[Closed beta] Side-by-side view of SKAN and AppsFlyer campaign metrics

At a glance: Display SKAdNetwork (SKAN) and AppsFlyer attribution metrics side-by-side in the Overview V2.0 dashboard. Identify the framework providing the metric that best reflects campaign performance, then use it to gain insights and optimize campaigns. 


Attribution metrics display side-by-side in the Overview V2.0 dashboard


Finding the best iOS 14+ campaign performance metrics

Starting with iOS 14.5, campaign performance metrics are generated by both the SKAN and AppsFlyer attribution frameworks. Each framework has its own traits and limitations, which impact the total reported metrics and cause discrepancies. For example, SKAN doesn't contain metrics of Apple Search Ads campaigns. Conversely, AppsFlyer metrics from SRNs don't include installs of non-consenting users. Furthermore, SKAN measures performance during an initial post-install window of 1-3 days, whereas AppsFlyer measures lifetime value (LTV). 

In the side-by-side view, you see campaign performance data provided by both frameworks. Compare the metrics of each framework and decide which best represents actual campaign performance. 

What campaign insights can you find in the side-by-side view? 

  • Precision:
    • How many actual installs can be attributed to a campaign?
    • Why do the SKAN and AppsFlyer metrics differ? 
  • Effectiveness: How many users converted, and what is the ROI?
  • Efficiency: What is the campaign eCPI?

This article contains guidance, based on our market knowledge and observations, on how to interpret the data and gain actionable insights from it. Consider that the marketing landscape of each app is different; validate our guidance in the context of your app. 


  • Install:
    • AppsFlyer: New installs and re-attributions
    • SKAN:  New installs and re-downloads
  • Actual: Actual means the metric that best represents the data, even though it may not include all data. 
  • Consenting user: An iOS 14+ user who grants ATT authorization in both the publisher and advertised app. 

Campaign precision

Question: How many actual installs can be attributed to a campaign? 

Install metrics differ between the frameworks. Each framework counts some of the installs, and there can be overlaps and discrepancies between the frameworks.

  • Compare the install metrics of the frameworks at the campaign level. 
  • The framework reporting the most installs is regarded as having the most precise metric, even though this metric may not include 100% of attributions.


In the table, the framework bringing the most installs is highlighted. This is the actual number of installs brought by the campaign.


You can automate the process of establishing which metric is higher by importing the table into a spreadsheet. Use the MAX function and conditional formatting to identify the framework reporting the most installs.
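If you'd rather script the comparison than use a spreadsheet, a minimal Python sketch looks like this (the campaign names and install counts are hypothetical; real figures would come from a dashboard export):

```python
def actual_installs(installs):
    """Return (framework, count) for the framework reporting the most installs."""
    framework = max(installs, key=installs.get)
    return framework, installs[framework]

# Hypothetical export rows: campaign -> install count per framework
campaigns = {
    "campaign_a": {"appsflyer": 3100, "skan": 4095},
    "campaign_b": {"appsflyer": 2318, "skan": 1990},
}

for name, installs in campaigns.items():
    framework, count = actual_installs(installs)
    print(f"{name}: actual installs = {count} (from {framework})")
```

The framework returned for each campaign is the one regarded as most precise, per the guidance above.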


Why do the SKAN and AppsFlyer metrics differ? 

Symptom: Install metrics differ.

The AppsFlyer framework attributes installs according to the AppsFlyer attribution model, which covers different campaign types and attribution methods. SKAN, in contrast, measures installs only in the context of paid app-to-app campaigns. 

Typically, SKAN metrics of SRN campaign installs are higher than those of AppsFlyer, because AppsFlyer only records consenting users.

In the case of non-SRN networks (click networks), there is often a much closer alignment between the metrics per framework.

Campaign effectiveness


How many users converted, and what is the ROI?

  • Determining revenue and conversion rate metrics requires that user performance be measured over time. 
  • SKAN limits measurement to 12-72 hours post-install, whereas AppsFlyer measurement is unlimited (LTV).
  • AppsFlyer attribution contains metrics of consenting users. 

Consider whether the AppsFlyer attribution metrics accurately represent the entire user population (including users reported via SKAN). If so, use them to extrapolate campaign effectiveness metrics like ROI and conversion rate metrics.

To determine the number of users that converted: 

  1. Use the AppsFlyer conversion rate metric (per event). 
  2. Multiply the conversion rate by the actual number of users derived in the campaign precision section. 
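The two steps above can be sketched in Python (the conversion rate and install count are hypothetical figures):

```python
def converted_users(conversion_rate, actual_installs):
    """Estimate converted users: the AppsFlyer conversion rate (per event)
    applied to the actual install count from the campaign precision step."""
    return round(conversion_rate * actual_installs)

# Hypothetical figures: a 12% conversion rate over 4095 actual installs
print(converted_users(0.12, 4095))
```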

To calculate campaign revenue:

  1. Derive the actual number of users (a) as described in the campaign precision section.
  2. Use the ARPU (b) metric provided by AppsFlyer attribution. 
  3. Campaign revenue = a * b.
  4. To derive ROI, subtract the campaign cost from the campaign revenue. 

Calculating ROI


Campaign revenue calculation:

  • Campaign a: 4095 * 0.12 = $491.40
  • Campaign b: 2318 * 0.05 = $115.90
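The worked example above can be reproduced with a short Python sketch (the campaign cost used for ROI is a hypothetical figure; the install and ARPU values are taken from the example):

```python
def campaign_revenue(actual_installs, arpu):
    # Campaign revenue = actual installs (a) * ARPU (b)
    return actual_installs * arpu

def campaign_roi(revenue, cost):
    # Per the steps above: subtract the campaign cost from the campaign revenue.
    return revenue - cost

revenue_a = campaign_revenue(4095, 0.12)  # campaign a from the example
revenue_b = campaign_revenue(2318, 0.05)  # campaign b from the example
print(f"campaign a revenue: ${revenue_a:.2f}")
print(f"campaign b revenue: ${revenue_b:.2f}")
print(f"campaign a ROI at $300 cost: ${campaign_roi(revenue_a, 300.0):.2f}")
```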

Use your BI product analysis tool to compare the performance of all consenting users to the entire app user population. If the results differ, adjust the calculation to account for this.

Campaign efficiency

Question: What is the actual campaign eCPI?
Symptom: Campaign cost-related metrics like eCPI can be misleading because they depend on having a precise number of installs.
  1. Derive the actual number of users (a) as described in the campaign precision section.
  2. Calculate eCPI using the following formula: eCPI = cost / (a)
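The formula above can be sketched in Python (the cost and install figures are hypothetical):

```python
def ecpi(cost, actual_installs):
    """eCPI = campaign cost / actual number of installs (a)."""
    if actual_installs == 0:
        raise ValueError("no installs recorded for this campaign")
    return cost / actual_installs

# Hypothetical figures: $819 spend over 4095 actual installs
print(f"eCPI: ${ecpi(819.0, 4095):.2f}")
```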


  • Actual installs are determined using the framework reporting the most installs.
  • eCPI is calculated using the formula: cost / number of installs. 

Using side-by-side


  • Contains installs reported by both the SKAN and AppsFlyer frameworks
  • Use the unified view (in Overview V2.0) to ensure that all available data in the context of a campaign displays and isn't hidden inadvertently by filtering. 
  • SKAN data is the same as that available via the SKAN dashboard. 
  • To reduce discrepancies due to timing differences between the frameworks, set a date range as follows:
    • Duration: Minimum 7 days
    • Most recent date: 3–5 or 8 days before the current date. The range depends on your activity window and whether you run campaigns on Google Ads. 
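The date-range guidance above can be sketched as a small Python helper. The specific offsets (8 days back when running Google Ads campaigns, otherwise 5) are assumptions drawn from the ranges given; tune them to your own activity window:

```python
from datetime import date, timedelta

def side_by_side_date_range(today, runs_google_ads=False, days=7):
    """Return a (start, end) date range for the side-by-side view.

    end: 8 days before today when Google Ads campaigns run (to cover their
    reporting delay), otherwise 5 days before today (assumed offsets).
    start: chosen so the range spans at least `days` days (minimum 7).
    """
    offset = 8 if runs_google_ads else 5
    end = today - timedelta(days=offset)
    start = end - timedelta(days=days - 1)
    return start, end

print(side_by_side_date_range(date(2024, 5, 20), runs_google_ads=True))
```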

Time-related differences

  • Historical data:
    • AppsFlyer: Available; no limitation
    • SKAN: Available; no limitation
  • Time zone:
    • AppsFlyer: App-specific time zone
    • SKAN: UTC
  • Data freshness:
    • AppsFlyer: Updates continuously
    • SKAN: Daily. Dashboard updates daily by 08:00 UTC.
  • Install date:
    • AppsFlyer: Date the install occurred
    • SKAN: Derived from the postback arrival time. (1)
  • Measurement duration:
    • AppsFlyer: Unlimited; LTV-based
    • SKAN: Limited to the duration of the activity window of 12-72 hours. Default: 24 hours.
  • Attribution lookback window:
    • AppsFlyer: Determined by advertiser or SRN. Click-to-open: 7 days.
    • SKAN: Determined by Apple. Click-to-download: 30 days; download-to-open: 60 days.


(1) Google Ads data updates with a delay of up to 8 days.

Cost, clicks, and impressions

Ad networks provide cost, click, and impression metrics to AppsFlyer; these metrics are common to both frameworks and aren't segregated by framework. 

In the case of cost metrics, the following apply:

  • Updates 6 times a day, on average every 4 hours. 
  • AppsFlyer records the first engagement (click or impression) associated with a campaign; thereafter, the campaign cost associated with the engagement displays. 
  • Because SKAN installs are reported 72–120 hours after the first app open, expect to see cost metrics some days before they are associated with installs.
  • For installs from Google Ads, the delay can be up to 8 days. 

Data scope differences

  • Platform:
    • AppsFlyer: All
    • SKAN: iOS
  • Attribution types:
    • AppsFlyer: Installs, re-engagements, re-attributions
    • SKAN: Installs, re-downloads
  • Data freshness rate:
    • AppsFlyer: Up to 70 minutes
    • SKAN: Depends on the duration of the activity window. Default: 2-5 days; for Google Ads campaigns, up to 8 days.
  • Ad revenue:
    • AppsFlyer: Available
    • SKAN: Limited; depends on the mediation partner.
  • Apple Search Ads:
    • AppsFlyer: Available
    • SKAN: Not applicable to SKAN attribution
  • Owned media:
    • AppsFlyer: Available
    • SKAN: Not applicable to SKAN attribution
  • Web-to-app:
    • AppsFlyer: Available
    • SKAN: Not applicable to SKAN attribution
  • Currency:
    • AppsFlyer: App-specific currency
    • SKAN: USD

Matching considerations

  • Agency transparency:
    • AppsFlyer: Supported
    • SKAN: Attributed to the media source bringing the install. Agencies can instruct media sources to prefix campaign names with their name, in which case you can identify agency-driven traffic.
  • Campaign name changes:
    • AppsFlyer: Supported. The most recent name reported by the ad network determines the campaign name used for all installs associated with a given campaign ID, including previously recorded installs.
    • SKAN: Not supported. The campaign name reported when the user installs the app is used and doesn't change over time. Consequently, multiple names can be associated with a given campaign ID.
  • Dimensions supported:
    • AppsFlyer: No limitation
    • SKAN: Available: geo, ad, ad ID, channel, adset, site ID. Agency: not currently. If you select a dimension that isn't supported, SKAN data displays as none.
  • SRN install metrics:
    • AppsFlyer: Limited to consenting users; as a result, expect AppsFlyer-reported metrics to be lower than both SKAN-reported metrics and the metrics provided by the SRN itself.
    • SKAN: All SKAN-attributed installs. Usually higher than the AppsFlyer metric.
Hidden SKAN campaigns

The table in the dashboard contains a limited number of rows. The remaining rows are combined in the row designated others.

To view a breakdown of this row, export the entire table to a spreadsheet. 

