Data Locker—cloud service setup

At a glance: Set up your cloud service to receive data from Data Locker: GCS, AWS, Azure, Yandex Cloud, BigQuery, or Snowflake.

Data Locker enables you to stream data to a storage solution that you select and own, whether a cloud storage bucket or a data warehouse. Set up your cloud service using one of the following procedures.

Bucket cloud storage

GCS storage

  • The procedure in this section needs to be performed by your Google Cloud admin.
  • You can delete files from Data Locker 25 or more hours after they were written. Don't delete them earlier.

Information for the GCS admin

Data Locker is the AppsFlyer solution for streaming data to storage.

Requirements

  • Create a bucket on GCS for the exclusive use of Data Locker. Exclusive means no other service writes data to the bucket. 
  • Suggested bucket name: af-datalocker.
  • Grant Data Locker permissions using the procedure that follows.

To grant Data Locker permissions:

In this procedure, replace data-locker-example with the name of the bucket you previously created for Data Locker.

  1. Sign in to your GCS console.
  2. Go to Storage > Storage browser.

  3. Select the bucket you previously created, for example, data-locker-example.
  4. Go to the Permissions tab. 
  5. Click +Add.
    The Add members window opens.
  6. Complete as follows:
    1. In New members, paste the following service account:
      af-data-delivery@af-raw-data.iam.gserviceaccount.com
    2. In Select a role, choose Cloud Storage > Storage Object Admin.

  7. Click Save.
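Granting Storage Object Admin in the console is equivalent to adding an IAM policy binding on the bucket. A minimal sketch of that binding (the member and role values come from the steps above; roles/storage.objectAdmin is the standard role ID for Storage Object Admin, and the helper name is ours):

```python
# Sketch of the IAM policy binding that the console steps above create.
# The member and role come from the procedure; the function name is illustrative.
DATA_LOCKER_MEMBER = (
    "serviceAccount:af-data-delivery@af-raw-data.iam.gserviceaccount.com"
)
DATA_LOCKER_ROLE = "roles/storage.objectAdmin"  # Storage Object Admin

def data_locker_binding():
    """Return the policy binding granted to Data Locker on the bucket."""
    return {"role": DATA_LOCKER_ROLE, "members": [DATA_LOCKER_MEMBER]}
```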

AWS storage

  • The procedure in this section needs to be performed by your AWS admin.
  • You can delete files from Data Locker 25 or more hours after they were written. Don't delete them earlier.

Information for the AWS Admin

Data Locker is the AppsFlyer solution for streaming data to storage.

Requirements

  • Create an AWS S3 bucket named af-datalocker-mybucket. The prefix af-datalocker- is mandatory; the suffix is free text.
  • We suggest the format af-datalocker-yyyy-mm-dd-hh-mm-free-text, where yyyy-mm-dd-hh-mm is the current date and time and free-text is any other text you want, as depicted in the figure that follows.
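The suggested naming convention can be generated programmatically. A small sketch (the function name and free_text parameter are ours; the prefix and timestamp format come from the requirement above):

```python
from datetime import datetime, timezone

def suggest_bucket_name(free_text, now=None):
    """Build a bucket name in the suggested af-datalocker-yyyy-mm-dd-hh-mm-free-text format."""
    now = now or datetime.now(timezone.utc)
    stamp = now.strftime("%Y-%m-%d-%H-%M")  # current date and time
    return f"af-datalocker-{stamp}-{free_text}"
```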

User interface in AWS console

After creating the bucket, grant AppsFlyer permissions using the procedure that follows. 

To create a bucket and grant AppsFlyer permissions: 

  1. Sign in to the AWS console.
  2. Go to the S3 service.
  3. To create the bucket:
    1. Click Create bucket.
    2. Complete the Bucket name as follows: Start with af-datalocker- and then add any other text as described previously.
    3. Specify one of the supported AWS regions.
    4. Click Create bucket.
  4. To grant AppsFlyer permissions:
    1. Select the bucket. 
    2. Go to the Permissions tab. 
    3. In the Bucket policy section, click Edit. 
      The Bucket policy window opens.
    4. Paste the following snippet into the window.
      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Sid": "AF_DataLocker_Direct",
            "Effect": "Allow",
            "Principal": {
              "AWS": "arn:aws:iam::195229424603:user/product=datalocker__envtype=prod__ns=default"
            },
            "Action": [
              "s3:GetObject",
              "s3:ListBucket",
              "s3:DeleteObject",
              "s3:PutObject"
            ],
            "Resource": [
              "arn:aws:s3:::af-datalocker-my-bucket",
              "arn:aws:s3:::af-datalocker-my-bucket/*"
            ]
          }
        ]
      }
      
  5. In the snippet, replace af-datalocker-my-bucket with the name of the bucket you created.

  6. [Optional] Add support for KMS-encrypted buckets. To do so, in the Key Policy section of your KMS key, choose Switch to policy view and paste the following snippet into the Statement array.
    {
      "Sid": "Allow use of the key",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::195229424603:user/product=datalocker__envtype=prod__ns=default"
      },
      "Action": "kms:GenerateDataKey*",
      "Resource": "*"
    }
    
  7. Click Save changes.
  8. Complete the Setup Data Locker procedure.
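If you script this setup, the bucket policy and the optional KMS statement from the steps above can be rendered programmatically. A minimal sketch (function names are ours; the principal ARN, Sid values, and actions are taken from the snippets above):

```python
import json

# Data Locker's IAM user, as given in the policy snippets above.
DATA_LOCKER_PRINCIPAL = (
    "arn:aws:iam::195229424603:user/product=datalocker__envtype=prod__ns=default"
)

def bucket_policy(bucket_name):
    """Render the S3 bucket policy from the procedure for the given bucket."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AF_DataLocker_Direct",
            "Effect": "Allow",
            "Principal": {"AWS": DATA_LOCKER_PRINCIPAL},
            "Action": [
                "s3:GetObject", "s3:ListBucket",
                "s3:DeleteObject", "s3:PutObject",
            ],
            "Resource": [
                f"arn:aws:s3:::{bucket_name}",
                f"arn:aws:s3:::{bucket_name}/*",
            ],
        }],
    }
    return json.dumps(policy, indent=2)

def with_kms_access(key_policy):
    """Return a copy of a KMS key policy with the optional Data Locker statement appended."""
    statement = {
        "Sid": "Allow use of the key",
        "Effect": "Allow",
        "Principal": {"AWS": DATA_LOCKER_PRINCIPAL},
        "Action": "kms:GenerateDataKey*",
        "Resource": "*",
    }
    updated = dict(key_policy)
    updated["Statement"] = list(key_policy.get("Statement", [])) + [statement]
    return updated
```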

Supported AWS regions

Data Locker supports the following AWS regions:

  • Asia Pacific (Tokyo): ap-northeast-1
  • Asia Pacific (Seoul): ap-northeast-2
  • Asia Pacific (Mumbai): ap-south-1
  • Asia Pacific (Singapore): ap-southeast-1
  • Asia Pacific (Sydney): ap-southeast-2
  • Canada (Central): ca-central-1
  • EU (Frankfurt): eu-central-1
  • EU (Ireland): eu-west-1
  • EU (London): eu-west-2
  • South America (Sao Paulo): sa-east-1
  • US East (N. Virginia): us-east-1
  • US East (Ohio): us-east-2
  • US West (N. California): us-west-1
  • US West (Oregon): us-west-2
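When scripting bucket creation, you can guard against unsupported regions up front. A small sketch built from the list above (the set and function names are ours):

```python
# Regions supported by Data Locker, per the list above.
SUPPORTED_AWS_REGIONS = {
    "ap-northeast-1", "ap-northeast-2", "ap-south-1", "ap-southeast-1",
    "ap-southeast-2", "ca-central-1", "eu-central-1", "eu-west-1",
    "eu-west-2", "sa-east-1", "us-east-1", "us-east-2", "us-west-1",
    "us-west-2",
}

def is_supported_region(region):
    """Check whether an AWS region code is supported by Data Locker."""
    return region in SUPPORTED_AWS_REGIONS
```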

[Beta] Azure storage

  • The procedure in this section needs to be performed by your Azure Cloud admin.
  • You can delete files from Data Locker 25 or more hours after they were written. Don't delete them earlier.

Information for the Azure admin

Data Locker is the AppsFlyer solution for streaming data to your storage account.

To define a storage account for Data Locker:

  1. In your Azure portal, go to Storage accounts and click + Create to create a new storage account to receive your AppsFlyer data.
  2. In the Basics tab, under Project details, select from the dropdowns:
    1. A Subscription. 
    2. A Resource Group.
      Optional: If no resource group exists, click Create new to create a resource group.
  3. Under Instance details:
    1. Enter a Storage account name.
    2. Select a Region from the dropdown.
  4. Click Next: Advanced >.
  5. Under Security, select Enable hierarchical namespace.
  6. Click Review > Create.
  7. After deployment has finished, go back to your Azure storage accounts, and select the newly created storage account.
  8. Go to Access keys and copy your Storage account name and one of your keys.
  9. Go to Storage browser > Blob containers and click + Add container.
    1. Enter a Name for the new container.
    2. Click Create.
  10. Contact your CSM to enable Azure in Data Locker.
  11. Once enabled, in AppsFlyer, go to Data Locker > + New connection.
  12. In the new connection:
    1. Name your connection.
    2. Choose Azure Blob.
    3. In the Bucket Name field, enter the name of the Azure container you created in step 9.
    4. Enter the Storage account name you copied in step 8.
    5. Enter the Key you copied in step 8.
    6. Click Test connection.
    7. Click Save.
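When choosing the container name in step 9, it must satisfy Azure's blob container naming rules. A small validator sketch, assuming the standard Azure rules (3-63 characters; lowercase letters, digits, and hyphens; starts and ends with a letter or digit; no consecutive hyphens); the function name is ours:

```python
import re

def is_valid_container_name(name):
    """Check a name against Azure blob container naming rules:
    3-63 chars; lowercase letters, digits, hyphens; must start and end
    with a letter or digit; no consecutive hyphens."""
    if not 3 <= len(name) <= 63:
        return False
    if "--" in name:
        return False
    return re.fullmatch(r"[a-z0-9](?:[a-z0-9-]*[a-z0-9])?", name) is not None
```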

[Beta] Yandex Cloud storage

  • The procedure in this section needs to be performed by your Yandex Cloud admin.
  • You can delete files from Data Locker 25 or more hours after they were written. Don't delete them earlier.

Information for the Yandex admin

Data Locker is the AppsFlyer solution for streaming data to storage.

To create a bucket and grant Data Locker permissions:

  1. In your Yandex Cloud console, go to the Service Accounts tab and click Create service account.
  2. Name the service account, for example, af-datalocker.
  3. Create a static access key for the service account. In the service account:
    1. Click Create a new key.
    2. Select Create static access key.
    3. Save the Key ID and Secret Key.
  4. Give your new service account the storage.editor role:
    1. In your bucket settings, go to Access Bindings and click Assign bindings.
    2. For your new service account, add the storage.editor role and click Save.
  5. Contact your AppsFlyer CSM to enable Yandex in Data Locker.
  6. Once enabled, use the key ID and secret key during Data Locker setup.

Data warehouse cloud storage

BigQuery

The procedure in this section needs to be performed by your BigQuery admin.

Information for the BigQuery Admin

Data Locker is the AppsFlyer solution for streaming data to storage.

Requirements

  • Create a BigQuery dataset

To create a BigQuery dataset: 

  1. In BigQuery, create a project or use an existing project. 
  2. In the project, click CREATE DATASET.

  3. Give the dataset a suitable ID.
  4. Grant AppsFlyer access to the dataset. See the BigQuery instructions.
  5. Complete the remaining settings as required. 

Snowflake

Connect Data Locker to your Snowflake account. Data is then sent to Snowflake while remaining available in your selected cloud storage.

Considerations for BI developers

  • The data freshness rate is the same as that of data provided in a bucket. 
  • The table and column structure of the data is equivalent to that found in the data available directly from a Data Locker bucket. 
  • As rows are added to the Snowflake share, the _ingestion_time column is populated. To ensure row uniqueness and to prevent ingestion of the same row more than once:
    1. Save the max_ingestion_time per table ingested.
    2. Each time you run your ingestion process, ingest only those rows where _ingestion_time > max_ingestion_time.
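The deduplication logic above can be sketched as follows. The row structure and names are illustrative (rows are represented as dicts keyed by _ingestion_time); only the watermark comparison comes from the procedure:

```python
def rows_to_ingest(rows, max_ingestion_time):
    """Select only rows written after the last recorded ingestion time.

    rows: iterable of dicts with an "_ingestion_time" key (illustrative schema).
    max_ingestion_time: the max _ingestion_time saved from the previous run.
    Returns the rows to ingest and the new watermark to save for the next run.
    """
    new_rows = [r for r in rows if r["_ingestion_time"] > max_ingestion_time]
    # Persist this watermark so the next run skips rows already ingested.
    new_max = max((r["_ingestion_time"] for r in new_rows), default=max_ingestion_time)
    return new_rows, new_max
```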

Complete the procedures that follow to connect Snowflake to Data Locker. 

Snowflake connector procedures

To get your Snowflake account ID and region:

  1. In Snowflake, log in to your Snowflake account.
  2. In the menu bar, select your name.
    Your account ID and region display.

To connect Data Locker to Snowflake:

  1. In AppsFlyer, go to Reports > Data Locker.
  2. Select Snowflake.
  3. Enter the Snowflake region and Snowflake account ID using the information you previously got from Snowflake. 
  4. Click Save.

To create a database from a share in Snowflake:

  1. In Snowflake, log in to your Snowflake account.
  2. Switch your role to ACCOUNTADMIN. See Create a database from a share.
  3. Select Shares.
  4. Select the AppsFlyer share. For example, APPSFLYER_ACC_XXX_DATA_LOCKER. 
  5. Click Create Database from Secure Share and complete the required details. Note: You must load the data from the shared database into your own tables, because data in the shared database is available only for a limited period (currently 14 days).
  6. The imported tables display in your database. Table names and structures are equivalent to those in Data Locker buckets.