Urban Airship Responder user guide - snowplow-archive/sauna GitHub Wiki


This responder has not yet been implemented. If you are interested in contributing this or sponsoring us to build this, please get in touch.

See also: Urban Airship Responder setup guide


1. Overview

This Sauna responder lets you export device-level data from your event warehouse and upload this data to Urban Airship ready for push notifications and other customer messaging.

2. Responder actions

Currently this responder supports one action:

File landing path for Sauna      Action performed in Urban Airship
com.urbanairship/static_lists    Upload one or more static lists (batch)

2.1 Upload static lists (batch)

2.1.1 Overview

This responder path lets you export device-level data from your event warehouse and upload these to Urban Airship as one or more static lists for push notifications.

Urban Airship supports lists which identify devices for messaging based on:

  1. The type of identifier you wish to upload
  2. The value of the identifier you wish to upload

For more information on static lists, see the Urban Airship tutorial.

2.1.2 File landing path

The format for the path where files should land is:

<sauna_landing_root>/com.urbanairship/static_lists/v1/tsv:*/<sub_folders/...>

Notes on this:

  • Your <sauna_landing_root> will be an S3 bucket/path or local folder as appropriate
  • com.urbanairship/static_lists specifies our responder action
  • This is v1, version 1, of this responder path
  • Currently tsv (for tab-separated values) is the only supported input format
  • Currently your files must contain all the fields required by the static lists API, hence the tsv:*
  • You can add as many sub-folders as required to prevent clashes with other Sauna users in your organization

2.1.3 File input format

At the responder path, you must upload files which contain:

  • Tab (\t)-separated values enclosed in double quotation marks (")
  • Rows separated by newlines (\n)

Fields should be as follows:

"<application_key>"\t"<list_name>"\t"<identifier_type>"\t"<identifier>"\n

Where:

  • <application_key> is the unique identifier (app key) for your application, e.g. 3AkEYOGWQ2yWPS4bLOBW2P
  • <list_name> is the name of your existing static list, e.g. List_1
  • <identifier_type> is the type of this identifier, one of alias, named_user, ios_channel, android_channel, or amazon_channel
  • <identifier> is the associated identifier you want to send to
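
To make the format concrete, here is a short Python sketch of producing a conforming row (the function name to_tsv_line is invented for illustration; it is not part of Sauna):

```python
def to_tsv_line(app_key, list_name, id_type, identifier):
    # Double-quote every field, join with tabs, terminate with a newline,
    # matching the field layout described above
    fields = (app_key, list_name, id_type, identifier)
    return "\t".join('"%s"' % f for f in fields) + "\n"

line = to_tsv_line("3AkEYOGWQ2yWPS4bLOBW2P", "List_1",
                   "named_user", "named_user_id_1")
# line == '"3AkEYOGWQ2yWPS4bLOBW2P"\t"List_1"\t"named_user"\t"named_user_id_1"\n'
```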

2.1.4 Response algorithm

Sauna will take the file(s) found at the file landing path and:

  1. Generate one or more in-memory CSVs for each <application_key>, <list_name> pair, where each CSV contains no more than 10 million recipients
  2. For each CSV, perform a PUT request to the api/lists/<list_name>/csv/ API endpoint, authenticated using the given <application_key>
  3. Parse the Urban Airship response to identify any failed rows and record those to the <sauna_failure_path> (out of scope for Sauna v1)

Let's go through each of these steps in turn:

2.1.4.1 Generating CSVs

Sauna will generate one or more in-memory CSVs for each <application_key>, <list_name> pair found in the file(s) in the landing area.

The transformations applied to the data are:

  1. Remove the <application_key> and <list_name> fields
  2. Replace the tabs (\t) separating <identifier_type> and <identifier> with a comma (,)

Thus the CSV lines will look like:

"<identifier_type>","<identifier>"\n

Where there are more than 10 million recipients for a given <application_key>, <list_name> pair, Sauna will split the data into multiple in-memory CSVs.

The CSV lines should be gzipped to reduce network traffic.
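
As a hedged sketch (not Sauna's actual implementation; the names group_tsv and to_gzipped_csvs are invented), the grouping, tab-to-comma transformation, splitting, and gzipping could look like this in Python:

```python
import csv
import gzip
import io
from collections import defaultdict

MAX_RECIPIENTS = 10_000_000  # per-upload recipient limit described above

def group_tsv(tsv_text):
    """Group raw TSV rows by (application_key, list_name),
    keeping only the (identifier_type, identifier) columns."""
    groups = defaultdict(list)
    reader = csv.reader(io.StringIO(tsv_text), delimiter="\t")
    for app_key, list_name, id_type, identifier in reader:
        groups[(app_key, list_name)].append((id_type, identifier))
    return groups

def to_gzipped_csvs(recipients, chunk_size=MAX_RECIPIENTS):
    """Yield one gzipped CSV body per chunk of <= chunk_size recipients."""
    for i in range(0, len(recipients), chunk_size):
        buf = io.StringIO()
        writer = csv.writer(buf, quoting=csv.QUOTE_ALL, lineterminator="\n")
        writer.writerows(recipients[i:i + chunk_size])
        yield gzip.compress(buf.getvalue().encode("utf-8"))
```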

2.1.4.2 Making the PUT request

Sauna will make a PUT request for each group to the api/lists/{{list_name}}/csv/ API endpoint, where the list {{list_name}} must already exist. The PUT request must:

  • Be authenticated using -u "<application_key>:<masterSecret>", where <application_key> is the first column in the file(s), and <masterSecret> is found in the Urban Airship Responder's configuration file
  • Have an Accept: application/vnd.urbanairship+json; version=3 header
  • Have a Content-Type: text/csv header
  • Have a Content-Encoding: gzip header
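
The request above can be sketched with the Python standard library (a hedged illustration: build_static_list_put is an invented name, and the request is only constructed, never sent):

```python
import base64
import urllib.request

def build_static_list_put(list_name, application_key, master_secret, gzipped_csv):
    """Build (but don't send) the PUT request described above."""
    url = "https://go.urbanairship.com/api/lists/%s/csv/" % list_name
    # HTTP Basic auth, equivalent to curl's -u "<application_key>:<masterSecret>"
    credentials = base64.b64encode(
        ("%s:%s" % (application_key, master_secret)).encode()).decode()
    return urllib.request.Request(
        url,
        data=gzipped_csv,
        method="PUT",
        headers={
            "Authorization": "Basic " + credentials,
            "Accept": "application/vnd.urbanairship+json; version=3",
            "Content-Type": "text/csv",
            "Content-Encoding": "gzip",
        },
    )
```
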

2.1.4.3 Checking the upload status

Verification of the upload is done in two phases.

First, the PUT request will return a status code recording whether the list was accepted or an error was encountered. For the possible error codes, see the static lists API documentation.

Second, if the list upload is accepted, Sauna will then poll the API until it learns that the list has finished processing. This is done by calling the GET /api/lists/<list_name>/ endpoint and inspecting the status property in the returned JSON. Sauna will check this every 30 seconds until the status changes from processing to either ready or failed.

The final upload status is reported through the configured Sauna loggers.
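
The polling loop can be sketched as follows (illustrative only; fetch_status stands in for the real GET call so the loop can run without an Urban Airship account):

```python
import time

def wait_for_list(fetch_status, poll_interval=30, sleep=time.sleep):
    """Poll until the list status leaves 'processing'.
    fetch_status is any callable that hits GET /api/lists/<list_name>/
    and returns the 'status' field from the JSON response."""
    while True:
        status = fetch_status()
        if status != "processing":
            return status  # 'ready' or 'failed'
        sleep(poll_interval)
```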

2.1.5 Troubleshooting

We will populate this section as issues emerge.

2.1.6 Usage examples

2.1.6.1 Local folder

Assuming that your Sauna root is set to local folder /opt/sauna, and the file lands as:

/opt/sauna/com.urbanairship/static_lists/v1/tsv:*/ua-team/jane/warehouse.tsv

The contents of warehouse.tsv are as follows:

"3AkEYOGWQ2yWPS4bLOBW2P"	"List_1"	"named_user"	"named_user_id_1"
"3AkEYOGWQ2yWPS4bLOBW2P"	"ios_list"	"ios_channel"	"1a3c39bc-97f4-4xd8-akz8-e37f11b4bfd8"
"2BcDIOGZO9dJOS5bYOPW1Z"	"List_1"	"amazon_channel"	"7eeb8c82-3369-4da9-b16d-dadf71e0ab7a"

Here we can see that Jane in the User Acquisition team wants to upload three recipients to Urban Airship.
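
A quick sketch (illustrative Python, not Sauna code) of why this file yields three separate uploads: static lists are scoped per app, so the grouping key includes the application key, and the two List_1 rows land in different groups:

```python
# Jane's rows, as (application_key, list_name, identifier_type, identifier)
rows = [
    ("3AkEYOGWQ2yWPS4bLOBW2P", "List_1", "named_user", "named_user_id_1"),
    ("3AkEYOGWQ2yWPS4bLOBW2P", "ios_list", "ios_channel",
     "1a3c39bc-97f4-4xd8-akz8-e37f11b4bfd8"),
    ("2BcDIOGZO9dJOS5bYOPW1Z", "List_1", "amazon_channel",
     "7eeb8c82-3369-4da9-b16d-dadf71e0ab7a"),
]

# Group on (application_key, list_name): three distinct keys, three PUTs
groups = {(app, lst) for app, lst, _, _ in rows}
# len(groups) == 3
```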

2.1.6.2 Redshift UNLOAD

UNLOAD is a Redshift SQL command which lets you export the contents of a SELECT statement to Amazon S3 for further processing.

Assuming that your Sauna root is set to S3 folder s3://my-sauna-bucket/prod and you run the following SQL:

CREATE TABLE push_recipients (
  application_key varchar(22)  NOT NULL,
  list_name       varchar(256) NULL,
  identifier_type varchar(15)  NOT NULL,
  identifier      varchar(256) NOT NULL
);

INSERT INTO push_recipients VALUES
  ('3AkEYOGWQ2yWPS4bLOBW2P', 'List_1', 'named_user', 'named_user_id_1'),
  ('3AkEYOGWQ2yWPS4bLOBW2P', 'ios_list', 'ios_channel', '1a3c39bc-97f4-4xd8-akz8-e37f11b4bfd8'),
  ('2BcDIOGZO9dJOS5bYOPW1Z', 'List_1', 'amazon_channel', '7eeb8c82-3369-4da9-b16d-dadf71e0ab7a');

UNLOAD ('select application_key, list_name, identifier_type, identifier from push_recipients')
  TO 's3://my-sauna-bucket/prod/com.urbanairship/static_lists/v1/tsv:*/ua-team/jane/'
  CREDENTIALS 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
  DELIMITER AS '\t'
  ADDQUOTES
  PARALLEL OFF;

Again, this will export 3 recipients in a TSV to Amazon S3, where Sauna is waiting to detect this file landing and upload to Urban Airship. Some notes:

  • The order of the fields in the UNLOAD ('select ... statement must match the field order given in the tsv:* file landing path

2.1.6.3 Implementation

From the example above, Sauna will make three PUT calls to Urban Airship, populating three pre-created lists.

Here are the exact PUT calls as they would be run at the command-line - the first:

$ echo '"named_user","named_user_id_1"' | gzip -cf > one.csv.gz
$ curl https://go.urbanairship.com/api/lists/List_1/csv/ \
   -X PUT \
   -u "3AkEYOGWQ2yWPS4bLOBW2P:<masterSecret>" \
   -H "Accept: application/vnd.urbanairship+json; version=3" \
   -H "Content-Type: text/csv" \
   -H "Content-Encoding: gzip" \
   --data-binary @one.csv.gz

The second:

$ echo '"ios_channel","1a3c39bc-97f4-4xd8-akz8-e37f11b4bfd8"' | gzip -cf > two.csv.gz
$ curl https://go.urbanairship.com/api/lists/ios_list/csv/ \
   -X PUT \
   -u "3AkEYOGWQ2yWPS4bLOBW2P:<masterSecret>" \
   -H "Accept: application/vnd.urbanairship+json; version=3" \
   -H "Content-Type: text/csv" \
   -H "Content-Encoding: gzip" \
   --data-binary @two.csv.gz

And finally the third:

$ echo '"amazon_channel","7eeb8c82-3369-4da9-b16d-dadf71e0ab7a"' | gzip -cf > three.csv.gz
$ curl https://go.urbanairship.com/api/lists/List_1/csv/ \
   -X PUT \
   -u "2BcDIOGZO9dJOS5bYOPW1Z:<masterSecret>" \
   -H "Accept: application/vnd.urbanairship+json; version=3" \
   -H "Content-Type: text/csv" \
   -H "Content-Encoding: gzip" \
   --data-binary @three.csv.gz

Note that although this PUT is to a list called List_1, this is a distinct list from the first PUT to List_1, because static lists are scoped to an individual app, as specified in the authentication string (-u).
