Call Guzzle from Azure Data Factory - ja-guzzle/guzzle_docs GitHub Wiki
Click on Connections:
Click on New:
Choose HTTP Linked Service:
Choose an integration runtime and enter the Guzzle API URL as the Base URL. If the Guzzle API runs with valid SSL certificates (issued by a certificate authority), enable Server Certificate Validation. Choose Basic as the Authentication Type, then enter the username and password of a native Guzzle user (Azure AD credentials will not work). Click on Create once it is done.
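Before saving the linked service, it can be worth confirming the Base URL and the native user's credentials directly. A minimal sketch using only the Python standard library; the URL, username and password below are placeholders, not values from this guide:

```python
# Sketch: sanity-check the Guzzle API URL and the native-user credentials
# that the ADF linked service will use. All three values are placeholders.
import base64
import urllib.request

BASE_URL = "https://guzzle.example.com"   # placeholder: your Guzzle API URL
USERNAME = "guzzle_user"                  # native Guzzle user, not Azure AD
PASSWORD = "secret"

def basic_auth_header(user: str, password: str) -> str:
    """Build the same Basic auth header ADF sends for Basic authentication."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return "Basic " + token

def check_credentials() -> int:
    """Issue a GET against the base URL; a non-401 status means auth worked."""
    req = urllib.request.Request(
        BASE_URL,
        headers={"Authorization": basic_auth_header(USERNAME, PASSWORD)},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```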
Create a new dataset:
Create the dataset with relative URL: /api/execute/job
Choose the HTTP dataset type:
Choose JSON as the format:
Choose the Guzzle HTTP Linked Service created earlier, and specify the relative URL of the Guzzle API.
Add a Lookup activity in the ADF pipeline and configure the Settings tab as follows:
The request body JSON is:

```json
{
  "name": "dq1",
  "jobParameters": {
    "system": "test",
    "business_date": "2020-01-08 18:31:10",
    "guzzle.spark.name": "local1",
    "environment": "test"
  }
}
```
When we debug the pipeline, we get the job instance id of the triggered job in the response of the API call:
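Outside ADF, the same call can be reproduced with any HTTP client. A minimal sketch using only the Python standard library; the base URL and the credentials are placeholders, not values from this guide:

```python
# Sketch: trigger a Guzzle job over HTTP, mirroring the Lookup activity
# above. BASE_URL and the user/password arguments are placeholders.
import base64
import json
import urllib.request

BASE_URL = "https://guzzle.example.com"  # placeholder Guzzle API URL

payload = {
    "name": "dq1",
    "jobParameters": {
        "system": "test",
        "business_date": "2020-01-08 18:31:10",
        "guzzle.spark.name": "local1",
        "environment": "test",
    },
}

def trigger_job(user: str, password: str) -> str:
    """POST /api/execute/job with Basic auth; the response body carries
    the job instance id of the triggered job."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        BASE_URL + "/api/execute/job",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + token,
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```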
Create a dataset with relative URL: /api/batches/run_job_group
Use a Lookup activity to call the API (using the HTTP linked service) that triggers a job group, with the following settings:
Request Method: POST
Additional Headers: Content-Type: application/json
Request Body:
```json
{
  "system": "test",
  "location": "SG",
  "business_date": "2020-04-02 16:01:27",
  "guzzle.spark.name": "local1",
  "job_group": "USERS_INGESTION",
  "environment": "test2"
}
```
When we debug the pipeline, we get the job instance id of the triggered job group in the response of the API call (similar to the job trigger API).
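For reference, the same job group trigger can be issued outside ADF as well; a stdlib-only sketch with a placeholder URL and placeholder credentials:

```python
# Sketch: trigger a Guzzle job group over HTTP, mirroring the Lookup
# activity above. BASE_URL and the credentials are placeholders.
import base64
import json
import urllib.request

BASE_URL = "https://guzzle.example.com"  # placeholder Guzzle API URL

payload = {
    "system": "test",
    "location": "SG",
    "business_date": "2020-04-02 16:01:27",
    "guzzle.spark.name": "local1",
    "job_group": "USERS_INGESTION",
    "environment": "test2",
}

def run_job_group(user: str, password: str) -> str:
    """POST the job-group request; the response carries the job instance id."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        BASE_URL + "/api/batches/run_job_group",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + token,
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```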
Create a dataset with relative URL: /api/batches/initialize
Use a Lookup activity to call the API (using the HTTP linked service) that initializes a batch record, with the following settings:
Request Method: POST
Additional Headers: Content-Type: application/json
Request Body:
```json
{
  "contextParams": {
    "system": "sp"
  },
  "businessDate": "2020-04-02 16:11:38",
  "environment": "test2"
}
```
When we debug the pipeline, the API call returns nothing in the response.
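A stdlib-only sketch of the same initialize call (URL and credentials are placeholders); since the API returns an empty body, the HTTP status code is the only useful signal:

```python
# Sketch: initialize a batch record over HTTP, mirroring the Lookup
# activity above. BASE_URL and the credentials are placeholders.
import base64
import json
import urllib.request

BASE_URL = "https://guzzle.example.com"  # placeholder Guzzle API URL

payload = {
    "contextParams": {"system": "sp"},
    "businessDate": "2020-04-02 16:11:38",
    "environment": "test2",
}

def initialize_batch(user: str, password: str) -> int:
    """POST the initialize request and return the HTTP status code;
    the response body itself is empty."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        BASE_URL + "/api/batches/initialize",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + token,
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```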
Create a dataset with relative URL: /api/batches/initialize
Use a Lookup activity to call the API (using the HTTP linked service) that initializes batch records for a range of business dates, with the following settings:
Request Method: POST
Additional Headers: Content-Type: application/json
Request Body:
```json
{
  "contextParams": {
    "system": "sp",
    "location": "all"
  },
  "businessDateRange": {
    "startDate": "2020-04-02 16:00:00",
    "endDate": "2020-04-06 16:00:00"
  },
  "period": "1",
  "environment": "test2"
}
```
When we debug the pipeline, the API call returns nothing in the response.
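When the window moves with each run, the date-range body can be built programmatically rather than hard-coded. A sketch reusing the values from the example above (the timestamp format is inferred from the request bodies in this guide):

```python
# Sketch: build the date-range initialize payload from datetime values
# instead of hard-coded strings. Values match the example above.
import json
from datetime import datetime, timedelta

FMT = "%Y-%m-%d %H:%M:%S"  # timestamp format used in the request bodies

start = datetime(2020, 4, 2, 16, 0, 0)
payload = {
    "contextParams": {"system": "sp", "location": "all"},
    "businessDateRange": {
        "startDate": start.strftime(FMT),
        "endDate": (start + timedelta(days=4)).strftime(FMT),
    },
    "period": "1",
    "environment": "test2",
}
print(json.dumps(payload, indent=2))
```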
Create a dataset with relative URL: /api/batches/run_stage
Use a Lookup activity to call the API (using the HTTP linked service) that triggers a stage, with the following settings:
Request Method: POST
Additional Headers: Content-Type: application/json
Request Body:
```json
{
  "param1": "value1",
  "system": "sp",
  "guzzle.spark.name": "local1",
  "stage": "STG",
  "environment": "test2"
}
```
When we debug the pipeline, the API call returns nothing in the response.
Create a dataset with relative URL: /api/batches/run_stage
Use a Lookup activity to call the API (using the HTTP linked service) that triggers multiple stages, with the following settings:
Request Method: POST
Additional Headers: Content-Type: application/json
Request Body:
```json
{
  "param1": "value1",
  "system": "sp",
  "guzzle.spark.name": "local1",
  "stage": "STG,FND,PLP",
  "environment": "test2"
}
```
When we debug the pipeline, the API call returns nothing in the response.
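As the two run_stage examples show, the stage field takes one or more stage names as a single comma-separated string. A small sketch building that body from a Python list, using the values from the example above:

```python
# Sketch: build the run_stage body for one or more stages; the stages go
# into a single comma-separated string. Values are from the example above.
import json

stages = ["STG", "FND", "PLP"]
payload = {
    "param1": "value1",
    "system": "sp",
    "guzzle.spark.name": "local1",
    "stage": ",".join(stages),   # "STG,FND,PLP"
    "environment": "test2",
}
print(json.dumps(payload, indent=2))
```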