Azure DevOps Pipeline Notes
Introduction
Purpose
Notes on how to use the Azure DevOps pipeline
References
Pipeline
```yaml
trigger:
  branches:
    include:
      - '*'    # build all branches
  paths:
    exclude:
      - docs/* # exclude the docs folder
```
- A continuous integration (CI) build is a build that runs when you push a change to a branch.
- A pull request (PR) build is a build that runs when you open a pull request or when you push additional changes to an existing pull request.
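For PR builds, a pr trigger can be declared in a similar way. A minimal sketch (branch and path names are illustrative; note that for Azure Repos Git, PR validation is normally configured through branch policies rather than the pr keyword):

```yaml
pr:
  branches:
    include:
      - master
  paths:
    exclude:
      - docs/*
```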
Pipeline resources
Pipeline resources: git Repositories
```yaml
resources:
  repositories:
    - repository: PIPELINE_REFERENCE_NAME
      type: git
      name: GIT_REPO_NAME
      ref: BRANCH_REFERENCE
```
- BRANCH_REFERENCE: the branch (ref) to check out.
  - Default: 'refs/heads/master'
- GIT_REPO_NAME: name of the repository in git.
- PIPELINE_REFERENCE_NAME: name used for referencing the repository in the pipeline.
Pipeline resources: trigger on repository updates
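A sketch of a repository resource that also triggers the pipeline when that repository is updated (assuming an Azure Repos Git repository; the names are placeholders):

```yaml
resources:
  repositories:
    - repository: PIPELINE_REFERENCE_NAME
      type: git
      name: GIT_REPO_NAME
      ref: refs/heads/master
      trigger:
        branches:
          include:
            - master
```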
Pools
Referencing pools
Referencing pool by image
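A minimal sketch using a Microsoft-hosted image (the image name is an assumption; any of the hosted images can be used):

```yaml
pool:
  vmImage: 'ubuntu-latest'
```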
Referencing pool by name
```yaml
pool:
  name: integration-stage
```
Create a self-hosted pool
- Log in to dev.azure.com
- Click your project
- Click the 'Settings' gear at the bottom left
- Click 'Agent pools', under 'Pipelines', in the left panel
- Click 'Add pool'
  - Pool to link: New
  - Pool type: self-hosted
Rest of the pipeline
Stages, jobs, etc.
Hierarchy (a pipeline can contain multiple stages, a stage multiple jobs, and a job multiple steps):
- stages:
  - stage:
    - jobs:
      - job | deployment :
        - steps:
          - bash | checkout | powershell | pwsh | script | task | templateReference
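A minimal sketch of this hierarchy (stage, job, and environment names are made up; the environment used by the deployment job is assumed to already exist):

```yaml
stages:
  - stage: Build
    jobs:
      - job: BuildJob
        steps:
          - script: echo "build step"
  - stage: Deploy
    jobs:
      - deployment: DeployJob
        environment: ENV_NAME   # assumed to be defined under Pipelines > Environments
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo "deploy step"
```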
Full job syntax:

```yaml
- job: string  # name of the job, A-Z, a-z, 0-9, and underscore
  displayName: string  # friendly name to display in the UI
  dependsOn: string | [ string ]
  condition: string
  strategy:
    parallel: # parallel strategy
    matrix: # matrix strategy
    maxParallel: number # maximum number of simultaneous matrix legs to run
  continueOnError: boolean  # 'true' if future jobs should run even if this job fails; defaults to 'false'
  pool: pool # see pool schema
  workspace:
    clean: outputs | resources | all # what to clean up before the job runs
  container: containerReference # container to run this job inside
  timeoutInMinutes: number # how long to run the job before automatically cancelling
  cancelTimeoutInMinutes: number # how much time to give 'run always even if cancelled tasks' before killing them
  variables: { string: string } | [ variable | variableReference ]
  steps: [ script | bash | pwsh | powershell | checkout | task | templateReference ]
  services: { string: string | container } # container resources to run as a service container
```
Each agent can run only one job at a time. To run multiple jobs in parallel, you must configure multiple agents; you also need sufficient parallel jobs in your organization.
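As a sketch (job names and images are made up), two jobs with no dependsOn relationship between them are eligible to run in parallel when enough agents and parallel jobs are available:

```yaml
jobs:
  # No dependsOn, so these jobs can run at the same time,
  # provided two agents (and two parallel jobs) are available.
  - job: LinuxBuild
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - script: echo "building on Linux"
  - job: WindowsBuild
    pool:
      vmImage: 'windows-latest'
    steps:
      - script: echo "building on Windows"
```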
Variables in Azure pipelines
- When you define the same variable in multiple places with the same name, the most locally scoped variable wins.
  - So, a variable defined at the job level can override a variable set at the stage level.
  - See: Define variables
- Variables are different from runtime parameters, which are typed and available during template parsing.
- When you define a variable, you can use different syntaxes:
  - macro
  - template expression
  - runtime expression
  - The syntax you use determines where in the pipeline your variable will be rendered.
- You can use a variable group to make variables available across multiple pipelines.
- You can use templates to define, in one file, variables that are used in multiple pipelines.
- Variables are only expanded for stages, jobs, and steps.
  - You cannot, for example, use macro syntax inside a resource or trigger (see the sketch below).
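A sketch of that limitation (variable and branch names are made up); the macro is expanded in the step, but not in the trigger:

```yaml
variables:
  branchName: master

trigger:
  branches:
    include:
      - $(branchName)   # NOT expanded; the literal text "$(branchName)" is used

steps:
  - script: echo $(branchName)   # expanded at runtime to "master"
```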
User-defined variables

System variables

Environment variables
- The name is upper-cased, and each '.' is replaced with '_'.

Naming restrictions
- User-defined variables can consist of:
  - letters
  - numbers
  - .
  - _
- Don't use variable prefixes that are reserved by the system. These are:
  - endpoint
  - input
  - secret
  - securefile
- Any variable that begins with one of these strings (regardless of capitalization) will not be available to your tasks and scripts.
Azure Pipelines supports three different ways to reference variables (see: Define variables):
- macro: $(var) gets processed during runtime, before a task runs.
  - Runtime happens after template expansion.
  - When the system encounters a macro expression, it replaces the expression with the contents of the variable.
  - If there's no variable by that name, the macro expression is left unchanged.
    - It seems like this happens so $(ENV_VAR) can refer to an OS or environment variable?
  - Macro variables are only expanded when they are used for a value, not as a keyword.
    - Values appear on the right side of a pipeline definition.
    - The following is valid: key: $(value).
    - The following isn't valid: $(key): value.
  - Macro variables are not expanded when used to display a job name inline; instead, you must use the displayName property.
- template expression: ${{ variables.var }} gets processed at compile time, before runtime starts.
  - Template expression syntax can expand both template parameters and template variables.
  - Template variables are processed at compile time and are replaced before runtime starts.
  - Template expressions are designed for reusing parts of YAML as templates.
  - Unlike macro and runtime expressions, template expressions can appear as either keys (left side) or values (right side).
    - The following is valid: ${{ variables.key }} : ${{ variables.value }}.
- runtime expression: $[variables.var] gets processed during runtime; it is designed for use with conditions and expressions.
  - The runtime expression must take up the entire right side of a key-value pair.
    - For example, key: $[variables.value] is valid, but key: $[variables.value] foo is not.
  - Runtime expression variables are only expanded when they are used for a value, not as a keyword.
    - The following is valid: key: $[variables.value].
    - The following isn't valid: $[variables.key]: value.
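A small sketch that puts the three syntaxes side by side (the variable names are made up):

```yaml
variables:
  myVar: 'hello'
  # runtime expression: must occupy the entire right-hand side of the pair
  copyOfMyVar: $[ variables.myVar ]

steps:
  - script: echo $(myVar)               # macro, replaced just before the task runs
    displayName: Macro syntax
  - script: echo ${{ variables.myVar }} # template expression, replaced at compile time
    displayName: Template expression syntax
  - script: echo $(copyOfMyVar)         # this value was produced by a runtime expression
    displayName: Runtime expression syntax
```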
What to use when:
- Use macro syntax if you are providing input for a task.
  - The exception to this is if you have a pipeline where it will cause a problem for your empty variable to print out, for example if you have conditional logic that relies on a variable having a specific value or no value. In that case, you should use a runtime expression.
- Choose a runtime expression if you are working with conditions and expressions.
- If you are defining a variable in a template, use a template expression.
Variable scopes
- In the YAML file, you can set a variable at various scopes:
  - At the root level, to make it available to all jobs in the pipeline.
  - At the stage level, to make it available only to a specific stage.
  - At the job level, to make it available only to a specific job.
- When a variable is defined at the top of a YAML file, it is available to all jobs and stages in the pipeline as a global variable. Global variables defined in a YAML file are not visible in the pipeline settings UI.
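A sketch of the three scopes (all names are illustrative):

```yaml
variables:
  globalVar: 'available to every stage and job'

stages:
  - stage: StageOne
    variables:
      stageVar: 'available only within StageOne'
    jobs:
      - job: JobOne
        variables:
          jobVar: 'available only within JobOne'
        steps:
          - script: echo "$(globalVar) / $(stageVar) / $(jobVar)"
```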
Specifying variables in a pipeline
```yaml
variables:
  # a regular variable
  - name: myvariable
    value: myvalue
  # a variable group
  - group: myvariablegroup
  # a reference to a variable template
  - template: myvariabletemplate.yml
```
- The group is defined under the 'Library' entry of the 'Pipelines' section in dev.azure.com, for your project.
Use output variables from tasks
Some tasks define output variables, which you can consume in downstream steps, jobs, and stages. In YAML, you can access variables across jobs and stages by using dependencies.
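A sketch of that mechanism (job, step, and variable names are made up): the producing step needs a name and must set the variable with isOutput=true, and the consuming job maps it in through a runtime expression on dependencies:

```yaml
jobs:
  - job: Producer
    steps:
      - bash: echo "##vso[task.setvariable variable=myOutput;isOutput=true]someValue"
        name: setOutputStep   # the step name is part of the output reference
  - job: Consumer
    dependsOn: Producer
    variables:
      consumedValue: $[ dependencies.Producer.outputs['setOutputStep.myOutput'] ]
    steps:
      - script: echo $(consumedValue)   # outputs someValue
```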
How to provide a default for a pipeline run-time variable
```yaml
trigger:
  - none

variables:
  - group: DEV_LIB_GRP_NAME
  - name: one
    value: initialValue
  - name: a
    value: $[counter(format('{0:yyyyMMdd}', pipeline.startTime), 100)]
  - name: coalesceLiteral
    value: $[coalesce(variables.emptyString, '', 'literal value')]
  - name: testFunctionType
    value: $[coalesce(variables.VAR_FUNCTION_TYPE, '-persistent')]

steps:
  - script: |
      echo ${{ variables.one }} # outputs initialValue
      echo $(one)
    displayName: First variable pass
  - bash: echo '##vso[task.setvariable variable=one]secondValue'
    displayName: Set new variable value
  - script: |
      echo ${{ variables.one }} # outputs initialValue
      echo $(one) # outputs secondValue
    displayName: Second variable pass
  - script: |
      echo $(a)
    displayName: test set var
  - script: |
      echo $(coalesceLiteral)
      echo $(testFunctionType)
    displayName: coalesceLiteral
```
Log investigation in Azure pipelines
Creating tasks
DevOps agents
Installing an Azure agent on an Ubuntu machine
See: Create a self-hosted pool
- Create a PAT (personal access token)
- Download, configure, and start the agent (YOUR_ORG, PAT_TOKEN, and POOL_NAME are placeholders):

```bash
cd ~/Downloads
wget https://vstsagentpackage.azureedge.net/agent/2.173.0/vsts-agent-linux-x64-2.173.0.tar.gz
mkdir ~/myagent
cd ~/myagent
tar zxvf ~/Downloads/vsts-agent-linux-x64-*.tar.gz

export AZP_URL="https://dev.azure.com/YOUR_ORG"
./config.sh --unattended --agent "$(hostname)" --url "$AZP_URL" --auth PAT --token "PAT_TOKEN" --pool "POOL_NAME" --replace --acceptTeeEula

sudo ./svc.sh install
sudo ./svc.sh start
```

- Go to the agent pool and ensure the agent is now marked as 'online'
walinuxagent: didn't get it to work. See:
- sudo apt install puppet
- systemctl status walinuxagent
Azure access from python scripts
The Python modules seem to be wrappers for the Azure REST API.
References
Vocabulary
- UAI - User Assigned Identity (a user-assigned managed identity)
Azure keyvault access from python
See:
Access the keyvault using credentials (not safe for use in scripts)
- pip install azure-identity
- pip install azure-keyvault-secrets
- sudo apt install python3-gi python3-gi-cairo gir1.2-secret-1
Creating a service principal
The output includes credentials that you must protect. Be sure that you do not include these credentials in your code or check the credentials into your source control.
```bash
az ad sp create-for-rbac --skip-assignment -n APP_NAME
```

The CLI changes "APP_NAME" to a valid URI of "http://APP_NAME", which is the required format used for service principal names. The output looks like this:
```json
{
  "appId": "APPLICATION_ID",
  "displayName": "APP_NAME",
  "name": "http://APP_NAME",
  "password": "SECRET",
  "tenant": "TENANT_ID"
}
```
Web interface:
- Go to the keyvault
- Click: Access control (IAM)
- Click '+ Add'
- Select 'Add role assignment'
- Select the role
- Enter the name in the 'Select' entry line
- Click 'Save'
```bash
sudo pip install azure-keyvault-secrets azure-identity

export AZURE_CLIENT_ID="APPLICATION_ID"
export AZURE_CLIENT_SECRET="SECRET"
export AZURE_TENANT_ID="TENANT_ID"
export AZURE_KEY_VAULT_NAME="KEY_VAULT_NAME"

az keyvault set-policy --name ${AZURE_KEY_VAULT_NAME} --spn ${AZURE_CLIENT_ID} --secret-permissions get
```
Accessing the keyvault from a script, using tokens
- Follow:
- Add the access policy to the UAI:
  - Access portal.azure.com
  - Go to the keyvault
  - Click: Add policy
  - For 'secret', allow get and list
  - Click 'Principal'
  - Look up 'YOUR_UAI_USER_NAME'
  - Click 'Select'
  - REMEMBER to 'Save'
- Assign the UAI to the VM:

```bash
az vm identity assign -g MY_RESOURCE_GROUP -n VM_NAME --identities UAI_ID
```

- Verify that you can get a token:

```bash
curl -XGET 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://management.azure.com/' -H Metadata:true
```
- Installing the modules on the VM:
  - Please note: it seems that on Ubuntu 18.04 'python' defaults to python2 and on 20.04 'python' defaults to python3.
  - sudo apt install
  - sudo pip install azure-keyvault-secrets azure-identity
```python
import os

from azure.keyvault.secrets import SecretClient
from azure.identity import ManagedIdentityCredential

# Name of the key vault, and the 'Application ID' of the UAI,
# both taken from the environment.
keyVaultName = os.environ["KEY_VAULT_NAME"]
clientId = os.environ["AZURE_APP_ID"]

KVUri = "https://" + keyVaultName + ".vault.azure.net"

# Authenticate with the user-assigned managed identity assigned to the VM.
# See: https://azuresdkdocs.blob.core.windows.net/$web/python/azure-identity/1.4.0/azure.identity.html#azure.identity.ManagedIdentityCredential
credential = ManagedIdentityCredential(client_id=clientId)

# https://docs.microsoft.com/en-us/dotnet/api/azure.security.keyvault.secrets.secretclient?view=azure-dotnet
client = SecretClient(vault_url=KVUri, credential=credential)

# Retrieve a single secret by name and print its value.
secretName = "MY_SECRET_TO_RETRIEVE"
retrieved_secret = client.get_secret(secretName)

print("Your secret is '" + retrieved_secret.value + "'.")
print(" done.")
```