jenkins jenkinsfile Declarative pipeline

Jenkins supports a type of script called a declarative pipeline script that allows a concise definition of steps needed to build, test, and deploy software. This script is conventionally known as a Jenkinsfile.

Because these scripts are written using the Groovy language (see https://groovy-lang.org/), you can declare variables, write functions, and use many features of this very powerful language to help you build and deploy your software. Jenkins supports both a free-form scripting style and a more structured declarative style of script that uses a special Groovy DSL to provide more scaffolding for concise scripts.

https://www.jenkins.io/doc/book/pipeline/jenkinsfile/

In declarative syntax, you cannot use Groovy code such as variables, loops, or conditions directly. You are restricted to the structured sections/blocks and the DSL (Jenkins domain-specific language) steps; the script step provides an escape hatch into Groovy, as sketched below.
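
For Groovy logic inside a declarative pipeline, wrap it in a script step. A minimal sketch (the stage name and variable are illustrative):
```
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                // Arbitrary Groovy is only allowed inside a script block
                script {
                    def browsers = ['chrome', 'firefox']
                    for (int i = 0; i < browsers.size(); i++) {
                        echo "Testing the ${browsers[i]} browser"
                    }
                }
            }
        }
    }
}
```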

All available blocks: https://www.jenkins.io/doc/book/pipeline/syntax/#stages

Each pipeline is built from these predefined block attributes or sections:

  • agent
  • environment
  • post
  • stages
  • steps

Pipeline

Pipeline is a block in which we define the complete execution process, which typically includes information like the following:

  • Which node to use for workflow/pipeline execution
  • Different stages’ information
```
pipeline {
  agent any
}
```

agent

The agent section defines where the whole Pipeline, or a specific stage, runs. It is positioned right at the beginning of the pipeline {} block and is compulsory there. You can also use it inside a stage directive, where it is optional.

The agent section allows specific parameters to suit various use-cases.

  • any - Run the pipeline on any agent available in Jenkins.
  • none - Use inside the pipeline {} block to stop any global agent from running your pipeline. In such a case, each stage directive inside your pipeline must have its own agent section:
```
pipeline {
    agent none
    stages {
        stage('Parallel Testing') {
            parallel {
                stage('Test with Firefox') {
                    agent {
                        label 'Windows2008_Firefox'
                    }
                    steps {
                        echo "Testing with Firefox."
                    }
                }
                stage('Test with Chrome') {
                    agent {
                        label 'Windows2008_Chrome'
                    }
                    steps {
                        echo "Testing with Chrome."
                    }
                }
            }
        }
    }
}
```

  • label - Run your complete pipeline, or stage, on an agent available in Jenkins with a specific label:
```
agent {
    label 'windows2008_Chrome'
}
```
An agent block accepts exactly one agent type. The node, dockerfile, and kubernetes alternatives look like this:
```
pipeline {
    agent {
        node {
            label 'workers'
        }
    }
}
```
```
pipeline {
    agent {
        dockerfile {
            filename 'Dockerfile'
            label 'workers'
        }
    }
}
```
```
pipeline {
    agent {
        kubernetes {
            label 'workers'
            yaml """
kind: Pod
metadata:
  name: jenkins-worker
spec:
  containers:
  - name: nodejs
    image: node:lts
    tty: true
"""
        }
    }
}
```
  • node - The node block defines the host where the Jenkins pipeline/workflow will be executed. By default this is the built-in (master) node, but it can be set to any other host as well.

Run your complete pipeline, or stage, on an agent available in Jenkins with a specific label. The behavior is similar to the label parameter, but the node parameter allows you to specify additional options such as a custom workspace. The workspace, or Jenkins Pipeline workspace, is a reserved directory on your Jenkins agent, where all your source code is downloaded and where your builds are performed.

```
pipeline {
    agent {
        node {
            label 'ubuntu_maven'
            customWorkspace '/some/path'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
    }
}
```

  • docker - Run your complete Pipeline, or stage, inside a Docker container. The container will be dynamically spawned on a pre-configured Docker host. The docker option also takes an args parameter that allows you to pass arguments directly to the docker run command:
```
agent {
    docker {
        image 'ubuntu:trusty'
        label 'ubuntu_maven'
        args  '-v /tmp:/home/dev'
    }
}
```
  • dockerfile - Run your Pipeline, or stage, inside a container built from a Dockerfile. To use this option, the Jenkinsfile and the Dockerfile should be present in the root of your source code (the dir option lets you point to a subdirectory):
```
agent {
    dockerfile true
}
agent {
    dockerfile {
        dir 'someSubDir'
    }
}
agent {
    dockerfile {
        filename 'Dockerfile.maven'
        dir 'someSubDir'
    }
}
agent {
    // Equivalent to "docker build -f Dockerfile.maven --build-arg version=0.0.7 ./someSubDir/"
    dockerfile {
        filename 'Dockerfile.maven'
        dir 'someSubDir'
        additionalBuildArgs  '--build-arg version=0.0.7'
    }
}
```

The agent section defines the worker or machine where the pipeline will be executed. This section must be defined at the top level inside the pipeline block or overridden at the stage level. The agent can be any of the following:

  • Jenkins worker or node
  • Docker container based on a Docker image or a custom Dockerfile
  • Pod deployed on a Kubernetes cluster

environment

The environment section contains a set of environment variables needed to run the pipeline steps. The variables can be defined as sequences of key-value pairs. These will be available for all steps if the environment block is defined at the pipeline top level; otherwise, the variables are stage-specific. You can also reference credential variables by using the helper method credentials().

```
pipeline {
    environment {
        REGISTRY_CREDENTIALS = credentials('DOCKER_REGISTRY')
        REGISTRY_URL = 'https://registry.domain.com'
    }
    stages {
        stage('Push') {
            steps {
                sh 'docker login $REGISTRY_URL --username $REGISTRY_CREDENTIALS_USR --password $REGISTRY_CREDENTIALS_PSW'
            }
        }
    }
}
```

The Docker registry username and password are accessible automatically by referencing the REGISTRY_CREDENTIALS_USR and REGISTRY_CREDENTIALS_PSW environment variables. Those credentials are then passed to the docker login command to authenticate with the Docker Registry before pushing a Docker image.

post

The post section contains commands or scripts that will be run upon the completion of a pipeline or stage run, depending on the location of this section within the pipeline.

The pipeline build status can be fetched by using either the currentBuild.result variable or the post-condition blocks always, success, unstable, failure, and so forth.

```
pipeline {
    post {
        always {
            echo 'Cleaning up workspace'
        }
        success {
            slackSend(color: 'good', message: "${env.JOB_NAME} Successful build")
        }
        failure {
            slackSend(color: 'danger', message: "${env.JOB_NAME} Failed build")
        }
    }
}
```
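
The build status can also be read explicitly inside a post block; a minimal sketch (currentBuild.currentResult is the never-null variant of currentBuild.result):
```
post {
    always {
        // SUCCESS, UNSTABLE, or FAILURE by the time the post section runs
        echo "Finished with status: ${currentBuild.currentResult}"
    }
}
```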
| Condition | Description |
| --- | --- |
| always | Run the steps in the post section in every case. |
| changed | Run the steps in the post section only if the completion status of your current Pipeline or stage run is different from its previous run. |
| fixed | Run the steps in the post section only if your current Pipeline or stage run is successful and the previous run failed or was unstable. |
| regression | Run the steps in the post section only if your current Pipeline or stage status is failure, unstable, or aborted, and the previous run was successful. |
| aborted | Run the steps in the post section only if your current Pipeline or stage has an "aborted" status. |
| failure | Run the steps in the post section only if your current Pipeline or stage has failed. |
| success | Run the steps in the post section only if your current Pipeline or stage is successful. |
| unstable | Run the steps in the post section only if the current Pipeline or stage run is unstable, usually caused by test failures, code violations, and so on. |

stages

A stage is a sequence of steps in a pipeline.

The stages section is the core of the pipeline. This section defines what is to be done at a high level. It contains a sequence of one or more stage directives, one for each discrete part of the CI/CD workflow. Finally, the steps section contains the series of steps to be executed in a given stage directive.

```
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'npm run test'
                sh 'npm run coverage'
            }
        }
    }
}
```

The listing defines a Test stage with instructions to run unit tests and generate code coverage reports.

You can either enter these scripts directly into a Jenkins job definition or store them in version control. If you put a file called Jenkinsfile in the root of a version control repository, Jenkins can discover it when it is configured to talk to a version control system such as GitHub.

Steps

The steps section contains one or more steps that should run inside a stage directive. The steps section is used to segregate a single or a collection of steps from the rest of the code inside a stage directive.

```
steps {
  sh 'make world'
  echo 'Hello World'
}
```

Parallel

parallel is a built-in pipeline feature used to run nested stages in parallel. Jenkins stages normally execute sequentially, one after the other. In some cases you may need to run stages in parallel to reduce the execution time of a Jenkins job; in continuous integration, for example, the unit testing and code coverage stages can run in parallel. The parallel block is supported in both types of pipeline, i.e., scripted and declarative:

```
stage('Tests') {
  parallel {
    stage('Unit testing') {
      steps {
        echo 'Unit Testing Execution'
      }
    }
    stage('Code Coverage') {
      steps {
        echo 'Code Coverage Execution'
      }
    }
  }
}
```

Directives

Directives are supporting actors that give direction, set conditions, and provide assistance to the steps of your pipeline in order to achieve the required purpose.

Environment

The environment directive is described earlier in this page.

Options

The options directive allows you to define pipeline-specific options from within the Pipeline. Jenkins comes with a few core options. Additionally, new options are available through plugins.

  • buildDiscarder - Keeps a certain number of recent build runs.
```
// Will keep the last 10 Pipeline logs and artifacts in Jenkins.
options {
    buildDiscarder(logRotator(numToKeepStr: '10'))
}
```
  • disableConcurrentBuilds - Disallows concurrent executions of the Pipeline.
```
options {
    disableConcurrentBuilds()
}
```
  • newContainerPerStage - When specified, each stage runs in a new container on the same Docker host, as opposed to all stages sharing the same container. This option only works with an agent section defined at the top level of your pipeline {} block, as sketched below.
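
A minimal sketch, assuming a Docker-based top-level agent (the image and shell commands are illustrative):
```
pipeline {
    agent {
        docker { image 'node:lts' }
    }
    options {
        // Each stage gets a fresh container on the same Docker host.
        newContainerPerStage()
    }
    stages {
        stage('Build') {
            steps { sh 'node --version' }
        }
        stage('Test') {
            steps { sh 'npm --version' }
        }
    }
}
```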

  • preserveStashes - In Jenkins, you can pass artifacts between stages using stash. The preserveStashes option allows you to preserve stashes of completed Pipelines, which is useful if you have to re-run a stage from a completed Pipeline.

```
// Preserve stashes from the most recent completed Pipeline.
options {
    preserveStashes()
}
// Preserve stashes from the ten most recent completed Pipelines.
options {
    preserveStashes(buildCount: 10)
}
```
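
For context, a minimal stash/unstash sketch (the stage, stash, and file names are illustrative):
```
stages {
    stage('Build') {
        steps {
            sh 'make world'
            // Save build output for use in a later stage
            stash name: 'binaries', includes: 'build/**'
        }
    }
    stage('Deploy') {
        steps {
            // Restore the stashed files into this stage's workspace
            unstash 'binaries'
            sh 'ls build'
        }
    }
}
```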
  • retry - The retry option allows you to retry a Pipeline, or a stage, on failure.
```
// Retry the entire Pipeline three times, on failure.
pipeline {
    agent any
    options {
        retry(3)
    }
    stages {
        stage('say hello') {
            steps {
                echo 'Hello everyone'
            }
        }
    }
}
```
```
// Retry a stage three times, on failure.
pipeline {
    agent any
    stages {
        stage('say hello') {
            options {
                retry(3)
            }
            steps {
                echo 'Hello everyone'
            }
        }
    }
}
```
  • skipDefaultCheckout In Jenkins Declarative Pipeline, the source code by default gets checked out in every stage directive of a Pipeline. Use this option to skip checking out source code in a given stage directive.
```
pipeline {
    agent any
    stages {
        stage('say hello') {
            options {
                skipDefaultCheckout()
            }
            steps {
                echo 'Hello everyone'
            }
        }
        stage('say hi') {
            steps {
                echo 'Hi everyone'
            }
        }
    }
}
```
  • timeout - The timeout option sets a timeout period for your Pipeline run. If the run exceeds the defined duration, Jenkins aborts the Pipeline. The timeout option can be used at the Pipeline or stage level.
```
pipeline {
    agent any
    options {
        timeout(time: 2, unit: 'HOURS')
    }
    stages {
        stage('Build') {
            steps {
                bat 'MSBuild.exe MyProject.proj /t:build'
            }
        }
    }
}
```
  • timestamps - The timestamps option prepends all lines of the console log with a timestamp. This is useful when debugging issues where you need the time of execution of a specific command as evidence.

It is possible to use the timestamps option at the Pipeline or stage level.

```
pipeline {
    agent any
    options {
        timestamps()
    }
    stages {
        stage('Build') {
            steps {
                bat 'MSBuild.exe MyProject.proj /t:build'
            }
        }
    }
}
```

Parameters

The parameters directive allows a user to provide a list of specific parameters when triggering a Pipeline.

  • string
```
pipeline {
    agent any
    parameters {
        string(name: 'perf_test_dur', defaultValue: '9000', description: 'Performance testing duration')
    }
    stages {
        stage ('Performance Testing') {
            steps {
                // The string parameter is available as an environment variable
                echo "${params.perf_test_dur}"
                // Execute some performance testing
            }
        }
    }
}
```

  • text
  • booleanParam
  • choice
  • file
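
A minimal sketch of these parameter types (all names, defaults, and descriptions are illustrative):
```
parameters {
    text(name: 'release_notes', defaultValue: '', description: 'Multi-line release notes')
    booleanParam(name: 'run_perf_tests', defaultValue: false, description: 'Run performance tests?')
    choice(name: 'deploy_env', choices: ['dev', 'staging', 'prod'], description: 'Target environment')
    // File parameters in Pipeline jobs require the File Parameter plugin;
    // the classic file parameter does not work well in Pipelines.
}
```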

Triggers

  • cron - The cron trigger accepts a cron-style string to define a regular interval at which the Pipeline should be re-triggered:
```
pipeline {
    agent any
    triggers {
        cron('H */4 * * 1-5')
    }
    stages {
        stage('Long running test') {
            steps {
                // Do something
            }
        }
    }
}
```

  • upstream - The upstream trigger allows you to define a comma-separated list of jobs and a threshold. When any job in the list finishes with the defined threshold, your Pipeline gets triggered. This feature is useful if your CI and CD Pipelines are two separate Jenkins Pipelines.
```
pipeline {
    agent any
    triggers {
        upstream(upstreamProjects: 'jobA', threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Say Hello') {
            steps {
                echo 'Hello.'
            }
        }
    }
}
```

Stage

A Pipeline stage contains one or more steps to achieve a task, for example: Build, Unit-Test, Static Code Analysis, etc. The stage directive contains one or more steps sections, an agent section (optional), and other stage-related directives.
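
A minimal sketch of a stage with its own (optional) agent section and a stage-level directive (the label and command are illustrative):
```
stage('Unit-Test') {
    agent {
        label 'linux'        // stage-level agent, overrides the global one
    }
    options {
        retry(2)             // stage-level directive
    }
    steps {
        sh 'mvn test'
    }
}
```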

Tools

A tools directive allows you to install a tool on your agent automatically. It can be defined inside a pipeline {} or a stage {} block.

```
pipeline {
    agent any
    tools {
        maven 'apache-maven-3.0.1'
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean verify'
            }
        }
    }
}
```

For the tools directive to work, the respective tool must be pre-configured in Jenkins under Manage Jenkins ➤ Global Tool Configuration. However, the tools directive is ignored if you put an agent none section in the pipeline {} block. As of today, only the maven, jdk, and gradle tools work in Declarative Pipeline.

Input

The input directive allows the Pipeline to prompt for user input. It works with the stage directive: when used, the stage pauses and waits for your input, and once you provide it, the stage continues to run.

| Option | Description | Required |
| --- | --- | --- |
| message | A message to describe the input. | Yes |
| id | An identifier to identify the input. Its default value is the stage name. | No |
| ok | Alternative text for the "ok" button. Its default value is "ok". | No |
| submitter | A comma-separated list of users or external group names that are allowed to submit the input. Omitting the submitter option allows any user to answer the input. | No |
| submitterParameter | The name of an environment variable to set with the submitter name, if present. | No |
| parameters | A list of parameters to prompt for, along with the input. See the parameters section for more details. | No |
```
pipeline {
    agent any
    stages {
        stage('Smoke Testing') {
            steps {
                // Perform smoke testing that's quick
                echo 'Performing smoke testing.'
            }
        }
        stage('Full Integration Testing') {
            input {
                message "Should we go ahead with full integration testing?"
                ok "Yes"
                submitter "Luke,Yoda"
                parameters {
                    string(name: 'simulators', defaultValue: '4', description: 'Test farm size')
                }
            }
            steps {
                echo "Running full integration testing using ${simulators} simulators."
            }
        }
    }
}
```

When

The when directive allows your Pipeline to decide whether a stage directive should be run based on some conditions.

The when directive must contain at least one condition. If the when directive contains more than one condition, all the child conditions must return true for the stage to execute.

Branch

Run a stage directive only if the branch that’s being built matches the branch pattern provided.

A pipeline {} block containing a when directive with a branch condition:

```
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Performing a Build.'
            }
        }
        stage('Performance Testing') {
            when {
                branch "release-*"
            }
            steps {
                echo 'Performing performance testing.'
            }
        }
    }
}
```
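
When several conditions are combined, all of them must hold for the stage to run; a minimal sketch using allOf (the condition values are illustrative):
```
stage('Deploy') {
    when {
        allOf {
            branch 'release-*'
            environment name: 'DEPLOY_TARGET', value: 'production'
        }
    }
    steps {
        echo 'Deploying.'
    }
}
```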

Test Jenkinsfile

  • Develop the code within the Jenkins server as a Pipeline project first; then convert it to a Jenkinsfile afterward.
  • Use Blue Ocean mode as a playground.
  • Use the declarative pipeline linter, which you can run against Jenkinsfiles outside Jenkins to detect problems early. You can use the Jenkins RESTful API to validate the Jenkinsfile syntax by issuing an HTTP/HTTPS POST request, or run the linter over SSH:
```
curl -X POST -L --user USERNAME:TOKEN JENKINS_URL/pipeline-model-converter/validate -F "jenkinsfile=<Jenkinsfile"
ssh -p $JENKINS_SSHD_PORT $JENKINS_HOSTNAME declarative-linter < Jenkinsfile
```

Replace USERNAME, TOKEN, and JENKINS_URL in the first command, and JENKINS_HOSTNAME and JENKINS_SSHD_PORT in the second, based on the URL, port, and credentials of your Jenkins instance.

  • Jenkins Replay button - Similar to the Rebuild button, but allows you to edit the Jenkinsfile content just before running the job. The Replay feature allows for quick modification and execution of an existing pipeline without changing the pipeline configuration or creating a new commit. It's ideal for rapid iteration and prototyping of a pipeline.

Simple pipeline

  1. Install the plugins: Docker plugin and Docker Pipeline.
  2. Create a new job of type Pipeline.
  3. Call it Hello Docker.
  4. In the Pipeline section, enter this script:
```
pipeline {
    agent { docker { image 'alpine:20191114' } }
    stages {
        stage('build') {
            steps {
                sh 'echo "Hello, World (Docker for Developers Chapter 7)"'
            }
        }
    }
}
```
  5. Save the job and click on the Build Now link.
  6. Follow the link for #1 that appears on the left and then click on the Console Output button.

Connect to a server using SSH

In the snippets below, the prod variable holds the SSH target in user@host form (the concrete value is a placeholder).

  1. Generate an SSH key pair:
```
ssh-keygen -t rsa -b 2048 -f jenkins.shipit

cat jenkins.shipit
```

  2. Go to your Jenkins home page, and in the left-hand menu navigate to the Manage Jenkins link, then to the Manage Credentials link, then to System | Global credentials (unrestricted), and create a credential of the kind SSH Username with private key.
  3. Give it the ID jenkins.shipit and enter the username of the non-root user from the server.
  4. Paste the contents of jenkins.shipit into the key textbox and save.
  5. Copy the SSH public key, jenkins.shipit.pub, from your local system to the production server and append it to the ~/.ssh/authorized_keys file:

```
prod=user@your-server   # placeholder: the SSH target in user@host form
ssh $prod mkdir -p .ssh
ssh $prod tee -a .ssh/authorized_keys < jenkins.shipit.pub
ssh $prod chmod 700 .ssh
ssh $prod chmod 600 .ssh/authorized_keys
```
  6. Create the pipeline:
```
pipeline {
   agent any
   stages {
      stage('SSH') {
         steps {
            withCredentials([sshUserPrivateKey(
                credentialsId: 'jenkins.shipit',
                keyFileVariable: 'keyfile')]) {
                    sh '''
prod=user@your-server   # placeholder: the SSH target in user@host form
cmd="docker ps"
ssh -i "$keyfile" -o StrictHostKeyChecking=no $prod $cmd
'''
                }
         }
      }
   }
}
```

### Using Git and GitHub to store your Jenkinsfile
1. Sign in to GitHub, and go to https://github.com/settings/tokens and generate a token that has both the repo and admin:repo_hook scopes. Copy the generated token to the clipboard.
2. Navigate through credentials to the Jenkins global credentials and create a Global credentials (unrestricted) credential of the type Username with password. Put in your GitHub username, paste the security token from the clipboard, and give it an ID of github.repo.username and a description of GitHub repo credentials (username), replacing username with your actual GitHub username.
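
The following sample Jenkinsfile can be stored in such a repository; it assumes the Artifactory plugin is installed, with a server configured in Jenkins under the name 'Default Artifactory':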
```
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean verify -DskipITs=true'
            }
        }
        stage('Static Code Analysis') {
            steps {
                sh 'mvn clean verify sonar:sonar'
            }
        }
        stage ('Publish to Artifactory') {
            steps {
                script {
                    def server = Artifactory.server 'Default Artifactory'
                    def uploadSpec = """{
                        "files": [
                            {
                                "pattern": "target/*.war",
                                "target": "helloworld/${BUILD_NUMBER}/",
                                "props": "Unit-Tested=Yes"
                            }
                        ]
                    }"""
                    server.upload(uploadSpec)
                }
            }
        }
    }
}

```
#### Loops in Jenkinsfiles 
```
// Related to https://issues.jenkins-ci.org/browse/JENKINS-26481

abcs = ['a', 'b', 'c']

node('master') {
    stage('Test 1: loop of echo statements') {
        echo_all(abcs)
    }
    
    stage('Test 2: loop of sh commands') {
        loop_of_sh(abcs)
    }
    
    stage('Test 3: loop with preceding SH') {
        loop_with_preceding_sh(abcs)
    }
    
    stage('Test 4: traditional for loop') {
        traditional_int_for_loop(abcs)
    }
}

@NonCPS // has to be NonCPS or the build breaks on the call to .each
def echo_all(list) {
    list.each { item ->
        echo "Hello ${item}"
    }
}
// outputs all items as expected

@NonCPS
def loop_of_sh(list) {
    list.each { item ->
        sh "echo Hello ${item}"
    }
}
// outputs only the first item

@NonCPS
def loop_with_preceding_sh(list) {
    sh "echo Going to echo a list"
    list.each { item ->
        sh "echo Hello ${item}"
    }
}
// outputs only the "Going to echo a list" bit

//No NonCPS required
def traditional_int_for_loop(list) {
    sh "echo Going to echo a list"
    for (int i = 0; i < list.size(); i++) {
        sh "echo Hello ${list[i]}"
    }
}
// echoes everything as expected
```

```
SERVERDIRS = [ "%SERVERLINUXDIR%", "%SERVERLINUXARMDIR%" ]

pipeline {
    agent any
    environment {
        SERVERLINUXDIR    = "Linux"
        SERVERLINUXARMDIR = "Linux-ARM"
    }
    stages {
        stage('Debug') {
            steps {
                script {
                    // %VAR% is expanded by the Windows shell, so each bat call
                    // echoes the value of the corresponding environment variable
                    for (int i = 0; i < SERVERDIRS.size(); i++) {
                        bat "echo Test Var ${SERVERDIRS[i]}"
                    }
                }
            }
        }
    }
}
```



.each also works in a declarative pipeline when placed inside a script {} block, using the String[] returned by split rather than a List:
```
script {
    def platforms = "linux-x64,darwin-x64,linux-arm"
    platforms.split(',').each {
        sh "echo Something something ${it} ..."
    }
}
```

A fast connection check for different endpoints:
```
node('master') {	
	stagesWithTry([
		'https://google.com/'
		,'https://github.com'
		,'https://releases.hashicorp.com/'
		,'https://kubernetes-charts.storage.googleapis.com'
		,'https://gcsweb.istio.io/gcs/istio-release/releases'
	])
}

def stagesWithTry(list){
	for (int i = 0; i < list.size(); i++) {
		try {	
			stage(list[i]){
					sh "curl --connect-timeout 15 -v -L ${list[i]}"
			} 
		} catch (Exception e) {
			echo "Stage failed, but we continue"  
		}
	}
}
```
