# GitLab CI
By default, jobs without a `stage` property are assigned to the implicit `test` stage. Another implicit stage is `build`. Each job is executed in a fresh container.
Predefined environment variables: https://docs.gitlab.com/ee/ci/variables/predefined_variables.html

List all available variables:

```yaml
job_name:
  script:
    - export
```
Cache holds on to files between pipeline runs; `key` acts as an ID of the cache and determines when it can be reused. Cache can also be specified globally. It is meant to optimize subsequent pipeline executions that use the same packages/dependencies.
```yaml
build:
  stage: build
  cache:
    key: ${CI_COMMIT_REF_SLUG} # current branch
    paths:
      - foo/
```
https://docs.gitlab.com/ee/ci/jobs/job_artifacts.html
Artifacts are designed to store build results of compilation/generation (intermediate build results), test reports, etc. They should be used when such files need to be passed between stages. By default, an artifact created in one job can not be used by another job in the same stage; it can be with the `needs` keyword. Artifacts from jobs in earlier stages are shared implicitly. `dependencies` at the job level allows specifying which jobs' artifacts should be fetched.
```yaml
build:osx:
  stage: build
  script: make build:osx
  artifacts:
    paths:
      - binaries/

build:linux:
  stage: build
  script: make build:linux
  artifacts:
    paths:
      - binaries/

test:osx:
  stage: test
  script: make test:osx
  dependencies:
    - build:osx

test:linux:
  stage: test
  script: make test:linux
  dependencies:
    - build:linux

deploy:
  stage: deploy
  script: make deploy
```
Jobs download all artifacts from the completed jobs in previous stages by default. To prevent a job from downloading any artifacts, set `dependencies` to an empty array (`[]`).
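For example, a job that needs no artifacts at all might disable the download like this (a minimal sketch; the job name and script are illustrative):

```yaml
lint:
  stage: test
  script: make lint
  dependencies: []   # download no artifacts from previous stages
```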
Hands-on lab: https://handbook.gitlab.com/handbook/customer-success/professional-services-engineering/education-services/gitlabcicdhandsonlab7/
Cache is meant to be used as temporary storage for dependencies. By default, items are stored only if the job succeeds; set `when: always` (under `cache` or `artifacts`) to change this behavior.
`artifacts:reports` accepts some standard test report formats that GitLab can also visualize.
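For instance, a JUnit-style test report can be uploaded for visualization (a minimal sketch; the test command and report path are assumptions):

```yaml
test:
  stage: test
  script:
    - make test          # assumed to produce report.xml in JUnit format
  artifacts:
    when: always         # upload the report even if tests fail
    reports:
      junit: report.xml
```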
When cache is enabled, it is shared between pipelines and jobs at the project
level.
If there is a manual job, the jobs after it still run by default. To prevent that, set `allow_failure: false` on the manual job; further pipeline execution is then blocked until the manual job is run.
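A manual gate job might look like this (sketch; stage and job names are illustrative):

```yaml
approve:
  stage: approve
  script: echo "Approved"
  when: manual
  allow_failure: false   # later stages wait until this job is run and succeeds
```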
The `only` clause allows explicitly defining when a job should run:
```yaml
build:
  stage: build
  only:
    - master
    - merge_requests
```
Dynamic environment - when a job runs for MRs and the environment name includes a variable, environments are created dynamically. By default, dynamic environments continue to run even after the branch has been removed. The Stop button doesn't do anything unless configured: add a separate job with `environment:action: stop`, and the original job that created the environment should link to it via `environment:on_stop: <job_name>`. Now when the MR is merged (and the branch is deleted), the `stop_env` job is triggered on the MR.
```yaml
deploy_review:
  stage: deploy
  script: echo "Deploy review app"   # placeholder deploy command
  environment:
    name: review/$CI_COMMIT_REF_NAME
    url: example.com
    on_stop: stop_env

stop_env:
  stage: deploy
  only:
    - merge_requests
  variables:
    GIT_STRATEGY: none
  script:
    - echo "Remove env"
  when: manual
  environment:
    name: review/$CI_COMMIT_REF_NAME
    action: stop
```
`before_script` defines commands/scripts to run before the `script` block. It can be defined globally or locally within a job. `after_script` works in a similar fashion, but the working directory is reset to the default and the commands are executed in a separate shell context from `script` and `before_script`.
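A sketch of both keywords on a single job (the commands are placeholders):

```yaml
build:
  before_script:
    - echo "preparation, runs in the same shell as script"
  script:
    - echo "main work"
  after_script:
    - echo "cleanup, separate shell, working directory reset"  # runs even if script fails
```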
Disable a job by prepending `.` to its name, e.g. `.build`.
YAML anchors allow referencing the value of another key, thus avoiding duplication. An anchor is defined by prepending `&` to a value; to reuse that value elsewhere, reference the anchor with an alias, `*`.
```yaml
person:
  name: &first_name John
  fullName: *first_name
```
An anchor can also merge an entire mapping. This concept can be used with disabled jobs, which essentially become templates where the common structure is defined; job-specific behavior is then provided through variables in the jobs that utilize the template.
```yaml
foo: &foo
  key_1: value_1
  key_2: value_2

bar:
  <<: *foo
  key_3: value_3
```

```yaml
.build_template: &build
  stage: build
  script:
    - echo "Running $TYPE build"

build_prod:
  <<: *build
  variables:
    TYPE: prod

build_staging:
  <<: *build
  variables:
    TYPE: staging
```
Overwrite the entrypoint for a container:

```yaml
build:
  image:
    name: foo
    entrypoint: [""]
```
Stages occur in sequence by default; their order is declared with the `stages` keyword, and jobs can run outside of that order with keywords such as `needs`.
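A sketch of declaring the stage order:

```yaml
stages:
  - build
  - test
  - deploy   # jobs in later stages start only after earlier stages finish
```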
`default` sets default values for job parameters and applies them to all jobs, which can overwrite them if needed. The most commonly used parameters are `image` and `services`.
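A sketch of the `default` section (image and service names are illustrative):

```yaml
default:
  image: alpine:3.19
  services:
    - docker:dind

build:
  image: golang:1.22   # overrides the default image for this job only
  script: go build ./...
```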
Variables (https://docs.gitlab.com/ee/ci/variables/) can be defined in several places:
- Predefined
- GitLab UI (project, group, instance level)
- YAML file
- when triggering a pipeline (also as URL parameters)
Variables can also be set in one job and inherited by jobs in later stages (via a dotenv report artifact).
```yaml
build:
  stage: build
  script:
    - echo "BUILD_VARIABLE=value_from_build_job" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  variables:
    BUILD_VARIABLE: value_from_deploy_job
  script:
    - echo "$BUILD_VARIABLE" # prints value_from_build_job, inherited variable wins due to precedence
```
Precedence from highest to lowest:
- UI when triggering the pipeline (or API request)
- variables configured for project, group, instance
- inherited variables
- configured in `variables` block
- predefined variables
Variables in the pipeline configuration can be defined globally or at the job level.
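For example (variable names and values are illustrative):

```yaml
variables:            # global, available to every job
  DEPLOY_ENV: staging

deploy:
  stage: deploy
  variables:          # job-level, overrides the global value for this job
    DEPLOY_ENV: production
  script:
    - echo "$DEPLOY_ENV"   # prints "production"
```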
https://about.gitlab.com/blog/2020/12/10/basics-of-gitlab-ci-updated/ https://about.gitlab.com/blog/2022/03/09/efficient-pipelines/
Rules are evaluated in order until the first match, so order matters. When the first condition is met, the job is either included or excluded from the pipeline depending on the configuration. https://docs.gitlab.com/ee/ci/jobs/job_control.html
Default values:
- `when: on_success`
- `allow_failure: false`

A job is not added to the pipeline if no rule matches and no standalone `when: on_success`, `when: delayed`, or `when: always` rule is present, or if a matching rule specifies `when: never`. Otherwise, the job runs. A standalone rule is simply a last rule in the list that has no condition.
Only on MR:

```yaml
job:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
```
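A standalone rule as the last entry acts as a fallback; for example (sketch, branch name and settings are illustrative):

```yaml
job:
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
      when: always
    - when: manual          # standalone rule: no condition, used as fallback
      allow_failure: true
```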
## needs
Allows defining direct dependencies between jobs so that jobs, even ones in different stages, can run in parallel as soon as the jobs they need have finished. The result is a directed acyclic graph (DAG).
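A sketch reusing the build/test jobs from the artifacts example above:

```yaml
build:linux:
  stage: build
  script: make build:linux
  artifacts:
    paths:
      - binaries/

test:linux:
  stage: test
  needs: ["build:linux"]   # starts as soon as build:linux finishes; also fetches its artifacts
  script: make test:linux
```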
# Templates
`include` keyword is used to pull a regular pipeline configuration file into the current one:
- `include.template` - content provided by GitLab
- `include.project`, `include.file` - file located in another project on the same GitLab instance
- `include.local` - file located in the same repo
- `include.remote` - file reachable by full URL, e.g. publicly available outside of your org
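A sketch combining the four forms (all file paths, project names, and URLs are hypothetical, except the GitLab-provided template name):

```yaml
include:
  - template: 'Auto-DevOps.gitlab-ci.yml'        # provided by GitLab
  - project: 'my-group/ci-templates'             # another project on the same instance
    file: '/templates/build.yml'
  - local: '/ci/common.yml'                      # file in the same repository
  - remote: 'https://example.com/ci/deploy.yml'  # publicly reachable URL
```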
https://about.gitlab.com/blog/2023/05/01/how-to-build-reusable-ci-templates/
https://about.gitlab.com/blog/2023/07/10/introducing-ci-components/
https://about.gitlab.com/blog/2023/05/08/use-inputs-in-includable-files/
# extends
Similar to YAML anchors, but also allows reusing configuration from different YAML files. Supports up to 11 levels of nesting, while no more than 3 is recommended.
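The anchors example above could be written with `extends` instead (sketch):

```yaml
.build_template:
  stage: build
  script:
    - echo "Running $TYPE build"

build_prod:
  extends: .build_template
  variables:
    TYPE: prod
```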
# Security and Compliance
https://handbook.gitlab.com/handbook/customer-success/professional-services-engineering/education-services/gitlabcicdhandsonlab9/
# Links
+ https://docs.gitlab.com/ee/ci/triggers/
+ https://docs.gitlab.com/ee/ci/yaml/
+ https://docs.gitlab.com/ee/ci/pipelines/pipeline_efficiency.html
+ https://docs.gitlab.com/ee/user/group/compliance_frameworks.html
+ https://docs.gitlab.com/ee/ci/yaml/yaml_optimization.html
+ https://docs.gitlab.com/ee/ci/yaml/inputs.html