Pipeline definition
Additional information about job pipeline definition.
This section describes the structure and components of the pipeline defined in YAML format. Each stage represents a distinct phase of the pipeline, and within each stage, there are multiple jobs that execute specific tasks.
Basic example of pipeline definition and execution order
build:
  label: Build firmware stage
  description: first stage
  allow-fail: false
  jobs:
    build-firmware:
      job: Job Build Firmware
      parameters:
        first_param: first_value
        second_param: second_value
    test-firmware:
      job: Job Test Firmware
      wait:
        - build-firmware
      if: first_param == value
      use-artifacts:
        - from: build-firmware
          collect: "*.so"
          destination: destination_1
deploy:
  label: Deploy stage
  description: second stage
  allow-fail: false
  jobs:
    deploy-firmware:
      job: Job Deploy Firmware
      use-artifacts:
        - from: build-firmware
          collect: .so
          destination: destination_2
In this example, the stages are named build and deploy, and they are executed sequentially. Jobs within a stage are executed in parallel unless a Job has a wait directive, which makes it wait for the specified Jobs within the stage to finish.
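The ordering rules above can be sketched as a small scheduler simulation. This is an illustrative sketch, not the real engine: the function groups one stage's jobs into "waves" that could start in parallel, honoring wait dependencies and flagging the deadlock case mentioned later in this section.

```python
# Hypothetical sketch of the execution order described above: jobs inside a
# stage start in parallel, except that a job with a `wait` list starts only
# after all the listed jobs have finished. Job ids follow the example
# pipeline; the scheduler itself is an assumption for illustration.

def execution_waves(jobs):
    """Group one stage's jobs into waves that can run in parallel.

    `jobs` maps a job id to the list of job ids it waits on.
    Raises ValueError if the wait directives form a cycle (deadlock).
    """
    done, waves = set(), []
    while len(done) < len(jobs):
        wave = [j for j, deps in jobs.items()
                if j not in done and all(d in done for d in deps)]
        if not wave:  # nothing can start, so the wait lists form a cycle
            raise ValueError("wait cycle detected")
        waves.append(sorted(wave))
        done.update(wave)
    return waves

# The build stage from the example: test-firmware waits on build-firmware.
print(execution_waves({"build-firmware": [], "test-firmware": ["build-firmware"]}))
# [['build-firmware'], ['test-firmware']]
```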
Configuration:
- *Stage represents a phase of the pipeline. A pipeline can have as many stages as needed. The name of a stage is arbitrary but must be unique.
- Label is an optional directive whose text is shown as the stage name on other pages instead of the stage identifier.
- Description is an optional directive which is a brief explanatory text providing information or context about the purpose of the stage.
- Allow-fail is an optional directive that permits or disallows continuation of the remaining stages of the pipeline depending on this stage's result.
- *Jobs is a list of jobs that will be executed as part of stage execution.
- *Job has a unique identifier, such as build-firmware or test-firmware in (Basic example of pipeline definition and execution order). The keyword job, which names a previously created Job in the Jobs section, determines which job is used.
- Parameters is an optional directive listing the parameters a user should provide when triggering the Pipeline. Each parameter has a Name and a Value.
- Wait is an optional directive specifying the list of job identifiers that must finish before this Job starts. The user must take care not to form a cycle, which would result in a deadlock.
- Use-artifacts is an optional directive that specifies a list of artifact usages from other jobs within the same or previous stages. When this directive is used, the Artifact will be downloaded to the working directory. Note that if any job in the Pipeline clones a repository or files into the working directory, the Artifact can be overwritten by those changes.
- *From specifies the source of the artifact.
- *Collect defines which artifacts will be collected.
- Destination specifies the location where the artifacts will be placed. If not specified, artifacts are downloaded into the workspace root.
- If is an optional directive specifying a condition that must be met for the Job to be executed.
- variable existence check - if the user adds $parameter_name to the if clause, the Job will be executed as long as a parameter with that name is defined.
- variable equal(==)/not equal(!=) to a certain value - if the user adds $parameter_name == some_value to the if clause and the values match, the Job will be executed.
- variable containing(~~)/not containing(!~) a certain value - if the user adds $parameter_name ~~ some_value to the if clause and the value of parameter_name contains some_value, the Job will be executed.
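The three if-clause forms above can be made concrete with a small evaluator. This is a hedged sketch: the parameter dictionary and the parsing are assumptions for illustration, and the real engine may tokenize conditions differently.

```python
# Illustrative evaluator for the three documented `if` forms:
# existence check ($name), ==/!=, and ~~/!~ (contains / not-contains).
import re

def evaluate_if(clause, params):
    """Evaluate an if-clause such as '$name == value' against parameters."""
    m = re.match(r"\$(\w+)\s*(==|!=|~~|!~)\s*(\S+)$", clause.strip())
    if m is None:  # bare "$name": true when the parameter is defined
        return clause.strip().lstrip("$") in params
    name, op, value = m.groups()
    actual = params.get(name, "")
    if op == "==":
        return actual == value
    if op == "!=":
        return actual != value
    if op == "~~":
        return value in actual
    return value not in actual  # "!~"

params = {"first_param": "first_value"}
print(evaluate_if("$first_param == first_value", params))  # True
print(evaluate_if("$first_param ~~ value", params))        # True
print(evaluate_if("$missing", params))                     # False
```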
Global variables
Inside any stage, global variables can be defined by using the globals option (defining the default value of a variable is optional). Global variables are variables with Pipeline scope that can be used to carry values from one Job execution to another, either by passing the value as a Job execution parameter or by using it in an if clause evaluation.
stage-1:
  label: first stage
  globals:
    - global_variable: value_one
  jobs:
    job-1:
      job: job1
      if: global_variable == value_one
      parameters:
        job_parameter: $global_variable
Once a global variable is defined in a certain stage, all Jobs that belong to the same or later stages can access its value via an environment variable and update it with the update-tth-globals command. This is an example of using and updating the global variable global_variable inside a job's execution on Linux:

echo $global_variable
update-tth-globals global_variable new_value

And the equivalent on Windows:

echo %global_variable%
update-tth-globals global_variable new_value
The global variable's name should be unique across the whole Pipeline definition.
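The $global_variable reference in the example's parameters can be illustrated with a small substitution sketch. The store layout and the substitution rule are assumptions for illustration only; they show the intent of passing a global's current value as a Job execution parameter, not the tool's actual implementation.

```python
# Hypothetical sketch: resolving '$name' parameter values from a
# pipeline-scoped globals store before a Job starts.

def resolve_parameters(parameters, globals_store):
    """Replace '$name' parameter values with the current global value."""
    resolved = {}
    for key, value in parameters.items():
        if isinstance(value, str) and value.startswith("$"):
            resolved[key] = globals_store[value[1:]]
        else:
            resolved[key] = value
    return resolved

# The globals example: job_parameter receives global_variable's value.
globals_store = {"global_variable": "value_one"}
print(resolve_parameters({"job_parameter": "$global_variable"}, globals_store))
# {'job_parameter': 'value_one'}
```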
A more complex example of a definition with global variables can be seen here.