Microtica allows you to define custom variables that can be used to parametrize the way your pipeline works.

In each pipeline, Microtica will automatically inject a predefined set of variables such as MIC_PIPELINE_EXECUTION_ID, MIC_USER_NAME, and MIC_COMMIT_REF, which can be used in conjunction with the custom variables.

Use pipeline variables when you want to:

  • parametrize the way your pipeline works
  • avoid hard-coding within the step execution logic
  • avoid storing sensitive information within the pipeline yaml

Microtica offers two types of variables:

  • plaintext variables
  • sensitive variables

Variables can be added when creating or modifying a pipeline from the GUI. Environment variables will automatically become available in the pipeline runtime environment using the standard Unix syntax ${variable-name}.
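As an illustration, a custom variable defined from the GUI can be consumed in a step's commands. In the sketch below, NPM_TOKEN and the step name Build are hypothetical examples, not part of Microtica's predefined set:

```yaml
steps:
  Build:
    image: node:latest
    commands:
      # NPM_TOKEN is a custom pipeline variable defined from the GUI;
      # at runtime it is exposed as an ordinary environment variable
      - echo "//registry.npmjs.org/:_authToken=${NPM_TOKEN}" > .npmrc
      - npm install
```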

Pipeline environment variables

In the following example, we are using the custom variables SONAR_URL and SONAR_LOGIN.

We can reference custom as well as Microtica predefined variables directly in the step parameters. In this case, we are setting the tag for the Docker image to the value of "{{MIC_PIPELINE_EXECUTION_ID}}".


    steps:
      SonarScan:
        image: node:latest
        commands:
          - sonar-scanner -Dsonar.host.url=${SONAR_URL} -Dsonar.login=${SONAR_LOGIN}
      DockerPush:
        type: docker-push
        image_name: microtica/user-service
        tag: "{{MIC_PIPELINE_EXECUTION_ID}}"
        registry: dockerhub

Since we marked the SONAR_LOGIN variable as sensitive, its value will be stored encrypted and securely decrypted only within the step execution boundary.

Usage syntax

Use the ${variable-name} syntax when you want to reference predefined or custom environment variables from the commands section (i.e. the step's runtime instructions).

Use the "{{variable-name}}" syntax for templating in the microtica.yaml file. Variables will be replaced with their respective values before pipeline execution.

When using templating, always put variables in quotation marks.
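Putting the two syntaxes together, a minimal sketch (step names are illustrative):

```yaml
steps:
  DockerPush:
    type: docker-push
    image_name: microtica/user-service
    # templating syntax: substituted before the pipeline runs; note the quotes
    tag: "{{MIC_PIPELINE_EXECUTION_ID}}"
    registry: dockerhub
  Notify:
    image: node:latest
    commands:
      # environment-variable syntax: resolved at runtime inside commands
      - echo "Pushed image for execution ${MIC_PIPELINE_EXECUTION_ID}"
```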

Security consideration

Never hard-code sensitive data in pipeline yaml.

For security reasons, always enable the sensitive flag when specifying parameters such as passwords, secret tokens, or any other data considered sensitive.

Predefined variables

In each step execution, Microtica will automatically inject a set of predefined variables you can use within the pipeline.

  • MIC_COMMIT_TYPE: The type of reference associated with the change. Possible values: branch or tag.
  • MIC_COMMIT_REF: Commit ref. For commit type branch the value will be the commit hash; for tags, the name of the tag.
  • MIC_COMMIT_NAME: The name of the branch or tag.
  • MIC_BRANCH_FILTER: Pipeline branch filter. The pipeline will be triggered only if the commit branch matches the branch filter.
  • MIC_COMMIT_MESSAGE: Commit message associated with the Git repository revision.
  • MIC_GIT_PROVIDER: Git provider. Possible values: bitbucket or github.
  • MIC_GIT_ACCESS_TOKEN: Temporary Git access token for the repository associated with the pipeline.
  • MIC_MANUAL_TRIGGER: Indicates whether the pipeline was triggered by a Git webhook or manually by the user. Possible values: true or false.
  • MIC_PIPELINE_ID: The pipeline ID.
  • MIC_PIPELINE_NAME: The pipeline name.
  • MIC_PIPELINE_EXECUTION_ID: A unique identifier for the current pipeline execution.
  • MIC_PIPELINE_EXECUTION_TIMESTAMP: Timestamp of the pipeline trigger time in UTC format.
  • MIC_USER_ID: User ID. For a manual pipeline trigger the value is the Microtica user ID; for a Git webhook it's the Git user ID.
  • MIC_USER_NAME: The name of the user who triggered the pipeline.
  • MIC_USER_AVATAR: URL of the user's avatar picture.
  • MIC_WORKDIR: Path to a Git subdirectory. Suitable for monorepos. If specified, the pipeline will be triggered only if files in the specified directory are modified. Microtica expects to find microtica.yaml in the specified directory.
  • MIC_REPO_FULLNAME: Repository full name.
  • MIC_REPO_NAME: Repository name.
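For example, predefined variables can be read directly in a step's commands. A sketch, assuming the step layout from the examples above (the step name Build is illustrative):

```yaml
steps:
  Build:
    image: node:latest
    commands:
      - echo "Building ${MIC_REPO_NAME} at commit ${MIC_COMMIT_REF}"
      # MIC_MANUAL_TRIGGER distinguishes manual runs from webhook-triggered ones
      - if [ "${MIC_MANUAL_TRIGGER}" = "true" ]; then echo "Started by ${MIC_USER_NAME}"; fi
```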

Step variables

Each generic step can export data in JSON format that can be referenced from other steps.

To allow a step to export data, use the JSON Artifacts notation in that specific step.

A common scenario: you provision infrastructure in one step and deploy an application in the next, but the deployment needs information about the provisioned infrastructure.


    steps:
      DeployInfra:
        image: hashicorp/terraform:latest
        commands:
          - terraform init
          - terraform apply -auto-approve
          - terraform output -json > terraform.output
        artifacts:
          json: terraform.output
      DeployApp:
        image: aws/codebuild/standard:4.0
        commands:
          - aws s3 sync . ${DeployInfra.artifactsBucket}

Here we defined an artifact of type json and specified the path to the JSON file, in this case terraform.output.


    {
      "artifactsBucket": "my-website-bucket"
    }

The variable artifactsBucket exported from the first step can be referenced in the second step by simply using the ${DeployInfra.artifactsBucket} syntax, where DeployInfra is the name of the first step.

Similarly, you can build and push a Docker image in one step and use the image URL in the next step to trigger deployment.

    steps:
      DockerPush:
        type: docker-push
        image_name: user-service
        tag: v0.1.0
        registry: dockerhub
      DeployApp:
        image: node:latest
        commands:
          - ./deploy-my-app --image ${DockerPush.image}