How I build a Gitlab CI pipeline as a shared common library pipeline

Nguyễn Hoàng Minh Quân
6 min read · Nov 29, 2023


In this article, I will explain how to create a shared common library pipeline: a centralized repository of tools, libraries, and CI/CD configuration that can be reused across different projects.

Benefits of the shared common library pipeline

  1. Efficiency: Developers spend less time configuring pipelines.
  2. Reliability: Fewer errors and improved version management contribute to more reliable pipelines.
  3. Scalability: As projects grow, the shared common library pipeline can accommodate new components and updates without disrupting existing pipelines.
  4. Standardization: The shared common library pipeline enforces standardized versions of tools and libraries, reducing compatibility issues and ensuring consistent build environments.

YAML engineering

GitLab’s CI/CD pipelines offer a set of powerful configuration features that, used correctly, make for concise, durable, and readable pipeline configuration files.

  • Anchors (&) let you mark a section of config for future reference.
  • Aliases (*) refer back to an anchored section of config.
  • Merge keys (<<:) let you insert an anchored section of config into another section.
  • extends: reuses configuration across multiple jobs.
  • !reference: a custom YAML tag that selects keyword configuration from other job sections and reuses it in the current section.
  • inputs & include: define input parameters for CI/CD configuration intended to be added to a pipeline with include; use include:inputs to pass input values when building the configuration for a pipeline.
  • Hidden jobs: temporarily disable a job without deleting it from the configuration file, or use it as a reusable template.
  • Override: configuration defined in the including file can override configuration pulled in with include.
  • Merge: included configuration is merged with the local configuration of the file that includes it.
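As a quick illustration of the first few features, here is a minimal hypothetical snippet (job names and commands are invented for the example) combining anchors, aliases, merge keys, extends, and !reference:

```yaml
.defaults: &defaults          # anchor (&) marks this mapping for reuse
  image: alpine:3.18
  tags: [docker]

.base-script:                 # hidden job used purely as a template
  script:
    - echo "shared step"

lint:
  <<: *defaults               # merge key (<<:) + alias (*) inserts the anchor
  script:
    - echo "lint"

test:
  extends: .defaults          # extends reuses the hidden job's configuration
  script:
    - !reference [.base-script, script]   # pulls in .base-script's script lines
    - echo "test"
```

Note that anchors and aliases only work within a single file, while extends and !reference also work across included files.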

Let’s start

To illustrate, I will use a Helm CD pipeline. Let's begin by breaking the configuration into multiple YAML files instead of a single .gitlab-ci.yml. The file structure looks like the following.

├── blueprint-template.yml
├── jobs
│   ├── common.yml
│   ├── helm-prepare.yml
│   ├── helm-upgrade.yml
│   └── validation.yml
└── stages
    └── deploy-common.yml

Let's dive into the jobs folder, which is the location for the common functions.

A common function can be a shared bash function, shared logic, a shared job, or a shared before_script.
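For example, a shared before_script can live in a hidden job and be pulled into any job that needs it. The registry login below is a hypothetical sketch, and ECR_REGISTRY is an assumed CI/CD variable:

```yaml
# Hypothetical entry in jobs/common.yml: log in to ECR once, reuse everywhere.
.ecr-login:
  before_script:
    - aws ecr get-login-password --region ap-southeast-1 | docker login --username AWS --password-stdin "${ECR_REGISTRY}"

# A consuming job inherits the before_script via extends.
build:
  extends: .ecr-login
  script:
    - docker build -t "${ECR_REGISTRY}/app:${CI_COMMIT_SHORT_SHA}" .
```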

To implement this, I use hidden jobs (names starting with .) as templates for reusable configuration, combined with the extends keyword or the !reference tag, as follows.

jobs/common.yml

.assume-role:
  script: |
    function assume_role() {
      local role_arn="$1"
      local external_id="$2"

      if [ -z "${external_id}" ]; then
        echo "Assuming role ${role_arn}"
        aws sts assume-role --role-arn "${role_arn}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
      else
        echo "Assuming role ${role_arn} with external id ${external_id}"
        aws sts assume-role --role-arn "${role_arn}" --external-id "${external_id}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
      fi

      export AWS_ACCESS_KEY_ID=$(jq -r '.Credentials.AccessKeyId' assume-role-output.txt)
      export AWS_SECRET_ACCESS_KEY=$(jq -r '.Credentials.SecretAccessKey' assume-role-output.txt)
      export AWS_SESSION_TOKEN=$(jq -r '.Credentials.SessionToken' assume-role-output.txt)
    }

Here I have an AWS assume-role function. You might wonder why I declare a bash function block: once you include it, you can execute it multiple times, as follows.

.helm-prepare:
  script:
    - !reference [.assume-role, script]
    - assume_role "${ASSUME_ROLE_ARN}" "${EXTERNAL_ID}"

Next is the stages folder, which can be considered a group of jobs that forms a general workflow.

I use the inputs keyword. With input parameters in CI templates, you can parameterize almost any keyword in the template, including stage, script, or the job name. In my case, I want to deploy to several AWS accounts, and inputs keep my code DRY.

Ref: https://about.gitlab.com/blog/2023/05/08/use-inputs-in-includable-files/
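Before the real file, here is a minimal sketch of the mechanism (file and job names are hypothetical): inputs declared in a spec header can carry defaults, and $[[ inputs.x ]] is interpolated when the file is included.

```yaml
# Hypothetical included file: greet.yml
spec:
  inputs:
    environment:
      default: nonprod
---
greet-$[[ inputs.environment ]]:
  script:
    - echo "Hello from $[[ inputs.environment ]]"
```

Including this file with inputs set to `environment: prod` produces a job named greet-prod.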

spec:
  inputs:
    environment:
    values_file_directory:
---
.private:
  variables:
    release_name_$[[ inputs.environment ]]: ""

include:
  - jobs/common.yml
  - jobs/helm-prepare.yml
  - jobs/helm-upgrade.yml
  - jobs/validation.yml

Validate changes $[[ inputs.environment ]]:
  stage: Deploy $[[ inputs.environment ]]
  extends:
    - .validation
    - .private
  environment:
    name: $[[ inputs.environment ]]
  rules:
    - if: '$VALIDATION == "true"'
      changes:
        - $[[ inputs.values_file_directory ]]/$[[ inputs.environment ]]/values.yaml
        - $[[ inputs.values_file_directory ]]/$[[ inputs.environment ]]/Chart.yaml

Helm prepare $[[ inputs.environment ]]:
  stage: Deploy $[[ inputs.environment ]]
  extends:
    - .helm-prepare
    - .private
  environment:
    name: $[[ inputs.environment ]]
  rules:
    - if: '$VALIDATION == "true"'
      needs:
        - Validate changes $[[ inputs.environment ]]
      changes:
        - $[[ inputs.values_file_directory ]]/$[[ inputs.environment ]]/values.yaml
        - $[[ inputs.values_file_directory ]]/$[[ inputs.environment ]]/Chart.yaml
    - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH'
      changes:
        - $[[ inputs.values_file_directory ]]/$[[ inputs.environment ]]/values.yaml
        - $[[ inputs.values_file_directory ]]/$[[ inputs.environment ]]/Chart.yaml

Helm upgrade $[[ inputs.environment ]]:
  stage: Deploy $[[ inputs.environment ]]
  extends:
    - .helm-upgrade
    - .private
  environment:
    name: $[[ inputs.environment ]]
  needs:
    - Helm prepare $[[ inputs.environment ]]
  when: manual
  rules:
    - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH'
      changes:
        - $[[ inputs.values_file_directory ]]/$[[ inputs.environment ]]/values.yaml
        - $[[ inputs.values_file_directory ]]/$[[ inputs.environment ]]/Chart.yaml

I have a master template called blueprint-template.yml, where all the common functions and common workflows come together to create a complete process.

include:
  - local: stages/deploy-common.yml
    inputs:
      environment: nonprod
      values_file_directory: ${VALUES_FILE_DIRECTORY}
  - local: stages/deploy-common.yml
    inputs:
      environment: preprod
      values_file_directory: ${VALUES_FILE_DIRECTORY}
  - local: stages/deploy-common.yml
    inputs:
      environment: prod
      values_file_directory: ${VALUES_FILE_DIRECTORY}

image: .dkr.ecr.ap-southeast-1.amazonaws.com/gitlab-runner/k8s:v1.0.3

stages:
  - Deploy nonprod
  - Deploy preprod
  - Deploy prod

Share it across different projects (development teams)

To use this pipeline elsewhere as a common library pipeline, I use the include keyword in .gitlab-ci.yml.

include:
  - project: shared/devops/shared-gitlab-blueprints/helm-cd
    ref: main
    file: blueprint-template.yml

At this point, I hope you understand how to create a shared common library pipeline and share it across different projects.

Tips

Here are some tips you can consider good practice for a shared common library pipeline.

  1. common.yml

Use bash functions to encapsulate specific tasks, especially when they need arguments to execute.

Recommended:

.assume-role:
  script: |
    function assume_role() {
      local role_arn="$1"
      local external_id="$2"

      if [ -z "${external_id}" ]; then
        echo "Assuming role ${role_arn}"
        aws sts assume-role --role-arn "${role_arn}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
      else
        echo "Assuming role ${role_arn} with external id ${external_id}"
        aws sts assume-role --role-arn "${role_arn}" --external-id "${external_id}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
      fi

      export AWS_ACCESS_KEY_ID=$(jq -r '.Credentials.AccessKeyId' assume-role-output.txt)
      export AWS_SECRET_ACCESS_KEY=$(jq -r '.Credentials.SecretAccessKey' assume-role-output.txt)
      export AWS_SESSION_TOKEN=$(jq -r '.Credentials.SessionToken' assume-role-output.txt)
    }

Not recommended:

.assume-role:
  script: |
    if [ -z "${external_id}" ]; then
      echo "Assuming role ${role_arn}"
      aws sts assume-role --role-arn "${role_arn}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
    else
      echo "Assuming role ${role_arn} with external id ${external_id}"
      aws sts assume-role --role-arn "${role_arn}" --external-id "${external_id}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
    fi

    export AWS_ACCESS_KEY_ID=$(jq -r '.Credentials.AccessKeyId' assume-role-output.txt)
    export AWS_SECRET_ACCESS_KEY=$(jq -r '.Credentials.SecretAccessKey' assume-role-output.txt)
    export AWS_SESSION_TOKEN=$(jq -r '.Credentials.SessionToken' assume-role-output.txt)

A hidden job should have only a script block.

Recommended:

.assume-role:
  script: |
    function assume_role() {
      local role_arn="$1"
      local external_id="$2"

      if [ -z "${external_id}" ]; then
        echo "Assuming role ${role_arn}"
        aws sts assume-role --role-arn "${role_arn}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
      else
        echo "Assuming role ${role_arn} with external id ${external_id}"
        aws sts assume-role --role-arn "${role_arn}" --external-id "${external_id}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
      fi

      export AWS_ACCESS_KEY_ID=$(jq -r '.Credentials.AccessKeyId' assume-role-output.txt)
      export AWS_SECRET_ACCESS_KEY=$(jq -r '.Credentials.SecretAccessKey' assume-role-output.txt)
      export AWS_SESSION_TOKEN=$(jq -r '.Credentials.SessionToken' assume-role-output.txt)
    }

Not recommended:

.assume-role:
  variables:
    AWS_ACCOUNT_ID: xxx
    ASSUME_ROLE_ARN: xxx
    EXTERNAL_ID: xxx
  script: |
    function assume_role() {
      local role_arn="$1"
      local external_id="$2"

      if [ -z "${external_id}" ]; then
        echo "Assuming role ${role_arn}"
        aws sts assume-role --role-arn "${role_arn}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
      else
        echo "Assuming role ${role_arn} with external id ${external_id}"
        aws sts assume-role --role-arn "${role_arn}" --external-id "${external_id}" --role-session-name "CI_${CI_PROJECT_NAME}_${CI_PIPELINE_IID}" > assume-role-output.txt
      fi

      export AWS_ACCESS_KEY_ID=$(jq -r '.Credentials.AccessKeyId' assume-role-output.txt)
      export AWS_SECRET_ACCESS_KEY=$(jq -r '.Credentials.SecretAccessKey' assume-role-output.txt)
      export AWS_SESSION_TOKEN=$(jq -r '.Credentials.SessionToken' assume-role-output.txt)
    }

2. Override default variables

You can override the default variables from blueprint-template.yml when you include it in .gitlab-ci.yml.

include:
  - project: shared/devops/shared-gitlab-blueprints/helm-cd
    ref: main
    file: blueprint-template.yml

variables:
  VALUES_FILE_DIRECTORY: overwrite-values
  VALUES_SCHEMA: argocd-application-retail-schema
  K8S_NAMESPACE: default

3. Adopt a naming convention
Name all bash functions with underscores delimiting words. Name all hidden jobs with dashes delimiting words.
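Applied to the examples above, the convention looks like this (a sketch, not a complete job):

```yaml
# Hidden job names use dashes; bash function names use underscores.
.helm-upgrade:
  script: |
    function helm_upgrade() {
      helm upgrade --install "$1" "$2"
    }
```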
