Gitlab CI : write pipelines

General hints to write pipelines

Validate a yaml

Go to the GUI menu CI/CD > Pipelines, click on the CI Lint button and paste your yaml file.
Note that it doesn’t catch every issue, but it catches a lot of them.

We could also validate the yaml with a more general tool, but it will only cover the yaml syntax, not anything related to the Gitlab syntax.
Go to the Yaml Basics page to find out how.
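For example, assuming yamllint is installed, a syntax-only check from the command line could look like this (it knows nothing about Gitlab keywords) :

# validates the yaml syntax only, not the Gitlab CI semantics
yamllint .gitlab-ci.yml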

Enable debug traces

Set the CI debug trace variable to true : CI_DEBUG_TRACE: "true"
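For example, a minimal sketch enabling it for a single job (the job name is a placeholder) :

my-debugged-job:
  variables:
    CI_DEBUG_TRACE: "true"
  script:
    - echo "this job prints verbose debug traces in its log"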

Common yaml validation errors

We can validate the yaml of the pipeline against the general yaml syntax (point above).
Validation errors may also be related to the Gitlab yaml pipeline specification.
Common errors :
– gitlab-ci.yml job foo-job config key may not be used with `rules`: when, only.
Cause : the job definition specifies both rules and only (or except), which is disallowed.
Solution : use either rules or only/except, not both (see the sketch below).
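A minimal sketch of the invalid combination and its fix (the job name and conditions are placeholders) :

# invalid : rules and only are mixed in the same job
foo-job:
  script:
    - echo "build"
  rules:
    - if: '$CI_COMMIT_REF_NAME == "master"'
  only:
    - tags

# valid : the only condition is expressed as an additional rule instead
foo-job:
  script:
    - echo "build"
  rules:
    - if: '$CI_COMMIT_REF_NAME == "master"'
    - if: '$CI_COMMIT_TAG'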

Multiple statements in a script declaration

We could use the block scalar styles (>, |). YAML’s block styles employ indentation rather than indicators to denote structure, which results in a more human readable (though less compact) notation.
These styles allow characters such as \, : and " without escaping, and they add a newline (\n) to the end of the string.

Block folded style (>)
Folded style removes single newlines within the string (but adds one at the end) and converts double newlines to single ones. Lines with extra indentation are not folded :

script:
    - >
      if [[ "${ANY_VAR}" = "fooValue" ]]; then 
          echo "foo...."
      else 
          echo "bar...."
      fi

Block literal style (|)
Literal style keeps every newline within the string as a literal newline, and adds one at the end :

script:
    - |
      if [[ "${ANY_VAR}" = "fooValue" ]]; then 
          echo "foo...."
      else 
          echo "bar...."
      fi

other example:

    - |
      status_code=$(curl -s -w %{http_code} -o /dev/null -H "Content-Type: application/json" -X GET "localhost:8090/person") || true

Flow scalar styles
These have limited escaping, and construct a single-line string with no newline characters. They can begin on the same line as the key, or with additional newlines first.
Visual newlines are replaced by spaces.

Plain scalar : no escaping, no # or : combinations, limits on first character. 
Example : 

script:
- echo -e
   "echo 'hi';
    echo 'bye';"

Multi-line scalar value
The pipe symbol at the end of a line in YAML signifies that any indented text that follows should be interpreted as a multi-line scalar value.
Except for the indentation, it is interpreted literally, in a way that preserves newlines.
Example : …

Push options

Gitlab provides two push options :
ci.skip : skip the CI pipeline for the push (Gitlab 11.7)
ci.variable="name=value" : provide CI/CD variables to be used in the CI pipeline (Gitlab 12.6)

Examples :
git push -o ci.skip
git push -o ci.variable="MAX_RETRIES=10" -o ci.variable="MAX_TIME=600"

Variables

Use strings as variable values; don’t try to use other types such as booleans, as it may make the template invalid.
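A minimal sketch (variable names are placeholders) :

variables:
  # quoted strings are always safe
  FEATURE_ENABLED: "true"
  MAX_RETRIES: "10"
  # an unquoted boolean such as FEATURE_ENABLED: true may be rejected or silently converted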

Predefined environment variables

There are many of them. Most are related to the current pipeline/job, while a few are related to the general CI/CD configuration of our Gitlab (such as CI_SERVER_HOST).
To see the full list of env variables with their current values, enable CI_DEBUG_TRACE: "true" as described above.
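For example, a job can simply echo a few well-known predefined variables (a minimal sketch) :

show predefined vars:
  script:
    - echo "branch or tag = ${CI_COMMIT_REF_NAME}"
    - echo "commit = ${CI_COMMIT_SHA}"
    - echo "pipeline id = ${CI_PIPELINE_ID}"
    - echo "gitlab host = ${CI_SERVER_HOST}"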

Custom environment variables

These have to be defined in the UI and can then be used inside .gitlab-ci.yml.
There are two flavors :
– Variable type : classic key-value variable.
– File type : same thing, but the value is stored in a temporary file and the value associated to the key is the temporary file path.
A variable may also be set as protected or masked. The latter masks the variable value in job logs, while the former exposes the variable only in protected branches and tags.
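As an illustration of the file type flavor, suppose a file type variable named MY_KUBECONFIG was created in the UI (the name is hypothetical) : the job receives a path, not the content.

use a file variable:
  script:
    # the variable holds the path of a temporary file containing the value
    - echo "config file path = ${MY_KUBECONFIG}"
    - cat "${MY_KUBECONFIG}"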

Define variables in .gitlab-ci.yml 

Variables can be defined at a global level and at a job level.
Example at global level :

variables:
  DATABASE_URL: "postgres://postgres@postgres/my_database"

Example at job level :

Package with Maven:
  stage: maven-build
  variables:
    DATABASE_URL: "postgres://postgres@postgres/my_database"

To turn off globally defined variables in your job, define an empty hash :

job_name:
  variables: {}

Jobs, Stages and their execution

Define jobs and stages

In Gitlab CI, the execution unit is the job : it is the labeled circle that you see/execute/retry in the Gitlab CI pipeline view.
Jobs are declared as first-level elements and have two key parts : the script element (mandatory) and the tags element (which selects the runner).
Jobs execution order depends on their stage value and on the position of that stage value in the stages definition.
A stage may be referenced by a single job, but it can also be referenced by multiple jobs.
In the example below :
– « Maven Build » and « Maven Build Other app » are 2 jobs that belong to the maven-build stage.
– « Docker Build » and « Docker Build Other app » are 2 jobs that belong to the docker-build stage.
– In accordance with the declaration order of stages, Gitlab CI first executes the two maven-build jobs, then the two docker-build jobs.
– The maven jobs are executed in parallel; when both succeed, the docker jobs are executed in parallel. Indeed, jobs belonging to the same stage run in parallel.

stages:
  - maven-build
  - docker-build
 
## APP ONE
Maven Build:
  stage: maven-build
  tags:
    - maven3-jdk11
  # not mandatory because the runner tag already provides a default image, but required to use another image
  image: maven:3.6.3-jdk-11
  variables:
    foo: bar
  script:
    - echo "maven builds something"
 
Docker Build:
  stage: docker-build
  tags:
    - docker
  variables:
    foo_foo: "bar-bar"
  script:
    - echo "docker builds something"
 
## APP TWO
Maven Build Other app:
  stage: maven-build
  tags:
    - maven3-jdk11
  # not mandatory because the runner tag already provides a default image, but required to use another image
  image: maven:3.6.3-jdk-11
  variables:
    foo: bar
  script:
    - echo "maven builds something else"
 
Docker Build Other app:
  stage: docker-build
  tags:
    - docker
  variables:
    foo_foo: "bar-bar"
  script:
    - echo "docker builds something else"

when : when a job should be executed

The when attribute is defined at the job level and accepts one of the values listed below.

In most cases, we don’t need to specify the when attribute because its default value, on_success, is convenient.
It means that the current job is executed only if it belongs to the first stage or if the jobs of the previous stage succeeded.
So by default, the pipeline stops as soon as a stage fails.

But in some corner cases, we want to execute a job in case of failure, despite a failure, or manually.
That is where when becomes helpful.

Valid when values :
on_success (default) – Execute the job only when all jobs in earlier stages succeed (or are considered successful because they have allow_failure: true).
on_failure – Execute the job only when at least one job in an earlier stage fails.
always – Execute the job regardless of the status of jobs in earlier stages.
manual – Execute the job manually.
delayed – Delay the execution of the job for a specified duration (added in GitLab 11.4).
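A minimal sketch of the manual and delayed values (stage name and duration are placeholders) :

deploy to prod:
  stage: deploy
  script:
    - echo "deploying..."
  when: manual

clean up later:
  stage: deploy
  script:
    - echo "executed 30 minutes after the earlier stages succeed"
  when: delayed
  start_in: 30 minutes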

Extends to reuse configuration sections

A great use case is creating a kind of abstract job by prefixing its name with the « . » char.
Then any concrete job can inherit its configuration by specifying the extends attribute.
Using « extends » reduces duplication in job definitions.
Inside a same application we may repeat very similar jobs : deploy-dev, deploy-homolog, deploy-prod, with only a few variations between them.
Abstract jobs may be defined :
– in the same yaml template as the concrete jobs that inherit from them.
– in their own yaml template, imported by other templates via the include feature.
Besides, sharing jobs between applications is also valuable. In that case, combining extends with the include feature is strongly advised.
Here is an example of an abstract maven build job defined in its own template and used by another yml template.
maven3-jdk11-build.yml:

.maven3-build:
  stage: maven-build
  variables:
    mavenBuild_maven_opts: "-Dmaven.repo.local=.m2/repository"
  tags:
    - maven3-jdk11
  script:
    - mvn ...

An application template can then use it in its pipeline.
For example, here we have two jobs that extend the abstract .maven3-build job while still being able to define or override elements. Here the projectToBuild_relativeDir variable is overridden if it already exists, or created otherwise :

include:
  - project: basic-projects-using-shared-ci/ci-common
    file: maven3-jdk11-build.yml
 
stages:
  - maven-build
  - any-other-stage
 
Maven Build application:
  stage: maven-build
  extends: .maven3-build
  variables:
    projectToBuild_relativeDir: "application"
 
Maven Build application bis:
  stage: maven-build
  extends: .maven3-build
  variables:
    projectToBuild_relativeDir: "application-bis"
 
Foo other stage:
  stage: any-other-stage
  script: 
    - echo "hello..."

Template include VS parent-child pipeline

Including a template and triggering a child pipeline are two different beasts.
Declaring include as a first-level element of a yaml template doesn’t generate a distinct pipeline :
Gitlab simply merges the included element(s) into the yml that includes them.
Whereas declaring child pipelines means triggering the execution of multiple, distinct pipelines.
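A minimal sketch of the two constructs side by side (file names are placeholders) :

# merged into the current pipeline : no extra pipeline is created
include:
  - local: maven-build.yml

# triggers a distinct child pipeline, with its own state and view
build child:
  trigger:
    include: maven-build.yml
    strategy: depend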

Template include (not in a parent-child pipeline context)

Include elements : merging and overriding rules

The included files are :
– Deep merged with the content of the .gitlab-ci.yml file.
– Always evaluated first and merged with the content of the yml file that includes them, regardless of the position of the include keyword.
It means that the including yml file may still override the included yml.

About the merging of elements, there are two cases :
Case 1 : most array elements, such as « stages » (at root level) or « script » (at job level), are overridden : everything declared in the included file is discarded and replaced by what is declared in the including template.
For example, given maven-build.yml :

maven-build:
  stage: maven-build
  script:
    - echo "hello"

and .gitlab-ci.yml declared as :

include:
  - local: maven-build.yml
maven-build:
  stage: maven-build
  script:
    - echo "world"

The maven-build job in the final template will not contain the two script instructions (echo "hello" then echo "world") but only echo "world", because these array elements are not merged but completely replaced.
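So the effective maven-build job after the merge is equivalent to :

maven-build:
  stage: maven-build
  script:
    - echo "world"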

Case 2 : non-array elements such as « variables » (both at root level and at job level) are added to the existing declared elements.
It means that the template that includes « variables » will either add a new variable or override it if it exists in the included template, but it will never remove existing « variables » elements.
For example, given my-vars.yml :

variables:
  foo: "foo value" 
  bar: "bar value"

and .gitlab-ci.yml declared as :

include:
  - local: my-vars.yml
variables:
  foo: "new foo value" 
  OtherFoo: "other foo value"

In the final template :
– the variable foo is overridden with the new value (« new foo value »)
– the variable OtherFoo is added
– the variable bar is kept with its original value (« bar value »)
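So the effective variables block after the merge is equivalent to :

variables:
  foo: "new foo value"
  bar: "bar value"
  OtherFoo: "other foo value"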

Some use cases of include

Reuse of repeated declarations such as global variables and common included templates

These includes are generally used by multiple projects/repositories.
For example, Maven applications need Gitlab jobs to build, test, create a docker image or even release the application.
These jobs rely on common things such as the directory of the Maven module/app to build, or the directory inside it where the packaged application resides.
We could extract them as job variables, but it means that when we need to override them, we would have to do it in each job.
It is an error-prone approach. Instead, we can extract these variables into a shared yaml template that all yaml jobs include and refer to in their script part. Then, if we need to override these variables, we only need to do it in the higher-level yaml.
Sample :
maven-app-global-include.yml:

variables:
  projectToBuild_relativeDir: "."
  projectToBuild_jarDir: "target"

could be included in a maven3-jdk11-build.yml template :

include:
  - local: maven-app-global-include.yml
 
.maven3-build:
  stage: maven-build
  script:
    - echo "using of ${projectToBuild_relativeDir}"

It could also be included in a docker-build.yml template :

include:
  - local: maven-app-global-include.yml
 
.docker-build:
  stage: docker-build
  script:
    - echo "Here also using of ${projectToBuild_relativeDir}"

At last, the higher yml, the .gitlab-ci.yml pipeline, could include these two templates and override the variables when it makes sense :

include:
  - project: basic-projects-using-shared-ci/ci-common
    file: maven3-jdk11-build.yml
  - project: basic-projects-using-shared-ci/ci-common
    file: docker-build.yml
 
variables:
  # we override an inherited variable at a single place, below :
  projectToBuild_jarDir : "target/dependency"

Including all common includes in a single include

For example, maven3-jdk11-build.yml and docker-build.yml could be aggregated into a single template, so higher templates would only need to include that one instead of each individual template.
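A minimal sketch of such an aggregating template (the file name all-common-templates.yml is a placeholder; it relies on nested includes) :

# all-common-templates.yml, hosted in the shared ci-common project
include:
  - local: maven3-jdk11-build.yml
  - local: docker-build.yml

A project pipeline then only includes all-common-templates.yml instead of each individual template.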

Including an « abstract » job

As seen earlier, we can create abstract jobs by prefixing their name with the « . » char.
Generally, abstract jobs are designed to be reused inside the same project/repository but also across multiple projects/repositories.
So it is good practice to extract them into their own templates.

Define a « static » template to split a big yml pipeline into smaller parts

The idea is to move some jobs into a dedicated yml fragment of the pipeline, because these jobs may be very specific : moving them into their own fragment makes both the jobs and the overall pipeline clearer and more maintainable (see the sketch below).
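A minimal sketch, assuming the very specific jobs were moved into a local fragment named ci/special-jobs.yml (hypothetical path) :

# .gitlab-ci.yml keeps the common jobs and pulls in the specific fragment
include:
  - local: ci/special-jobs.yml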

Parent and child pipelines

Features

This makes sense for a mono-repository approach.
Here are the main features :
– allows a repository with multiple projects to create a parent pipeline that creates a distinct child pipeline per project or « concern » and orchestrates them
– the parent pipeline is free to define any child pipeline that doesn’t depend on a specific project (build common deps, perform checks or tests before delivery, global release, delivery).
– each pipeline (parent as well as children) is free to define its own structure : stages definition for example
– the parent pipeline and the child pipelines have their own state and view in the CI.
In particular, rules defined for a child in the parent pipeline and rules defined directly in the child pipeline are cumulative and are applied at two distinct levels.
– as in a classic pipeline (no parent-child), jobs defined in the parent pipeline (which here trigger child pipelines) with the same stage value run in parallel.
– the child pipeline state may be propagated to the parent pipeline

Real example and caveat

Suppose we have a mono repository with 2 independent applications or microservices and a common library used by both. We could imagine a pipeline with stages in this order :
– build the common library
– build the two apps (in parallel, to go faster)
– release the apps (when we are on the master branch)
– check that the release is good before delivery
– deploy the apps in the target env

Here is the parent pipeline :

# includes in the parent pipeline take priority over includes in the child pipelines,
# so no include here
stages:
  - build-common-libs
  - build-apps
  - delivery-checks
  - release-all
 
build common libs:
  stage: build-common-libs
  trigger:
    include: '/common-api/.common-api-gitlab-ci.yml'
    strategy: depend
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push"'
      changes:
        - common-api/**/*
    - if: '$CI_PIPELINE_SOURCE == "web"'
      changes:
        - common-api/**/*
 
 
foo app build:
  stage: build-apps
  trigger:
    include: '/foo-project/.foo-gitlab-ci.yml'
    strategy: depend
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push"'
      changes:
        - foo-project/**/*
        - common-api/**/*
    - if: '$CI_PIPELINE_SOURCE == "web"'
      changes:
        - foo-project/**/*
        - common-api/**/*
 
bar app build:
  stage: build-apps
  trigger:
    include: '/bar-project/.bar-gitlab-ci.yml'
    strategy: depend
  rules:
    - if: '$CI_PIPELINE_SOURCE == "push"'
      changes:
        - bar-project/**/*
        - common-api/**/*
    - if: '$CI_PIPELINE_SOURCE == "web"'
      changes:
        - bar-project/**/*
        - common-api/**/*
 
release all:
  stage: release-all
  trigger:
    include: '/gitlab-parent-internal-pipelines/.release-all-gitlab-ci.yml'
    strategy: depend
  # rules defined both in the children and in the parent (here) are cumulative.
  # Parent rules are applied on the first (parent) view of the CI while
  # child rules are applied on the view of the child pipeline. So we must be careful.
  # We define them in the parent, otherwise we may get a weird view in the gui
  rules:
    - if: '$CI_COMMIT_REF_NAME == "master"'
      when: manual
 
 
delivery checks:
  stage: delivery-checks
  trigger:
    include: '/gitlab-parent-internal-pipelines/.delivery-checks-gitlab-ci.yml'
    strategy: depend
  rules:
    - if: $CI_COMMIT_TAG

And here are some of the child pipelines (note that the children themselves rely on common templates for their core jobs) :

build-common-libs pipeline :

include:
  - project: basic-projects-using-shared-ci/ci-common
    file: maven3-jdk11-install-library.yml
 
stages:
  - maven-install-library
 
install common library:
  variables:
    projectToBuild_relativeDir: "common-api"
  stage: maven-install-library
  tags:
    - maven3-jdk11
  extends: .maven3-install-library

Foo pipeline :

include:
  # common templates
  - project: basic-projects-using-shared-ci/ci-common
    file: maven3-jdk11-build.yml
  - project: basic-projects-using-shared-ci/ci-common
    file: docker-build.yml
 
variables:
  projectToBuild_relativeDir: "foo-project/foo-application"
 
maven build foo:
  stage: maven-build
  tags:
    - maven3-jdk11
  extends: .maven3-build
  variables:
    # override the client opts but don't override the global maven opts
    mavenBuild_maven_client_opts: "-Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true -Dmaven.wagon.http.ssl.ignore.validity.dates=true"
 
Docker Build foo:
  stage: docker-build
  extends: .docker-build
  variables:
    dockerBuild_dockerImage: "foo-project"
 
stages:
  - maven-build
  - docker-build

Release all pipeline :

include:
  - project: basic-projects-using-shared-ci/ci-common
    file: maven3-jdk11-release.yml
 
stages:
  - release-all-child
 
release all child:
  stage: release-all-child
  extends: .maven3-release
  # we override the rule defined in .maven3-release to remove the "when: manual" part.
  # Otherwise we would need to click twice to trigger the job in the UI : in the parent pipeline (mandatory) and here
  rules:
    - if: '$CI_COMMIT_REF_NAME == "master"'
  variables:
    mavenRelease_same_version_for_base_project_and_modules: "true"

Common gitlab pipelines

Maven global cache

The Gitlab cache feature allows caching Maven artifacts per project, or at a finer granularity such as per project/branch.
Sometimes we want to cache Maven artifacts across all Gitlab projects.
To achieve this, we configure a volume in the Maven runner config.

[[runners]]
  name = "maven3"
  url = "http://gitlab.david.com:8585/"
  token = "by_rHd1-QGvRs4tG4KCi"
  executor = "docker"
  ...
  [runners.docker]
    ...
    volumes = ["maven-volume-for-runner:/.m2", ...]

Here we defined a named volume, maven-volume-for-runner : a docker volume created on the docker host if it doesn’t exist, and mounted at /.m2 in the container.
As an alternative, we could use a bind mount such as :
"/etc/maven-repo-for-runner:/.m2"
From the pipeline side, we need to make our Maven local repository point to /.m2, such as :

variables:
  MAVEN_OPTS: "-Dmaven.repo.local=/.m2"
 
Maven Package:
  stage: maven-build
  tags:
    - maven
  image: maven:3.6.3-jdk-11
  script:
    - mvn clean package

Maven build as JAR

todo

Maven build as Docker image

todo

Define external yaml gitlab pipelines

Main goals

– splitting a complex/long pipeline for a gitlab project into distinct, readable and more maintainable parts
– avoiding duplication of common jobs and global variables between distinct gitlab projects

Define a yaml to release maven projects (both in Git and in artifacts repository)

The intention :
The Maven release plugin (release:prepare and release:perform goals) is very picky and slow for building a release.
Actually, the only important things we need when we want to create a release for a Maven project are : building and running the tests for the project, creating a tag for that version (allowing a rollback whenever needed), updating the artifact version defined in the pom.xml from snapshot to release, deploying the built artifact to a central dependency repository (Nexus or another), and at last updating the artifact version defined in the pom.xml to a new snapshot version.
We could define that with a gitlab pipeline defined in a specific project.

maven-release.yml :

variables:
    mavenRelease_mustGenerateVersionProps: "false"
 
Maven Release:
  stage: git-tag-and-deploy-to-nexus
  tags:
    - maven
  image: maven:3.4.0-jdk-8
  cache:
    paths:
      - ${CI_PROJECT_DIR}/target/
    key: ${CI_COMMIT_REF_NAME}
  script:
     # GIT CLONE TO GET WRITE RIGHTS ON THE REPO
    - gitlabRepoUrl=$(echo $CI_PROJECT_URL | sed 's@http://@@' | xargs -I{} echo http://oauth2:${GITLAB_PIPELINE_TOKEN}@{}.git)    
    - git clone ${gitlabRepoUrl} repo-with-rights
    - cd repo-with-rights
    - git config user.email "gitlab pipeline"
    - git config user.name "gitlab pipeline"
    - git config push.default simple
    - git branch
     # CREATE AND PUSH THE TAG ON GIT
    - tagVersion=$(mvn $maven_cli_opts -q -U -Dexpression=project.version -DforceStdout org.apache.maven.plugins:maven-help-plugin:3.2.0:evaluate)
    - tagVersion=${tagVersion/-SNAPSHOT/}
    - echo "version used for tag and release ${tagVersion}" 
    - git tag -a $tagVersion -m "Tagged by gitlab pipeline"
    - git push origin $tagVersion
    # UPDATE THE POM WITHOUT THE SNAPSHOT
    - mvn -U versions:set -DnewVersion=${tagVersion}
    - echo "mavenRelease_mustGenerateVersionProps=${mavenRelease_mustGenerateVersionProps}"
    # OPTIONAL : 
    - >
      if [[ "${mavenRelease_mustGenerateVersionProps}" = "true" ]]; then 
          echo "the version props was set with the tagVersion:$tagVersion"
          sed -i "s/version=__TOKEN__/version=${tagVersion}/" delivery-scripts/deploy.props 
      fi     
    # PUSH THE COMPONENT TO NEXUS
    - mvn ${maven_cli_opts} -U clean deploy
    # INCREMENT THE POM ARTIFACT VERSION AND PUSH ON GIT
    - git checkout ${CI_COMMIT_REF_NAME}
    - newVersion=$(mvn $maven_cli_opts -q -U -Dexpression=project.version -DforceStdout org.apache.maven.plugins:maven-help-plugin:3.2.0:evaluate)
    # increment the second digit; works for both 2- and 3-digit versions (e.g. 1.2-SNAPSHOT -> 1.3, 1.2.3-SNAPSHOT -> 1.3.3)
    - newVersion=${newVersion/-SNAPSHOT/}
    - newVersion=$(echo ${newVersion} |  awk -F'.' '{print $1"."$2+1"."$3}' |  sed s/[.]$//)
    - newVersion="${newVersion}-SNAPSHOT"
    - echo "newVersion=$newVersion"
    - mvn -U versions:set -DnewVersion=${newVersion}
    - git add pom.xml
    - git commit -m "update the next dev version to ${newVersion} by gitlab pipeline"
    - git push
    #$ mvn help:evaluate -q -DforceStdout -Dexpression=revision
  artifacts:
    name: ${CI_PROJECT_NAME}:${CI_COMMIT_REF_NAME}
    expire_in: 1 week
    paths:
      - ${CI_PROJECT_DIR}/target/*.jar
  when: manual       
  only:
    refs:
      - master

And here is a gitlab project that includes that pipeline :
.gitlab-ci.yml :

include:
  - project: 'gitlab-common'
    file: '/maven-release.yml'
 
variables: 
  mavenRelease_mustGenerateVersionProps: "true"
 
stages:
  - build
  - git-tag-and-deploy-to-nexus
 
#...

yaml gitlab pipelines for dockerized spring boot app

With Gitlab and Docker, we have two ways of building a docker image designed to host a Java app built by Maven :
– define one stage that builds the maven project and a next stage that uses the maven build result to build the Docker java app image
– define a single stage that both builds the maven project and builds the Docker java app image

Define a stage to build the maven project and a next stage using the maven build result to build the Docker java app image

Here is the maven build stage :

Package with Maven:
  stage: maven-build
  variables:
    MAVEN_ARGS: "-s /usr/share/maven/conf/settings.xml --batch-mode"
    MAVEN_OPTS: "-Dmaven.repo.local=/root/.m2/repository"
  script:
    - echo { \"branch\":\"${CI_COMMIT_REF_NAME}\", \"commit\":\"${CI_COMMIT_SHA}\", \"pipelineId\":\"${CI_PIPELINE_ID}\"} > git-info.json
    - echo "git-info.json content =" && cat git-info.json
    - mv git-info.json ${CI_PROJECT_DIR}/src/main/resources
    - mvn $MAVEN_ARGS clean package
    # compute the docker build version
    - >
      if [[ "${CI_COMMIT_REF_NAME}" = "master" ]]; then
          dockerImageVersion=$(mvn $MAVEN_ARGS -q -U -Dexpression=project.version -DforceStdout org.apache.maven.plugins:maven-help-plugin:3.2.0:evaluate)
          dockerImageVersion=${dockerImageVersion/-SNAPSHOT/}
      else
          dockerImageVersion=${CI_COMMIT_REF_NAME}
      fi
    - echo "dockerImageVersion=${dockerImageVersion}"
    - echo ${dockerImageVersion} > docker-image-version.json
  artifacts:
    name: ${CI_PROJECT_NAME}:${CI_COMMIT_REF_NAME}
    expire_in: 1 week
    paths:
      - ${CI_PROJECT_DIR}/target/*.jar
      - ${CI_PROJECT_DIR}/docker-image-version.json
  tags:
    - maven
  only:
    refs:
      - master
      - branches

Notes about that stage :
– the maven-build stage prepares both the jar and the version of the docker image, based on the pom.xml.
– the jar and the app version are stored as gitlab artifacts in specific paths, which is the conventional way to share data between stages.
– we store GIT branch/commit/pipeline info into a file. It is helpful for an app version actuator or an about -> version information.
– we cache the .m2/repository used during the build to prevent re-downloading artifacts at each job build

And here is the docker build stage template :

variables:
  dockerBuild_dockerImage: ""  
  dockerBuild_relativeDirOfProject: "."
 
Build image with Docker:
  stage: docker-build
  variables: 
    DOCKER_REGISTRY: "foo-registry:..."
  before_script:
    - dockerImageVersion=$(cat docker-image-version.json | head -n 1)
    - >
      if [[ -z "${dockerImageVersion}" ]] ; then
        echo "FAIL : no dockerImageVersion file found" && false
      fi
    - >
      if [[ "${dockerBuild_dockerImage}" = "undefined" ]]; then
          echo "FAIL : dockerBuild_dockerImage is undefined. That variable has to be defined previously in the current pipeline" & false
      fi
    - docker --tlscacert=${DOCKER_TLSCACERT} login -u "${DOCKER_REGISTRY_LOGIN}" -p "${DOCKER_REGISTRY_TOKEN}" ${DOCKER_REGISTRY_URL}
    - cp ${dockerBuild_relativeDirOfProject}/target/*.jar docker/
  script:
    - docker build  --tag ${dockerBuild_dockerImage}:${dockerImageVersion} ./docker
    - docker tag ${dockerBuild_dockerImage}:${dockerImageVersion} ${DOCKER_REGISTRY}/${dockerBuild_dockerImage}:${dockerImageVersion}
    - docker push ${DOCKER_REGISTRY}/${dockerBuild_dockerImage}:${dockerImageVersion}
 
  when: always
  only:
    refs:
      - master
