Overview

Skill Level: Any Skill Level

Co-authored with Kevin Trinh (ktrinh@us.ibm.com), Cloud Integration Architect. This recipe outlines how to set up a continuous deployment pipeline that automatically deploys IBM FileNet Content Platform Engine (CPE) containers to the OpenShift platform.

Ingredients

Step-by-step

  1. Background

    Continuous integration (CI) and continuous delivery/deployment (CD) are methods for delivering software to customers frequently by introducing automation into the stages of application development and deployment. A mature CI/CD practice has the option of implementing continuous deployment, where application changes run through the CI/CD pipeline and passing builds are deployed directly to production environments.

    OpenShift is a certified Kubernetes platform that allows you to create and manage your containers. In this recipe, we’ll show you how to set up a continuous deployment pipeline that automatically deploys IBM CPE containers to the OpenShift platform.

  2. Pipeline overview

    Here is the overall view of the CD pipeline. It starts with a person downloading the target PPA package from IBM Passport Advantage, putting it on the Jenkins slave, and then starting the Jenkins pipeline. The pipeline deploys the images to the development environment, the staging environment, and finally the production environment.

    pipeline overview

    Setting up the pipeline is quite similar for the different environments, so in this recipe we take the CPE development environment as an example: we will introduce how to set up the pipeline in detail and skip the staging and production environments.

    Here is the design of this pipeline. It involves uploading images, deploying them, and then testing; we’ll show you how to develop this pipeline.

    1.-pipeline-design

  3. Step 1: Create a Jenkins credential

    The Jenkins server needs a client that works as a Jenkins slave to do the real work. In order to access the client, Jenkins needs to know the access account and password, so we save them as a Jenkins credential.

    You have to be a Jenkins admin to do this configuration. On the Jenkins dashboard, go to Manage Jenkins. Click Credentials, then the System link. On the displayed right panel, click Global credentials (unrestricted); you’ll see all existing credentials listed. On the left side, click Add Credentials.

    Input the user name and password of the target client. ID and Description are optional, but we suggest setting them so you can easily recognize your credential.

    1.-add-credentail

  4. Step 2: Manage Jenkins nodes

    A Jenkins node, also known as a Jenkins slave, is where the task is actually executed. Here we will set our OCP cluster as a Jenkins node, but of course you can set up another client to run Jenkins jobs; just add one more step to log in to the OCP cluster first.

    You have to be a Jenkins admin to do this configuration. On the Jenkins dashboard, go to Manage Jenkins. Click the Manage Nodes link, then click New Node from the left sidebar. Input a node name and choose Permanent Agent. Take the screenshot below as an example to add a node named “ocp-cluster”.

    For the Labels field, we set it to “ocp”; Jenkins jobs will use this value to find this cluster. In the Credentials field, we use the credential created in Step 1.

    2.-add-slave

  5. Step 3: Slack integration

    In this step we’ll create a fresh Slack channel and add Jenkins integration.

    First, let’s create a channel for test purposes only. In the Slack app, click the “+” icon on the left side:

    3.-create-channel

    Second, on the newly created channel, click Add app from the channel details panel or Add an app from the central panel.

    4.-new-created-channel

    In the opened list, search for Jenkins CI and click to view it, then click Settings.

    5.-add-Jenkins-app

    You’ll see a page about Jenkins CI open in your browser; click Add Configuration.

    6.-Jenkins-CI-in-browser

    Then filter for your target channel in the channel list; here we select #gyfguo-test.

    7.-Jenkins-CI-configuration

    After clicking Add Jenkins CI integration, you’ll see the Jenkins CI setup guide. Record the token generated on this page, then follow the instructions to input it into the Jenkins configuration like below:

    8.-Jenkins-global-Slack-configuration

    For this setting, we create another Jenkins credential, but the type should be Secret text. Input the token we got from the last step into the Secret field.

    9.-Add-Jenkins-Slack-integration-token
    After the setting, you can click the TEST CONNECTION button; if everything works well, you’ll see a Success text and get a Slack message posted in the target channel.

    10.-Jenkins-Slack-test-connection

    Below is the test message we got in our Slack channel.

    11.-Slack-all-set-message

    Slack is now all set. #gyfguo-test will be the default channel for Jenkins, but of course you can post messages to a different channel; just set the channel name in your Jenkins job, and it will override this global configuration.

  6. Step 4: Create jobs

    We are going to need three jobs. To get a clean view and separate them from the other jobs on the same Jenkins server, let’s create a new list view named “ocp-deployment” to contain all the jobs we need.

    13.-oc-deployment-view

    Now under the new view, click New Item and choose Freestyle project to create three new freestyle jobs. It is fine to just set a job name; we’ll edit them later. Below are the three jobs we created.

    14.-oc-deployment-view-list

  7. Step 5: Edit job - upload images

    This job will be used to upload images to the OCP internal registry, then pass the image tag to the next job to start the deployment.

    We need the three files below to upload the images; put them together in the /jenkins/ppa folder of the OCP server.

    1. The pre-downloaded PPA package DBAMC19.0.1-ecm.tgz, which contains the CPE Docker image;
    2. The loadimages.sh script from GitHub, which is used to extract the images from the PPA package above and upload them;
    3. get-image-tag.sh, which is used to extract the image tag and write it into a file so we can pass the tag to the next deployment job. Here is the content of this file:
    #!/bin/bash

    unset ppa_file
    ppa_file=$1
    echo "Extracting image names and tags from $ppa_file"
    tar -zxvf "$ppa_file" manifest.json

    arr_img_name=($(grep -w "image" manifest.json | awk '{print $2}' | sed 's/\"//g' | sed 's/\,//g'))
    arr_img_tag=($(grep -w "tag" manifest.json | awk '{print $2}' | sed 's/\"//g' | sed 's/\,//g'))

    # Recreate the output file
    versionfile="./image_tag.txt"
    rm -f "$versionfile"
    touch "$versionfile"

    # Write one key=value line per image, e.g. cpe_tag=ga-553-p8cpe
    index=0
    for image in "${arr_img_name[@]}"
    do
        echo "${image}_tag=${arr_img_tag[$index]}"
        echo "${image}_tag=${arr_img_tag[$index]}" >> "$versionfile"
        let index++
    done

    rm -f manifest.json
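    To see what get-image-tag.sh produces, here is a small self-contained sketch of its extraction pipeline run against a hand-written manifest.json. The manifest content below is an assumption for illustration: one "image"/"tag" key per line, as in the PPA package manifest.

    ```shell
    #!/bin/bash
    # Hand-written manifest.json in the layout get-image-tag.sh expects:
    # one "image"/"tag" key per line.
    cat > manifest.json <<'EOF'
    {
      "images": [
        {
          "image": "cpe",
          "tag": "ga-553-p8cpe"
        }
      ]
    }
    EOF

    # The same grep/awk/sed pipeline used by get-image-tag.sh
    arr_img_name=($(grep -w "image" manifest.json | awk '{print $2}' | sed 's/\"//g' | sed 's/\,//g'))
    arr_img_tag=($(grep -w "tag" manifest.json | awk '{print $2}' | sed 's/\"//g' | sed 's/\,//g'))

    # This is the key=value line that would land in image_tag.txt
    line="${arr_img_name[0]}_tag=${arr_img_tag[0]}"
    echo "$line"
    rm -f manifest.json
    ```

    Running this prints cpe_tag=ga-553-p8cpe, which is exactly the key the downstream cpe-deployment job parameter has to match.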

    Now back in the Jenkins UI, click the upload-images job, then Configure from the left sidebar. Below are the specific configurations:

    1. In the General section, check This project is parameterized and add three String Parameters.

    12.2.-upload-image-job-general

    Then check Restrict where this project can be run, and in the Label Expression field input ocp; this ensures the job can only run on the ocp node we created previously.

    2. In the Build section, click the Add build step button and add Execute shell. Here we log in to the OCP internal registry first, then call our script to upload the images. Finally, get-image-tag.sh will write the image tag into /jenkins/ppa/image_tag.txt.

    12.2.-upload-image-job-build
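    The Execute shell step can be sketched as below. This is an assumption-heavy sketch: the parameter names ($ocp_user, $ocp_password, $docker_registry) stand in for the String Parameters from the General section, and the loadimages.sh flags follow its documented -p (PPA archive) / -r (target registry) usage; adjust to your environment. It is wrapped in a function here so the commands read as one unit; the Jenkins build step would run them directly.

    ```shell
    #!/bin/bash
    # Sketch of the upload-images Execute shell build step.
    upload_images() {
      cd /jenkins/ppa || return 1

      # Log in to the cluster, then to its internal registry
      oc login -u "$ocp_user" -p "$ocp_password"
      docker login -u "$(oc whoami)" -p "$(oc whoami -t)" "$docker_registry"

      # Extract the CPE image from the PPA package and push it
      ./loadimages.sh -p DBAMC19.0.1-ecm.tgz -r "$docker_registry/dbamc"

      # Record the image tag for the downstream cpe-deployment job
      ./get-image-tag.sh DBAMC19.0.1-ecm.tgz
    }
    ```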

    3. In the last Post-build Actions section, click the Add post-build action button and add Trigger parameterized build on other projects. Trigger the cpe-deployment project and pass /jenkins/ppa/image_tag.txt; we also need to pass all the parameters defined in this job so they can be used later.

    12.-upload-image-job-post-build 

     

    4. In order to be notified when this job finishes, we can click the Add post-build action button again and add Slack Notification. Since we already have our channel configured in the Jenkins global configuration, it is enough to check which notifications we want here. If you want to post to a different channel, it can be configured by clicking the Advanced button.
    12.-upload-image-job-Slack

     

  8. Step 6: Edit job - deploy ECM containers

    This job will run the helm install command to install the CPE Helm chart. As with the upload-images job, let’s edit cpe-deployment and add the configurations below:

    1. In the General section, we also need to add three parameters and set the ocp label to restrict this job to run only on our ocp node.

    The names of the parameters have to match what is defined in the previous job. The value of cpe_tag comes from /jenkins/ppa/image_tag.txt, so its name has to match the key in image_tag.txt. Although all the values come from the previous job, we have to define the parameters here, otherwise they can’t be recognized. The default values can be left blank, but a value here allows this job to run independently, which is useful for debugging purposes.

    15.2.-cpe-deployment-general
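    For reference, image_tag.txt as written by get-image-tag.sh is a plain key=value properties file, which is why the parameter name has to match the key exactly. A hypothetical example (the css line is an illustration of what a multi-image PPA package would produce):

    ```properties
    cpe_tag=ga-553-p8cpe
    css_tag=ga-553-p8css
    ```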

    2. In the Build section, we just switch to the /jenkins/ppa folder and then call the cpe-helm-deploy.sh script. This script will execute helm install and write the CPE server URL into /jenkins/ppa/cpe_link.txt.

    15.-cpe-deployment-build

    Below is the content of cpe-helm-deploy.sh:

    #!/bin/bash
    :<<!
    When executed, this script needs the parameters below:
    1. helm chart $helm_chart
    2. image tag $cpe_tag
    3. docker registry $docker_registry
    For example: ./cpe-helm-deploy.sh ibm-dba-contentservices-3.0.0.tgz ga-553-p8cpe 172.30.1.1:5000/dbamc
    !

    oc login -u system:admin
    export TILLER_NAMESPACE=tiller
    oc policy add-role-to-user edit "system:serviceaccount:${TILLER_NAMESPACE}:tiller"
    oc adm policy add-role-to-user edit dbamc -n kube-system

    oc login -u dbamc -n dbamc

    # Remove any existing release before reinstalling
    helm list | grep dbamc-cpe
    if [ $? -eq 0 ]; then
        helm delete --purge dbamc-cpe
    fi

    helm install $1 --name dbamc-cpe --namespace dbamc \
        --set cpeProductionSetting.license=accept \
        --set cpeProductionSetting.jvmHeapXms=512 \
        --set cpeProductionSetting.jvmHeapXmx=1024 \
        --set cpeProductionSetting.licenseModel=FNCM.CU \
        --set dataVolume.existingPVCforCPECfgstore=cpe-cfgstore \
        --set dataVolume.existingPVCforCPELogstore=cpe-logstore \
        --set dataVolume.existingPVCforFilestore=cpe-filestore \
        --set dataVolume.existingPVCforICMrulestore=cpe-icmrulesstore \
        --set dataVolume.existingPVCforTextextstore=cpe-textextstore \
        --set dataVolume.existingPVCforBootstrapstore=cpe-bootstrapstore \
        --set dataVolume.existingPVCforFNLogstore=cpe-fnlogstore \
        --set autoscaling.enabled=False \
        --set resources.requests.cpu=1 \
        --set replicaCount=1 \
        --set image.repository=$3/cpe \
        --set image.tag=$2 \
        --set cpeProductionSetting.gcdJNDIName=FNGCDDS \
        --set cpeProductionSetting.gcdJNDIXAName=FNGCDDSXA

    export NODE_PORT=$(kubectl get --namespace dbamc -o jsonpath="{.spec.ports[0].nodePort}" services dbamc-cpe-ibm-dba-contentservices)
    export NODE_IP=$(kubectl get nodes --namespace dbamc -o jsonpath="{.items[0].status.addresses[0].address}")
    echo Deployed Content Platform Engine service can be visited at http://$NODE_IP:$NODE_PORT/acce

    # Record the link into ./cpe_link.txt
    cpe_link="./cpe_link.txt"
    rm -f "$cpe_link"
    echo cpe_link=http://$NODE_IP:$NODE_PORT/acce > $cpe_link

    3. In the last Post-build Actions section, trigger cpe-test and pass two txt files: /jenkins/ppa/image_tag.txt and /jenkins/ppa/cpe_link.txt. We also add a Slack notification here.

    15.-cpe-deployment-post-build
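    Both files are passed as "Parameters from properties file", i.e. plain key=value lines, so a downstream shell step can also read them directly. A quick sketch of the handoff (the URL below is a made-up example, not a real deployment):

    ```shell
    #!/bin/bash
    # Simulate what cpe-helm-deploy.sh writes at the end of its run
    echo "cpe_link=http://9.30.123.45:30080/acce" > cpe_link.txt

    # Because the file is a plain key=value properties file, a later
    # shell step can pick up the value simply by sourcing it.
    source ./cpe_link.txt
    echo "Deployed CPE console: $cpe_link"

    rm -f cpe_link.txt
    ```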

     

  9. Step 7: Edit job - acceptance tests

    After we deploy the container images, there are some required steps to get IBM Content Platform Engine up and running; see Manually initializing and verifying your content services environment. There are public APIs for these steps, so they can be made part of the acceptance tests, but that is not in the scope of this recipe.

    We can also use the same acceptance test suite for the development/staging/production environments. Here we just add a placeholder as an example; no real test case is executed. Below is what we configured:

    1. In the General section, add two parameters; their values will come from the txt files passed from the previous job.
      16.2.-cpe-test-general
    2. In the Build section, we just add a shell example here. There are many other available build steps, such as Python, Ansible, and Gradle, depending on which plugins you installed for your Jenkins server. You can choose according to your test case needs.
      16.-cpe-test-build
    3. In the last Post-build Actions section, there is no following job to trigger in our example, but in a real situation you may want to trigger your next staging deployment; it is easy to do so with steps like those above. Here we just add a Slack notification with a custom message containing the deployed CPE link.
      16.-cpe-test-post-build
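    As a slightly more concrete placeholder than a bare echo, the shell build step could at least sanity-check the cpe_link parameter before any real API tests run. A minimal sketch, where the hard-coded URL is a made-up stand-in for the $cpe_link job parameter:

    ```shell
    #!/bin/bash
    # $cpe_link would normally come from the job parameter; a made-up
    # value is used here so the sketch is self-contained.
    cpe_link="http://9.30.123.45:30080/acce"

    # Fail the job early if the link does not look like a CPE ACCE URL;
    # real acceptance tests would then call the CPE APIs against it.
    if echo "$cpe_link" | grep -qE '^https?://[^/]+/acce$'; then
      result="CPE link looks valid: $cpe_link"
    else
      result="Invalid CPE link: $cpe_link"
    fi
    echo "$result"
    ```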
  10. Step 8: Execute and view the pipeline

    We’ve now finished all the job configuration and are able to launch the pipeline. But it is not convenient to view it as a job list, since that can’t show the job sequence. So let’s create a new ocp_pipeline_view and set the type to Build Pipeline View.

    17.-pipeline-view

    On the opened configuration page, select upload-images as the initial job and set the number of displayed builds to 5, so the view shows the history of the last 5 builds.

    17.-pipeline-view-setting

    Now we get a pipeline like this:

    17.-pipeline-view-view

    At the top of this page there are several buttons to run or reconfigure the pipeline. Now let’s click Run to start the pipeline. Since we have several parameters defined, you’ll see a screen like this:

    17.2.-pipeline-view-start

    Confirm the parameters, then click the BUILD button. The pipeline view then changes as below; you can easily see which job is running, jobs that have not run yet are shown in blue, and failed jobs are highlighted in red.

    17.2.-pipeline-view

    You can view the log output easily by clicking the console icon:

    17.2.-pipeline-view-console

    After execution, you can see the message output in the Slack channel, like below.

    18.-slack-message

  11. Summary

    This recipe gives an example of creating a Jenkins CD pipeline to continuously deploy IBM CPE to an OpenShift environment; we hope it can serve as a reference for setting up your own CD pipeline.

    Several team members also helped with this recipe and provided valuable comments; thanks for their great support: Lian Xiao Jian, Bai Song, Sravan Ananthula and Thuyanh Nguyen!
