Companies often find it hard to deliver software to customers due to excessive manual labor or the lack of automation and consistency in their operations. This is where continuous integration (CI) and continuous delivery (CD) practices can make your work a little bit easier.
The IBM Cloud Continuous Delivery service has two types of delivery pipelines: classic and Tekton. The classic delivery pipelines are created graphically, with the status embedded in the pipeline diagram. In contrast, the Tekton delivery pipelines are created within YAML files that define pipelines and their dependencies as a set of Kubernetes resources.
This tutorial provides you with a quick start on how to use Tekton pipelines to deploy some serverless IBM Cloud Functions actions. As a bonus, you use IBM Cloud Schematics automation to create some resources that are necessary for the actions to work.
Tekton is an open source framework for creating CI/CD systems designed to run on top of Kubernetes, using all of its advantages. Moreover, it provides you with the following benefits:
- Flexibility. Tekton entities are fully customizable, allowing you to define a catalog of building blocks for your development team to use in a wide variety of scenarios.
- Reusability. Since Tekton resources are defined in portable YAML files, anyone within your organization can use a defined pipeline or a set of tasks from different pipelines. This prevents duplication.
- Expandability. The Tekton Catalog is a community-driven repository; you can create or expand your pipelines based on catalog components.
At the time of writing this tutorial, Tekton consists of the following components:

- Tekton Pipelines: The basic building blocks of a CI/CD workflow.
- Tekton Triggers: Event triggers for the workflow.
- Tekton CLI: A command-line interface to interact with the workflow.
- Tekton Dashboard: A web-based UI for the pipelines.
- Tekton Catalog: A repository with Tekton building blocks, such as Tasks and Pipelines.
- Tekton Hub: A web-based graphical interface for accessing the Tekton Catalog.
- Tekton Operator: A Kubernetes Operator that you can use to install, update, and remove Tekton projects on your Kubernetes cluster.
To follow this tutorial, you must be familiar with the following Tekton concepts:
- A Pipeline is a collection of tasks; each task in a pipeline can use the output of a previously executed task.
- A Task is a collection of steps; each step invokes a specific build tool by using a set of inputs and produces a set of outputs that the next step may use.
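To make these two concepts concrete, here is a minimal, hypothetical sketch (the names `say-hello` and `hello-pipeline` are illustrative, not from this tutorial's repository) of a Task with one step and a Pipeline that runs it twice in order:

```yaml
# Hypothetical example; names are illustrative only.
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: say-hello
spec:
  steps:
    - name: greet                  # a step runs one tool in one container image
      image: ubuntu
      script: |
        echo "Hello from a step"
---
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: hello-pipeline
spec:
  tasks:
    - name: first
      taskRef:
        name: say-hello
    - name: second
      taskRef:
        name: say-hello
      runAfter:                    # run only after the first task finishes
        - first
```

The `runAfter` field is how a pipeline orders its tasks; without it, Tekton runs tasks in parallel.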
In Tekton, you run your pipelines by using a resource called a PipelineRun, which causes the pipeline to run all of its tasks. On IBM Cloud, you run your pipelines by using Triggers. Instead of defining a PipelineRun yourself, you define the following resources so that your pipeline runs when an event occurs:

- TriggerTemplate: A custom resource that can template other resources, such as a PipelineRun; parameters can be substituted within the resource template.
- TriggerBinding: A custom resource that binds against incoming events. It allows you to capture fields from the event, store them as parameters, and apply them in the TriggerTemplate.
- EventListener: A custom resource that allows the pipeline to process incoming HTTP events with JSON payloads.
So, how does this work? The EventListener receives the incoming events, the TriggerBinding captures the parameters from the received event, and the TriggerTemplate uses those parameters to create the PipelineRun.
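As a sketch of how these three resources fit together, the following hypothetical definitions show a TriggerBinding capturing a field from the event payload, a TriggerTemplate creating a PipelineRun from it, and an EventListener wiring the two together. The resource names, the referenced pipeline, the API versions, and the event field path are all assumptions for illustration and may differ from this tutorial's repository:

```yaml
# Hypothetical resources for illustration; names, API versions, and event
# field paths may differ from the ones used in this tutorial.
apiVersion: triggers.tekton.dev/v1beta1
kind: TriggerTemplate
metadata:
  name: example-template
spec:
  params:
    - name: git-branch
  resourcetemplates:
    - apiVersion: tekton.dev/v1beta1
      kind: PipelineRun
      metadata:
        generateName: example-run-     # each event creates a new PipelineRun
      spec:
        pipelineRef:
          name: example-pipeline       # assumed to be defined elsewhere
        params:
          - name: git-branch
            value: $(tt.params.git-branch)
---
apiVersion: triggers.tekton.dev/v1beta1
kind: TriggerBinding
metadata:
  name: example-binding
spec:
  params:
    - name: git-branch
      value: $(body.ref)               # field path depends on the event payload
---
apiVersion: triggers.tekton.dev/v1beta1
kind: EventListener
metadata:
  name: example-listener
spec:
  triggers:
    - bindings:
        - ref: example-binding         # capture parameters from the event
      template:
        ref: example-template          # create the PipelineRun
```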
- An IBM Cloud account. If you don’t already have an account, you can easily register for one.
- A GitHub account.
You should be able to complete this tutorial within 30 minutes.
- Fork the GitHub repository
- Deploy the resources with IBM Cloud Schematics
- Create and configure the IBM Cloud Continuous Delivery service
1. Fork the GitHub repository
Go to the GitHub repository for this tutorial and click Fork to copy the code to your own GitHub account.
The GitHub repository has the following structure:
- A folder that includes the Tekton definition files required for the pipeline to work.
- `functions`: The parent folder for the CRUD actions for a sample movie tickets database (create, update, list, and delete).
- `Terraform`: The folder that includes the Terraform template that you deploy on IBM Cloud Schematics.
2. Deploy the resources with IBM Cloud Schematics
First, you must deploy some resources for the actions to work properly. To do this, you deploy a Terraform template with IBM Cloud Schematics. The Terraform template creates:

- An IBM Cloudant instance and database
- Two resource keys: a manager key that IBM Cloud Schematics needs to create the database, and a writer key so that the functions can work with the database
- An IBM Cloud Functions package to group the actions that you deploy
- A binding to the Cloudant database so that the actions can use the credentials as default parameters
Follow these steps to deploy the Terraform template with IBM Cloud Schematics:
Click Create workspace.
To import the Terraform template, in the GitHub, GitLab or Bitbucket repository URL field, enter the URL path to the `Terraform` folder within the GitHub repository fork that you created in Step 1.
If your repository is private, enter your personal access token in the corresponding field.
From the Terraform version list, select the Terraform version that the template requires.
Click Save template information.
Enter the following parameter values based on your own IBM Cloud account information and then click Save.
- Add your IBM Cloud API key or create a new one by following the Creating an API key instructions.
- Enter any prefix for your resources.
- Add the name of your current IBM Cloud Functions namespace. If you don’t have one, follow the Creating an IAM-based namespace instructions.
- Add any value.
- Enter your Cloud Foundry organization name or create a new one by following the Creating orgs in the console instructions.
- Enter your Cloud Foundry organization space (subgroup) or create a new one by following the Creating spaces in the console instructions.
Go to the Settings tab and click Generate plan. This will help you to understand what is going to be created, updated, or deleted.
When you see that a new activity is executing on your workspace Activity page, click View log to get more information.
In the logs, you should see that four resources will be created, as the following screen capture demonstrates.
When you return to the Activity list, click Apply Plan to start creating the resources. When the creation is complete, the logs show that the resources were created successfully.
3. Create and configure IBM Cloud Continuous Delivery
Have you ever had a single code repository with different components in it, where you want to build, test, or deploy only the components that changed instead of repeating everything on every change? Executing the whole build all over again can be time-consuming and may not be necessary.

In this tutorial scenario, you have four Node.js functions ready to be deployed on IBM Cloud Functions. But you don’t want to build and deploy all of them every time. So, this pipeline checks each folder for code changes before building and deploying.
This pipeline contains several tasks, including repository clone, build, and deploy, which are described in the following sections.
The repository clone task performs Git cloning to bring the code to the workspace, which makes the code available to the next tasks.
```yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: repository-clone
spec:
  workspaces:
    - name: task-workspace
      mountPath: /working
  params:
    - name: app-dirs
    - name: git-repository
    - name: git-branch
  steps:
    - name: execute-script
      image: ibmcom/pipeline-base-image:2.9
      envFrom:
        - configMapRef:
            name: environment-properties
        - secretRef:
            name: secure-properties
      env:
        - name: appdirs
          value: $(params.app-dirs)
        - name: GIT_BRANCH
          value: $(params.git-branch)
        - name: GIT_REPO
          value: $(params.git-repository)
      command: ["/bin/bash", "-c"]
      args:
        - |
          cd /working
          echo "Cloning git repository"
          # get the right repo and branch
          if [ -z $GIT_BRANCH ]; then
            git clone -q $GIT_REPO .
          else
            git clone -q -b $GIT_BRANCH $GIT_REPO .
          fi
```
The build task only performs the build by using webpack. Notice that it receives two parameters: `APP_BASE`, the base folder of the functions, and `APP_FOLDER`, the folder of the function itself. The task uses the `git log` command to check whether the folder was changed. If the folder was changed, the task performs the build and exports an environment variable set to `true`; if not, the variable is set to `false`. Finally, the task writes that variable to its `is_changed` result.
```yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: build-function
spec:
  results:
    - name: is_changed
      description: app has changed
  workspaces:
    - name: source
      mountPath: /working
  params:
    - name: app-base
    - name: app-folder
  steps:
    - name: execute-script
      image: ibmcom/pipeline-base-image:2.9
      envFrom:
        - configMapRef:
            name: environment-properties
        - secretRef:
            name: secure-properties
      env:
        - name: APP_BASE
          value: $(params.app-base)
        - name: APP_FOLDER
          value: $(params.app-folder)
      command: ["/bin/bash", "-c"]
      args:
        - |
          cd /working
          if git log --format= -n 1 --name-only | grep -qw $APP_FOLDER; then
            echo "Folder changed"
            echo "Installing dependencies"
            cd $APP_BASE/$APP_FOLDER
            npm install
            echo "Building the function using webpack"
            npm run build
            export CHANGED_FOLDER=true
          else
            echo "Folder not changed"
            export CHANGED_FOLDER=false
          fi
          printf $CHANGED_FOLDER | tee $(results.is_changed.path)
```
The deploy task logs in to IBM Cloud through the `ibmcloud` CLI and executes the deployment script, which deploys the webpack build to IBM Cloud Functions.
```yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: deploy-app
spec:
  workspaces:
    - name: task-workspace
      mountPath: /working
  params:
    - name: target-region
    - name: package-name
    - name: function-name
    - name: space
    - name: organization
    - name: app-base
    - name: app-folder
  steps:
    - name: execute-script
      image: ibmcom/pipeline-base-image:2.9
      envFrom:
        - configMapRef:
            name: environment-properties
        - secretRef:
            name: secure-properties
      env:
        - name: REGION
          value: $(params.target-region)
        - name: PACKAGE_NAME
          value: $(params.package-name)
        - name: FUNCTION_NAME
          value: $(params.function-name)
        - name: ORG
          value: $(params.organization)
        - name: SPACE
          value: $(params.space)
        - name: APP_BASE
          value: $(params.app-base)
        - name: APP_FOLDER
          value: $(params.app-folder)
        - name: PIPELINE_APIKEY
          valueFrom:
            secretKeyRef:
              name: secure-properties
              key: apikey
      command: ["/bin/bash", "-c"]
      args:
        - |
          cd /working
          echo "Deploying to IBM Functions"
          ibmcloud login -a cloud.ibm.com -r $REGION -o $ORG -s $SPACE --apikey $PIPELINE_APIKEY
          source ./scripts/deploy-function.sh
```
If you check the complete pipeline file, you see that it contains many tasks, each of which references a task in the tasks file. For example, in the `create-functions` set of tasks, the `build-create` task is based on the `build-function` task explained earlier. The only difference is that, in the pipeline file, you assign values to the parameters and define the conditions for those tasks to run.
```yaml
- name: build-create
  taskRef:
    name: build-function
  runAfter:
    - git-repo-clone
  params:
    - name: app-base
      value: "functions"
    - name: app-folder
      value: "create-tickets"
  workspaces:
    - name: source
      workspace: pipeline-ws
```
Similarly, the `deploy-create` task is based on the `deploy-app` task, but it runs only when the `is_changed` result of the `build-create` task is `true`. So, if the `git log` command doesn’t show that the folder changed, this task does not execute.
```yaml
- name: deploy-create
  taskRef:
    name: deploy-app
  when:
    - input: $(tasks.build-create.results.is_changed)
      operator: in
      values: ["true"]
  workspaces:
    - name: task-workspace
      workspace: pipeline-ws
  params:
    - name: package-name
      value: $(params.package-name)
    - name: function-name
      value: "create-tickets"
    - name: target-region
      value: $(params.region)
    - name: space
      value: $(params.space)
    - name: organization
      value: $(params.organization)
    - name: app-base
      value: "functions"
    - name: app-folder
      value: "create-tickets"
```
To configure the pipeline, follow these steps:
Open the IBM Cloud DevOps toolchains dashboard.
From the Location list, select the region where you want to create your toolchain and then click Create toolchain.
On the Create a Toolchain page, select the Build your own toolchain option.
Enter the name for your toolchain and then click Create.
On your toolchain page, click Add Tool.
On the Add tool integration page, find and select the GitHub tool integration.
On the Configure GitHub page, update the fields with the specifications of your GitHub repository and click Create Integration.
Back on your toolchain page, click Add tool again.
On the Add tool integration page, find and select the Delivery Pipeline tool integration.
On the Configure Delivery Pipeline page, enter a name for the delivery pipeline, select Tekton from the Pipeline type list, and click Create Integration.
Back on your toolchain page, click Delivery Pipeline to configure the Tekton pipeline.
To configure the Tekton pipeline, you must first add its definitions. Click Add, select your configured repository, and enter the path to the pipeline definition files within the repository.
Look at the content of the files. If you are comfortable with them, click Save.
Go to the Triggers section and click Add trigger.
Select the defined EventListener and Worker.
In the Environment properties section, you must configure the following properties:
| Type | Description |
|------|-------------|
| Secure | API key for the toolchain to deploy on IBM Cloud Functions |
| Text | Git branch to clone |
| Text | Git repository to clone |
| Text | IBM Cloud organization |
| Text | Package name for the deployed actions (must match the name that you defined in IBM Cloud Schematics in Step 2) |
| Text | IBM Cloud region |
| Text | IBM Cloud space |
Click Run Pipeline. You should see all of the properties that you previously defined. Click Run to start the pipeline.
After a couple of seconds, you should see a new entry in the PipelineRuns section. Click that entry to view its status. You should see something similar to the following screen capture.
Go to the IBM Cloud Functions Actions page to verify that the actions deployed.
Congratulations, you just deployed a serverless API with Tekton! In this tutorial, you learned how to configure a DevOps pipeline by using Tekton to deploy a serverless API in a monorepo. You also took advantage of IBM Cloud Schematics to automate the creation of cloud resources. Repetitive tasks, such as application deployment and cloud infrastructure setup, can become a burden if they are not automated. The capabilities provided by Tekton and Terraform can aid you in your adoption of a DevOps culture.
Now that you have an application pipeline, you can go further and create an infrastructure pipeline as well. Managing your infrastructure within a pipeline helps you establish a standardized workflow to test and validate infrastructure changes before promoting them to production.