
Explore the key features of Red Hat OpenShift Pipelines, and learn how you can execute a pipeline.


One of the interesting additions to Red Hat OpenShift 4 is OpenShift Pipelines, which I will introduce in this blog. I’ll start by reviewing DevOps, and then I will jump right into OpenShift Pipelines, explaining what it is and how to implement it.

DevOps review

DevOps, as the name indicates, is an approach that encourages collaboration between development (dev) and operations (ops) teams. With the DevOps approach, working on projects becomes much easier because it promotes agile and lean software delivery across lines of business.

The CI/CD pipeline is the backbone of the DevOps process, and it is an iterative process that consists of the following:

  • Continuous integration (CI): Continuous integration focuses on coding, building, integrating, and testing.
  • Continuous deployment or continuous delivery (CD): Continuous deployment focuses on automating releases, such as bug fixes and new features, so they reach users safely and as soon as possible, while continuous delivery can include CI but mainly focuses on product releases.

The process of the CI/CD pipeline starts with making changes and pushing them into the repository, then building and testing code, and finally reviewing, deploying, and delivering to users. This is an iterative process that happens whenever changes and updates are made.

The DevOps approach proves to be successful because it improves collaboration between teams, which makes it faster to make changes and fix bugs, so that you can deploy your applications within minutes. With the DevOps approach, teams are able to deploy with confidence by automating tasks and making sure the same steps are executed automatically. Finally, your team can leverage the automated testing before deploying code to a specific environment.

Cloud-native DevOps

When you hear the term cloud-native DevOps, you might think at first that it is cloud-based, but in fact the term refers to taking advantage of automation and scalability offered by containers and Kubernetes.

Cloud-native DevOps involves continuous improvement, automation, cross-functional teams, and closer alignment with business needs and customer expectations.

Despite it being called cloud-native, projects don’t necessarily need to be deployed on the cloud — they can also be deployed on-premises or on virtual servers. It is more about the principles and process, not where the project is. Characteristics of cloud-native DevOps include:

  • Each task in the pipeline has its own lifecycle (in other words, when executed, it runs as its own container)
  • Built for container applications and run on Kubernetes
  • Designed with microservices and distributed teams in mind

One of the best-known tools for building cloud-native pipelines is Tekton. Tekton is an open source framework for Kubernetes that aims to build cloud-native CI/CD pipelines quickly. With Tekton, you can deploy your projects across multiple cloud providers or hybrid environments easily.

OpenShift Pipelines

OpenShift Pipelines is an operator that is based on Tekton to build Kubernetes-style CI/CD. With OpenShift Pipelines, you can run each step of the CI/CD pipeline in its own container. You can also scale each step of the pipeline independently to meet the demands of the pipeline. OpenShift Pipelines builds on the Tekton building blocks and provides a CI/CD experience through tight integration with OpenShift and Red Hat developer tools.
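
On an OpenShift 4 cluster, the operator is typically installed from OperatorHub in the web console, which creates a Subscription behind the scenes. The following is a rough sketch of such a Subscription; the channel and package name are assumptions, so verify the exact values in OperatorHub on your cluster.

apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: openshift-pipelines-operator
  namespace: openshift-operators
spec:
  channel: latest                          # assumed channel name; check OperatorHub
  name: openshift-pipelines-operator-rh    # assumed package name; check OperatorHub
  source: redhat-operators
  sourceNamespace: openshift-marketplace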

OpenShift Pipelines has several features, such as:

  • Kubernetes-style pipelines: Create pipelines using standard Kubernetes Custom Resource Definitions (CRDs) that are portable across Kubernetes distributions.
  • Runs serverless: Create and run pipelines without the need for a CI/CD server to manage or maintain.
  • Deploys to multiple platforms: Your pipelines run on Kubernetes, but you can deploy to multiple Kubernetes clusters, virtual machines, and serverless platforms from the pipeline.
  • Builds images with Kubernetes tools: You can build images with tools like Source-to-Image (S2I), Buildah and Dockerfiles, Jib, Kaniko, and more.
  • Developer tools: You can interact with pipelines using the CLI tools and integrate with the OpenShift developer console and IDE plugins.

A pipeline in OpenShift Pipelines is built from the same components as a Tekton pipeline. A pipeline consists of one or more tasks to be performed. A task is a set of steps to be executed, such as building a container image or pushing changes to a project, and it can be reused across pipelines. A PipelineResource represents the inputs and outputs of a pipeline or a task. When you execute an instance of the pipeline, you get a PipelineRun, which consists of a number of TaskRuns.
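
To make the idea of a task concrete, the following is a minimal sketch of a standalone Task; the task name, parameter, image, and script are hypothetical and are not the tasks referenced in the pipeline snippet later in this post.

apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: say-hello                  # hypothetical task name
spec:
  params:
  - name: greeting
    type: string
    default: "Hello from Tekton"
  steps:
  - name: print-greeting           # each step runs as its own container in the task's pod
    image: registry.access.redhat.com/ubi8/ubi-minimal
    script: |
      #!/bin/sh
      echo "$(params.greeting)"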

Process of CI/CD pipeline

To create a pipeline, you need to perform the following steps:

  • Create or install tasks.
  • Create a pipeline and PipelineResources to define your application’s delivery pipeline.
  • Create a PipelineRun to instantiate and invoke the pipeline.

Tekton delivery pipelines are defined in YAML files as a set of Kubernetes resources. You can edit those YAML files to change the behavior of a pipeline. The following is a snippet of a YAML file for a simple pipeline.
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: mypipeline
spec:
  # Parameters and the shared workspace that the tasks below reference
  params:
  - name: deployment-name
    type: string
  - name: IMAGE
    type: string
  workspaces:
  - name: shared-workspace
  tasks:
  - name: task1
    taskRef:
      name: task1
    workspaces:
    - name: source
      workspace: shared-workspace
  - name: task2
    taskRef:
      name: task2
    workspaces:
    - name: source
      workspace: shared-workspace
    params:
    - name: deployment
      value: $(params.deployment-name)
    - name: IMAGE
      value: $(params.IMAGE)
    runAfter:
    - task1
The code snippet above is an example of what a CI/CD pipeline definition looks like. In the pipeline, you refer to the tasks that you have installed in the namespace using taskRef, and you provide information such as the task name, parameters, and workspaces (the inputs and outputs each task works with). runAfter lists the tasks that must complete before the current one runs, so task2 runs after task1 in this example.
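
To instantiate and invoke the pipeline, you create a PipelineRun that supplies the parameter values and the workspace the tasks share. The following is a minimal sketch, assuming the mypipeline definition above and a hypothetical PersistentVolumeClaim named source-pvc.

apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  name: mypipeline-run-1
spec:
  pipelineRef:
    name: mypipeline
  params:
  - name: deployment-name
    value: myapp                   # hypothetical deployment name
  - name: IMAGE
    value: quay.io/example/myapp   # hypothetical image reference
  workspaces:
  - name: shared-workspace
    persistentVolumeClaim:
      claimName: source-pvc        # hypothetical PVC shared by the tasks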

You can write your own YAML file or generate it when you create your application to build your own pipeline. When you build the pipeline, the pipeline overview will look like the following:

Pipeline overview in the OpenShift web console

Once the pipeline is created, you can trigger it to execute the tasks specified in the pipeline. If you would like to get hands-on with OpenShift Pipelines, I highly recommend the Build a CI/CD Tekton Pipeline for deploying a Node.js application tutorial for OpenShift.

Summary

In this blog post, you learned about DevOps, cloud-native DevOps, and the features of OpenShift Pipelines, including its components and how a pipeline is executed.

This marks the end of this blog series but the beginning of a fruitful journey with OpenShift. In this series, you learned the basic concepts of OpenShift 4 around architecture, operators, the web console, and pipelines. Of course, there’s much more to OpenShift, and I hope this series paves the way for you into the world of cloud native with OpenShift. From here, take a look at Red Hat OpenShift on IBM Cloud for more resources to help you on your path.