Skill Level: Any

It is recommended that you read the entire recipe before installing software or making changes to your systems.

In this tutorial, we will use a Docker image to show you how you can configure the Elastic Stack (Elasticsearch, Logstash, and Kibana, or ELK) to analyze IMS Connect event data supplied by the IMS Connect Extensions feed.


To experiment with the Elastic Stack and our sample data, simply complete steps 1 and 2.

To live-stream IMS Connect transaction performance data from your own systems, complete steps 3 and 4 as well.

For general information about the Elastic Stack, see https://www.elastic.co/products/.


  1. Download and configure the sebp/elk Docker image containing the Elastic Stack

    In this tutorial, we will build a Docker image based on the commonly used “sebp/elk” image (available on Docker Hub), which conveniently packages Elasticsearch, Logstash, and Kibana (also known as the ELK stack). First, we need to install Docker itself: navigate to https://www.docker.com/products/docker-desktop to download and install the product.

    Next, we will prepare a configuration file for Logstash. The configuration file tells Logstash how to ingest the data coming in from the IMS Connect Extensions feed. Create a new file named “10-cex.conf” with the following contents:


    input {
      tcp {
        port => 5044
        codec => json_lines
      }
    }
    filter {
      date {
        match => ["time", "ISO8601"]
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost"]
        index => "cex-%{type}-%{+YYYY.MM.dd}"
        # The following manage_template setting assumes that
        # you have created an index template for the cex-* index pattern.
        manage_template => false
      }
    }

    We won’t go into all the details here, but in a nutshell, the configuration file above tells Logstash to listen for data in JSON Lines format on port 5044 and that event timestamps are in ISO 8601 format. To learn more about the other settings shown here, visit https://www.elastic.co/guide/en/logstash/current/configuration.html.
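    To make the wire format concrete, here is a minimal sketch in Python of what one event on that TCP port looks like. The field names (“time”, “type”, “resptime”) are illustrative assumptions based on the fields used elsewhere in this tutorial, not the real IMS Connect Extensions feed layout.

```python
import json

# Illustrative event only: the real feed defines its own field layout.
event = {"time": "2020-02-05T14:20:25.200Z", "type": "sample", "resptime": 12}

# The json_lines codec expects one complete JSON object per line,
# terminated by a newline character.
line = json.dumps(event) + "\n"
print(line, end="")
```

    Once the container is running (later in this step), a line like this could be delivered to the Logstash input over a plain TCP socket, for example with `socket.create_connection(("localhost", 5044)).sendall(line.encode())`.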

    Next, we will create a Dockerfile that will be used to construct our new Docker image. To do this, create a new file named “Dockerfile” (no extension) with the following contents:


    FROM sebp/elk
    # Remove all existing pipeline configuration files from sebp/elk
    RUN rm /etc/logstash/conf.d/*
    # Add the pipeline configuration file for the IMS Connect Extensions feed
    ADD 10-cex.conf /etc/logstash/conf.d/10-cex.conf 

    The Dockerfile above will instruct Docker to create an image based on sebp/elk, to remove its default configuration files, and to add our new configuration file, 10-cex.conf.

    Now, let’s download the sebp/elk image from Docker Hub. Open a command prompt (for example, PowerShell on Windows) and enter the following command (without the leading > symbol, of course):

    > docker pull sebp/elk

    The image is quite large, so the download may take a while.

    After download is complete, we are ready to create the new image. To do this, use a command prompt again to navigate to the folder containing the files you created and enter the following command:

    > docker build -t cex/elk .

    Docker will load your Dockerfile and then use the sebp/elk image you pulled from Docker Hub to create the image. The -t option allows us to give the image a name, in this case “cex/elk”. After it completes, enter the following command to see the results:

    > docker images


    You should be able to see two images: sebp/elk (our download), and cex/elk (the one we just created).

    Now we will create and start a container based on our new image. To do this, we will enter the following command:

    > docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -it --name elk cex/elk


    What do the parameters mean?

    The -p parameter maps ports inside the container to ports on your local machine. This example uses the following ports:

    5601 (Kibana web interface)
    9200 (Elasticsearch)
    5044 (Logstash – note that the port matches what we specified in our 10-cex.conf file)

    The --name parameter provides a short name that we can use to refer to our container, in this case “elk”.

    For more information on these parameters, see https://elk-docker.readthedocs.io/#usage


    It may take a minute or two for the container to start. 

    While we are waiting, we can optionally open a shell into the Docker container to verify that the configuration file is in the right place. To do this, type the following (you may need to open a new command prompt):

    > docker exec -it elk /bin/bash

    > cd /etc/logstash/conf.d/

    > ls

    You should see “10-cex.conf” in the folder. To exit the shell, type:

    > exit

    You can start working with Kibana by opening a web browser and navigating to localhost:5601. If you browse to this location and receive a message that Kibana has not yet started, simply wait a moment and try again. If you see the Kibana main screen, you are now ready to import some data and create a visualization.

    Tip: You can check the status of your Docker container by typing docker ps, and see what is happening in the logs by typing docker logs elk. Containers can be started and stopped with the docker start and docker stop commands. For a complete list of commands, see https://docs.docker.com/engine/reference/commandline/docker/.

  2. Import and visualize some IMS Connect Extensions sample data using Kibana

    To start working in Kibana, we will import some sample data. To do this, use a simple text editor (such as Notepad on Windows) to create a new file named “cex-sample.json” containing the following sample data. In this sample of five event records, you can see the format of the data that is supplied by the IMS Connect Extensions feed.



    What do the fields mean?

    For a complete description of the fields supplied by the IMS Connect Extensions feed, see https://www.ibm.com/support/knowledgecenter/en/SSAVHV_3.1.0/cexu-ca20-fields.html

    After you have created the .json file, return to Kibana (http://localhost:5601) to start the import. From the Navigation pane (located on the left-hand side of the main Kibana interface), select Machine Learning. In the Import data section, select Upload File.


    Select “cex-sample.json”. After analysis of the data is complete, the Data Visualizer will summarize the results for you.


    Scroll down and click Import.


    The Import data panel is displayed. In the Index name field, type “cex-sample”. Check that Create index pattern is selected, and then click Import.


    In the Navigation pane, select Visualize. Click Create new visualization.


    Select a visualization type, for example, a Line chart, and then select “cex-sample” as your index data source.

    You can now use the panel on the left-hand side to select your X and Y axis values. For example, to plot a timeline of IMS Connect events and their response times:

    1. In the Metrics panel, select Y-axis with an Aggregation value of Average and a Field value of resptime.
    2. In the Buckets panel, select X-axis with an Aggregation value of Date Histogram and Field value of @timestamp.
    3. Click Apply Changes (the blue and white “play” button at the top of the panel).
    4. Select a date range (top right corner) that encapsulates the date range of our sample data, for example, from 2020-02-05 14:20:25.200 to 2020-02-05 14:20:25.600.
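    Under the hood, this chart is just a bucketed average. As a rough sketch of what the Date Histogram and Average aggregations compute together, assuming made-up events and a hypothetical 100 ms bucket interval:

```python
from collections import defaultdict
from datetime import datetime

# Illustrative events only; timestamps and values are invented.
events = [
    {"@timestamp": "2020-02-05T14:20:25.210Z", "resptime": 10},
    {"@timestamp": "2020-02-05T14:20:25.290Z", "resptime": 30},
    {"@timestamp": "2020-02-05T14:20:25.410Z", "resptime": 20},
]

def bucket_key(ts, interval_ms=100):
    # Date Histogram: floor each timestamp to the start of its interval.
    dt = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")
    ms = int(dt.timestamp() * 1000)
    return ms - ms % interval_ms

buckets = defaultdict(list)
for e in events:
    buckets[bucket_key(e["@timestamp"])].append(e["resptime"])

# Average aggregation: mean resptime per bucket (the Y-axis value).
averages = {k: sum(v) / len(v) for k, v in sorted(buckets.items())}
print(averages)
```

    The first two events fall in the same 100 ms bucket and are averaged into a single point, which is why a chart can show fewer points than events.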

    In the image below, you can see this configuration. Notice that there are 5 points on the line; these correspond to our five lines of .json sample data.


    You can also create dashboards with multiple charts by selecting the Dashboard option (accessible via the Navigation pane). In the following example, you can see a much larger set of sample data and what is possible using a combination of different chart types and fields available via the IMS Connect Extensions feed. To see the real benefit, we need to pull in some real data.


    Tip: To remove the sample data from the ELK stack, go to the Navigation pane, select Management, and then select Index Management. Select “cex-sample”, click Manage index, and then click Delete index.

  3. Configure Elasticsearch with an index template

    If we want live data from the IMS Connect Extensions feed, we need to complete our setup of the Elastic Stack.

    We need to configure Elasticsearch with an index template that applies to data supplied by the IMS Connect Extensions feed. Return to Kibana. In the Navigation pane, select Management. Under the Elasticsearch heading, select Index Management. Select Index Templates, and then click Create a template.

    In the Name field, type “cex”. In the Index patterns field, type “cex-*”.


    Click Next, and then click Next again to get to Mappings. Insert the following text into the Dynamic Templates tab:

        [
          {
            "string_fields": {
              "match": "*",
              "match_mapping_type": "string",
              "mapping": {
                "type": "keyword"
              }
            }
          }
        ]

    Click Next and then Next again to get to Review template. Click Request. The template should look like this:


    Click Create template. The “cex” template will be displayed in the Index Templates section of the Index Management panel.
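    If you prefer scripting to the Kibana UI, the same settings can be expressed as a single JSON body. The sketch below builds the equivalent structure; note that the exact REST endpoint and outer wrapping differ between Elasticsearch versions, so treat this as illustrative rather than a drop-in request body.

```python
import json

# Equivalent of the UI settings above: the template applies to any index
# matching "cex-*" and maps all string fields as keyword, so they can be
# used directly in aggregations and visualizations.
template = {
    "index_patterns": ["cex-*"],
    "mappings": {
        "dynamic_templates": [
            {
                "string_fields": {
                    "match": "*",
                    "match_mapping_type": "string",
                    "mapping": {"type": "keyword"},
                }
            }
        ]
    },
}
print(json.dumps(template, indent=2))
```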



    You are now ready to start the IMS Connect Extensions feed.

  4. Activate the publisher API and submit JCL to start an IMS Connect Extensions feed

    The IMS Connect Extensions feed is a client of the IMS Connect Extensions publisher API. The feed uses the publisher API to get IMS Connect events.

    For each IMS Connect system that you want to use as the feed source, select the Activate Publisher API option in the IMS Connect Extensions ISPF dialog. For further instructions on how to do this, see Starting an IMS Connect Extensions feed in the IBM Knowledge Center.

    After you have done this, you will need to start a feed job in the LPAR where your IMS Connect systems reside.

    The following JCL defines a feed that forwards data from three IMS Connect systems: ICONP01, ICONP02, and ICONP03. The feed consists of selected fields in JSON Lines format sent over unsecured TCP/IP (no SSL/TLS) to port 5044 on the host named “analytics”. The host is running Logstash configured with the TCP data input described in Step 1 of this tutorial.

    //SYSIN    DD *


    To visualize the data, simply return to Kibana. In the Navigation pane, select Visualize, and then click Create new visualization, in the same way you did in Step 2; this time, select your live data instead of the sample data.

    Happy trails :)
