Overview

Skill Level: Any Skill Level

This article describes how to configure ELK in order to receive and view the data sent from IZPCA or IZDS via IZCDP.

Ingredients

There are different ways to send the data from IBM Z Decision Support or IBM Z Performance and Capacity Analytics off the z/OS platform, but I will describe how to send the data from their Db2 database (near real-time or historical) through IBM Z Common Data Provider to Elasticsearch Platform.

The first part of the article will cover the work that needs to be done on z/OS and the second part will cover the work that needs to be done on Elasticsearch platform.

I will use the following acronyms throughout these articles:

  • IZPCA – IBM Z Performance and Capacity Analytics (or IBM Z Decision Support)
  • IZCDP – IBM Z Common Data Provider
  • IZDS – IBM Z Decision Support
  • ELK – Elasticsearch Platform (Elasticsearch, Logstash and Kibana)

Architecture overview

The Shadower reads the data from the Db2 database using a JDBC driver. That data can be historical or near real-time. The Shadower then sends the data to the Data Streamer, which receives the data according to the policy and forwards it to the designated subscriber, in this case ELK.

You need a supported z/OS operating system and:

  • IBM Z Performance and Capacity Analytics 3.1 or IBM Z Decision Support 1.9
  • IBM Z Common Data Provider 2.1 or IBM Z Operations Analytics 4.1


You also need the Elastic Stack 7.8.1 components installed and started on any supported platform:

  • Logstash
  • Elasticsearch
  • Kibana


Always apply the latest maintenance for IZPCA or IZDS; you can find what is new here:

  • IZDS: https://www.ibm.com/support/pages/ibm-z-decision-support-v190-maintenance-and-new-function
  • IZPCA: https://www.ibm.com/support/pages/ibm-z-performance-and-capacity-analytics-v310-maintenance-and-new-function

You can check Part 1 where I described how to implement the Shadower to send IZDS or IZPCA data to IZCDP.

Step-by-step

  1. Installing ELK on Linux

    Download ELK

    Download the supported version of Elasticsearch, Logstash and Kibana:

    https://www.elastic.co/downloads

    At the moment IZPCA supports ELK version 7.8.1, so I downloaded these files into the folder /opt/ELK-7.8.1:

    • elasticsearch-7.8.1-linux-x86_64.tar.gz
    • kibana-7.8.1-linux-x86_64.tar.gz
    • logstash-7.8.1.tar.gz

    Requirements

    The Elastic Stack runs with the bundled version of OpenJDK (Java), which is the recommended way. If you require your own Java, you have to set the JAVA_HOME variable. If JAVA_HOME is already set to a Java that is not supported (not working), then clear it for the user that will run ELK:

    # export JAVA_HOME=
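
    As a quick sanity check of what the ELK user currently has (the bundled OpenJDK ships inside the Elasticsearch directory that is unpacked in the next step):

    # echo $JAVA_HOME
    # java -version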

    Installing Elasticsearch

    Unzip Elasticsearch

    # cd /opt/ELK-7.8.1/
    # tar -xzf elasticsearch-7.8.1-linux-x86_64.tar.gz
    # cd /opt/ELK-7.8.1/elasticsearch-7.8.1

     This directory is known as $ES_HOME

    Configuring Elasticsearch

    This step is optional if all components (Elasticsearch, Logstash, and Kibana) are on the same system and you are using localhost.

    Edit $ES_HOME/config/elasticsearch.yml

    cluster.name: elk-rhel72
    node.name: node-rhel72
    network.host: rhel72.ibm.com
    discovery.seed_hosts: ["rhel72.ibm.com"]
    cluster.initial_master_nodes: ["node-rhel72"]

    That is also the place where you can change the port for Elasticsearch. The default is: 9200.

    The default network.host (hostname) is localhost, that is 127.0.0.1, for all Elastic Stack components, so if you are not exposing them to an external machine, it is not required to change the network setup.
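
    If you do need a different HTTP port, the setting in the same elasticsearch.yml is http.port, shown here with its default value:

    http.port: 9200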

    Installing Kibana 

    Unzip Kibana

    # cd /opt/ELK-7.8.1/
    # tar -xzf kibana-7.8.1-linux-x86_64.tar.gz
    # cd /opt/ELK-7.8.1/kibana-7.8.1-linux-x86_64

    This directory is known as $KIBANA_HOME

    Configuring Kibana

    Open config/kibana.yml in an editor

    Set elasticsearch.hosts to point at your Elasticsearch instance if it is on a different host, or if any of the components will be exposed to an external machine, such as the Problem Insight server:

    #server.port: 5601
    server.host: "rhel72.ibm.com"
    elasticsearch.hosts: ["http://rhel72.ibm.com:9200"] 

    Installing Logstash 

     Unzip the files

    # cd /opt/ELK-7.8.1/
    # tar -xzf logstash-7.8.1.tar.gz
    # cd /opt/ELK-7.8.1/logstash-7.8.1/

    Configuration

    We will configure it later, in the Configuring Logstash step (step 3).

  2. Downloading and installing the IZPCA dashboard files

    The files that you have to download are part of the IZPCA SMP/E installation file system: /usr/lpp/IBM/IZPCA/v3r1m0/IBM

    These files are tar files. When you extract their contents, you get:

    DRLWING (Ingestions)

    • B_IZDS_Input.lsh
    • E_IZDS_Filter.lsh
    • Q_IZDS_Output.lsh

    DRLWECPA (IZPCA reporting package)

    • IZDS_Indexes.ndjson
    • IZDS_Visualizations.ndjson
    • IZDS_Dashboards.ndjson

    You can extract these files with tar -xvfo source -C destination on z/OS and then FTP the extracted files in binary to the target platform, or you can FTP the tar files in binary and extract them on the target platform.

    Let’s assume the files are extracted in /opt/ELK-7.8.1/IZPCA directory.
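
    As a minimal sketch of the second option, after transferring the two tar files in binary (file names as listed above; the target directory is the one we just assumed):

    # mkdir -p /opt/ELK-7.8.1/IZPCA
    # cd /opt/ELK-7.8.1/IZPCA
    # tar -xvf DRLWING
    # tar -xvf DRLWECPA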

  3. Configuring Logstash

    Place the ingestion files in a Logstash configuration directory, for example izpca-config:

    # su - virtuser
    # mkdir /opt/ELK-7.8.1/izpca-config
    # cp /opt/ELK-7.8.1/IZPCA/*.lsh /opt/ELK-7.8.1/izpca-config

    Edit pipelines.yml (in the Logstash config directory) to add the izpca pipeline:

    - pipeline.id: izoa
    #   queue.type: persisted
      path.config: "/opt/ELK-7.8.1/zoa-config/*.conf"
    - pipeline.id: izpca
    #   queue.type: persisted
      path.config: "/opt/ELK-7.8.1/izpca-config/*.lsh"
     

    Edit B_IZDS_Input.lsh and Q_IZDS_Output.lsh

    Set the correct port in B_IZDS_Input.lsh:

    input {
      tcp {
        port => 8082
        codec => json
        tags => [ "izds_cdpz_tcpip" ]
      }
    }

    Set the correct Elasticsearch host name in Q_IZDS_Output.lsh:

    output {
      if "izds_cdpz_tcpip" in [tags] or "izds_cdpz_file" in [tags] or "izds_raw_file" in [tags] {
        elasticsearch {
          hosts => ["rhel72.ibm.com:9200"]
          document_id => "%{id}"
          index => "%{TABLE}-%{+YYYY.MM}"
        }
      }
    }
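
    Before starting Logstash (step 6), you can optionally syntax-check the edited pipeline files; the --config.test_and_exit option parses the configuration and exits without starting a pipeline (paths as used above):

    $ /opt/ELK-7.8.1/logstash-7.8.1/bin/logstash -f /opt/ELK-7.8.1/izpca-config --config.test_and_exit
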
  4. Starting Elasticsearch and Kibana

    ELK has to run under a user other than root, for example virtuser.

    As root, change the ownership of the folder where ELK is installed:

    # chown -R virtuser /opt/ELK-7.8.1
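
    Note that when Elasticsearch binds to a non-loopback address, as in the elasticsearch.yml shown earlier, its bootstrap checks are enforced. Two limits that typically need raising on a stock Linux system are the mmap count (set as root) and the open-files limit for the ELK user (for example in /etc/security/limits.conf); the values below are the minimums Elasticsearch expects:

    # sysctl -w vm.max_map_count=262144

    virtuser  soft  nofile  65535
    virtuser  hard  nofile  65535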

    Log on as virtuser.

    Start Elasticsearch:

    $ /opt/ELK-7.8.1/elasticsearch-7.8.1/bin/elasticsearch
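
    Once Elasticsearch reports that it has started, you can confirm it is reachable from another terminal (hostname and port as configured in elasticsearch.yml):

    $ curl 'http://rhel72.ibm.com:9200/_cluster/health?pretty'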

    Start Kibana:

    $ /opt/ELK-7.8.1/kibana-7.8.1-linux-x86_64/bin/kibana

    To test Kibana, point your browser at http://hostname:5601
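
    If you prefer a command-line check, Kibana also exposes a status API on the same port (hostname as set in kibana.yml):

    $ curl http://rhel72.ibm.com:5601/api/status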

  5. Configuring Kibana

    Sign into Kibana if authentication is enabled.

    http://hostname:5601

    Navigate to Management, then select Saved Objects.

    Import the provided IBM Z Performance and Capacity Analytics newline-delimited JSON files in the following sequence:

    1. IZDS_Indexes.ndjson
    2. IZDS_Visualizations.ndjson
    3. IZDS_Dashboards.ndjson
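
    If you prefer to script the import rather than use the Kibana UI, the saved objects import API can load the same files; run it once per file, in the order above (hostname as configured earlier, file paths from the directory where the tar files were extracted; add -u user:password if authentication is enabled):

    $ curl -X POST 'http://rhel72.ibm.com:5601/api/saved_objects/_import' \
        -H 'kbn-xsrf: true' \
        --form file=@/opt/ELK-7.8.1/IZPCA/IZDS_Indexes.ndjson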

     

    Configuring Kibana Homepage

    This is the IZPCA Kibana homepage:

    http://hostname:5601/app/kibana#/dashboard/d4d23d00-f0bb-11ea-a417-2f876e2492bd

  6. Starting Logstash

    Log on as a user other than root, for example virtuser.

    Start Logstash:

    $ /opt/ELK-7.8.1/logstash-7.8.1/bin/logstash
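
    To smoke-test the pipeline before the z/OS side is sending data, you can push a hand-crafted JSON record to the Logstash TCP port and check that a matching index appears. The record below is made up for illustration; only the id and TABLE fields matter to the output configuration in step 3, and the tag is added by the tcp input itself (netcat options vary between variants; the intent is simply to send one line and close the connection):

    $ echo '{"id":"test-0001","TABLE":"drl_test"}' | nc -w 1 rhel72.ibm.com 8082
    $ curl 'http://rhel72.ibm.com:9200/_cat/indices/drl_test-*?v'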

  7. Sample reports

    LPAR 4 HOUR MSU Utilization by Day

    LPAR MSU Hourly Comparison by Day of Week

  8. Next steps

    Now you can check Part 1, where I described how to implement the Shadower to send IZDS or IZPCA data from the Db2 database to IBM Z Common Data Provider.
