Overview

Skill Level: Beginner

This recipe introduces the steps necessary to create IBM Cognos Analytics dashboards for IoT event data stored in an IBM Db2 Warehouse on Cloud data warehouse. You will learn how to create aggregated views, reporting data modules, dashboards and visualizations.

Step-by-step

  1. Introduction

    The recipe ‘Analyze Elevator device data using IBM Watson Analytics‘ showed you how you can analyze elevator events using IBM Watson Analytics – a self-serve reporting tool intended for use by regular (non-technical) business users. Another recipe titled ‘Create data warehouse for reporting on Elevator device data using DB2 Warehouse on Cloud’ showed how you could construct a data warehouse for reporting by business analytics tools using IBM Db2 Warehouse on Cloud (previously named IBM dashDB for Analytics).

    In this recipe we shall go through some basic steps using IBM Cognos Analytics to build dashboards and visualizations over elevator device data stored in the DB2 Warehouse on Cloud data warehouse. Unlike with IBM Watson Analytics, technical skills will be required to perform these tasks – in particular skills for creating data warehouses, SQL statements, Cognos reporting data modules and dashboards.

    In this recipe you will create two dashboards that are of value when dealing with elevators that may enter maintenance mode if the motor temperature exceeds 200 degrees. One dashboard will show the location of the elevators, the elevator status (i.e. whether it is in operation or not), the current motor temperature and the total number of maintenance stops for each elevator over time:

    01.1-Elevator-Health-Dashboard

    The second dashboard will show more details regarding the motor temperature, such as the average and maximum motor temperature over the whole period, as well as the maximum and average motor temperature day by day:

    01.2-Motor-Temperatures-Dashboard

    The recipe will take you through the following basic steps to analyze the device data:

    1. Sign in to IBM Cognos Analytics and get the IBM DB2 Warehouse on Cloud credentials needed to connect to the data warehouse using IBM Cloud.
    2. Create the database views to be used as a basis for reporting using the IBM DB2 Warehouse on Cloud console available in IBM Cloud.
    3. Establish a connection in IBM Cognos Analytics to the DB2 Warehouse on Cloud data warehouse.
    4. Create the Cognos data module needed for reporting over elevator data using IBM Cognos Analytics.
    5. Import a spreadsheet file that stores information about the location of the elevators and add it to the data module using IBM Cognos Analytics.
    6. Create the dashboard showing elevator health using IBM Cognos Analytics.
    7. Create the dashboard showing motor temperatures using IBM Cognos Analytics.

    So in essence there are a few more steps related to setting things up before reporting can start compared to using IBM Watson Analytics. IBM Watson Analytics is clearly intended to be used by business professionals for self-serve reporting. IBM Cognos Analytics goes beyond that by providing an enterprise BI platform for governed data discovery and managed reporting that automates the creation of reports and dashboards. 

  2. Architecture

    Before you start it is important to recall some details on the data flow and the architecture of the elevator simulator application and its backend components for handling device data:

    01.4-Service-Architecture

    The elevator simulator sends the data to the Watson IoT service, which in turn stores the data in Cloudant NoSQL DB using e.g. the Historic Data Storage Extension or a specific Node-RED flow. From Cloudant the data is then synchronized to the IBM DB2 Warehouse on Cloud data warehouse using Cloudant Analytics. Another option is to use Node-RED flows to store the information directly in the data warehouse.

    All 3 approaches represent a fast path for getting the data into the data warehouse for subsequent development of dashboards and reports. The consequence is that the device information is represented at a very detailed level in the data warehouse: it is basically time series information generated every 5 seconds. This is far too detailed for e.g. the second dashboard. What we need for reporting on this dashboard is filtered and/or aggregated data. For example, the charts showing the average and maximum motor temperatures above will be based on data aggregated on a per-day basis. As for the current elevator temperature and state in the first dashboard, the situation is a bit different. This dashboard requires the most current event data for each elevator, which means that the information from the time series data will need to be filtered.

    In this recipe we have taken the approach of providing the aggregated and filtered view of the data using database views defined in terms of appropriate SQL statements. These views will then be used as a basis for defining the reports. However, in IBM Cognos Analytics reporting usually takes place through so-called data modules (also referred to as data models) that provide abstractions over the physical model of the database. Data modules allow you to name the tables and columns appropriately and to define e.g. aggregation functions, joins or derived computations of relevance to reporting.

    Consequently we will be working with a layered architecture for defining the dashboards as shown in the figure below:

    01.3-Architecture

    The database tables represent event data exactly as it is sent to the IoT platform, in terms of e.g. device id (DEVICEID), timestamp (TIMESTAMP) and motor temperature (MOTORTEMP). In contrast, the view will for example define the information on a per-day basis and in terms of the average (AVGMOTORTEMP) and maximum (MAXMOTORTEMP) motor temperature on the given day. On the next layer a Cognos data module is defined, which in turn will be used for constructing the dashboards and the visualizations.

    The current approach of using views to make information rapidly available for reporting allows experimentation with the dashboards and a fast iteration cycle whenever it turns out that a view definition needs to be changed due to lessons learned or changing dashboard requirements. In real life one would probably not duplicate information 1-1 from the landing zone to the data warehouse. Instead, Extract-Transform-Load (ETL) jobs would need to be developed to filter and aggregate the required information before storing it in the data warehouse at regular intervals for reporting. Defining ETL jobs and data tables for storing the aggregated data would however require more work up front before prototyping of the dashboards could start. Initially it is therefore faster to define the views over the raw event data and then use these aggregated and filtered views for reporting, as shown in the figure above. Later on we could introduce the required ETL jobs and define additional database tables for storing the aggregated data. The views could then be reused, but their definitions would need to be changed to take the data from the new tables. However, as long as the schema of a view does not change, this has no impact on the definition of the reports.
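    Should ETL jobs and pre-aggregated tables be introduced later, a view can simply be redefined on top of the new table while keeping its schema. The following sketch assumes a hypothetical table ELEVATOR_EVENTS_DAILY_AGG loaded by such an ETL job (DB2 supports CREATE OR REPLACE VIEW for this purpose):

    -- Sketch: redefine the view to read from a pre-aggregated table.
    -- The column list (the view's schema) stays the same, so the
    -- data modules and reports built on top of it are unaffected.
    CREATE OR REPLACE VIEW DASH6769.VW_ELEVATOR_EVENTS_BY_DAY (
        DEVICEID, DEVICETYPE, MINMOTORTEMP, AVGMOTORTEMP, MAXMOTORTEMP, DATE
    ) AS
    SELECT DEVICEID, DEVICETYPE, MINMOTORTEMP, AVGMOTORTEMP, MAXMOTORTEMP, DATE
    FROM DASH6769.ELEVATOR_EVENTS_DAILY_AGG;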

    In this recipe we shall work with two data warehouse tables: ELEVATOR_EVENTS, which contains all the device data for the elevators, and ELEVATOR_ALERTS_NR, which contains just the alerts generated when the motor temperature exceeds the threshold. The table ELEVATOR_EVENTS was created by following the path Node-RED -> Cloudant NoSQL DB -> DB2 Warehouse on Cloud. For more details see the recipe ‘Store Elevator device data and alerts in Cloudant NoSQL DB‘. The table ELEVATOR_ALERTS_NR was, on the other hand, created directly by having a Node-RED flow save data to DB2 Warehouse on Cloud. For more details on that approach see ‘Create data warehouse for reporting on Elevator device data using DB2 Warehouse on Cloud‘.

     

  3. Getting Started

    Before you can start you will need to get access to IBM Cognos Analytics and get hold of the credentials for the DB2 Warehouse on Cloud service that you created in the recipe “Create data warehouse for reporting on Elevator device data using DB2 Warehouse on Cloud“.

    You can register with IBM Cognos Analytics by going through the following steps (you will need your IBM ID for the login):

    1. Click the link IBM Cognos Analytics.
    2. Click Sign In in the upper right corner:
      02.1-Sign-In-Cognos
    3. Enter your IBM ID and password in the following dialogs.
    4. This process will eventually take you to the welcome page:
      02.2-Cognos-Analytics

    The welcome page contains links to tutorials and documentation. The toolbar to the left contains commands for searching, viewing your assets, creating new assets, uploading files and managing connections. On the top toolbar there is a Welcome button that will always allow you to get back to the welcome page.

    Next you will need to launch the dashboard of the DB2 Warehouse on Cloud service that was created in the recipe titled ‘Create data warehouse for reporting on Elevator device data using DB2 Warehouse on Cloud‘ and then get hold of the credentials for the service. To achieve this, do the following:

    1. In the IBM Cloud dashboard, select the data lake application named e.g. elevator-datalake-<your name> that you created in the previous recipe.
      02.3-CF-Applications
    2. Click the Connections tab, then click the View credentials button for the DB2 Warehouse on Cloud (dashDB) service.
      02.4-Connections
    3. Copy the credentials and store them in a file on your local computer for later use.
      02.5-Credentials-1
    4. Then close the window by clicking the X button.
    5. Back in the Connections tab, select the IBM DB2 Warehouse on Cloud (dashDB) service to open up the dashboard for the service.
      02.6-dashDB-Service
    6. Click the OPEN button to open the DB2 Warehouse on Cloud console.
      02.7-DashDB-Portal
    7. In the DB2 Warehouse on Cloud console, click Explore to view the existing tables.
      02.8-DashDB-Explore
    8. Select the schema named after your dashDB user account, i.e. DASH<nnnn> (in the picture above DASH6769).

    Note the relevant schema and table names for later use. The schema name and the tables ELEVATOR_EVENTS and ELEVATOR_ALERTS_NR will serve as a basis for defining the views in the next section, which will then later be used to define the data modules and dashboards.

  4. Create database views

    Reporting will be done on the basis of filtered and aggregated views of the device data. In this section you will get hold of SQL view definitions from a GitHub repository and then run the SQL statements using the IBM DB2 Warehouse on Cloud console to create the views.

    First, download the required SQL files from GitHub:

    1. Click the link “GitHub/EinarKarlsen/iot-create-dashdb-view”.
      03.1-GitHubRepository-1
    2. Download the 3 files in the GitHub repository to your local computer and unzip the zip file.
    3. Open the two text files containing SQL statements with WordPad (or another editor of your choice) and replace the schema name in the files (DASH6769) with the current schema name of your DB2 Warehouse on Cloud instance.
    4. Save the modified files.

    The file “Create SQL Views.txt” contains 3 view definitions. One is DASH6769.VW_ELEVATOR_EVENTS_BY_DAY that defines the aggregated temperatures on a per day basis to be used for the dashboard showing motor temperatures:

    CREATE VIEW DASH6769.VW_ELEVATOR_EVENTS_BY_DAY (
        DEVICEID,
        DEVICETYPE,
        MINMOTORTEMP,
        AVGMOTORTEMP,
        MAXMOTORTEMP,
        DATE
    ) AS
    SELECT
        DEVICEID,
        DEVICETYPE,
        ROUND(MIN(MOTORTEMP)) AS MINMOTORTEMP,
        ROUND(AVG(MOTORTEMP)) AS AVGMOTORTEMP,
        ROUND(MAX(MOTORTEMP)) AS MAXMOTORTEMP,
        DATE(substring(TIMESTAMP,1,10)) AS DATE
    FROM DASH6769.ELEVATOR_EVENTS
    GROUP BY DEVICEID, DEVICETYPE, DATE(substring(TIMESTAMP,1,10));

    The key elements of the view definition are the GROUP BY clause and the aggregate functions AVG, MIN and MAX. Notice that the SQL statement also rounds the temperatures to integer values so that the information is better prepared for analyzing e.g. the distribution of temperatures.
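
    The downloaded file also contains the definition of the view VW_ELEVATOR_STATUS used by the first dashboard. Its exact definition may differ, but the filtering idea – keeping only the most recent event per elevator – can be sketched with a correlated subquery:

    -- Sketch only: filter the time series down to the latest event per device.
    -- The actual view definition is in the downloaded file.
    CREATE VIEW DASH6769.VW_ELEVATOR_STATUS AS
    SELECT E.*
    FROM DASH6769.ELEVATOR_EVENTS E
    WHERE E.TIMESTAMP = (
        SELECT MAX(TIMESTAMP)
        FROM DASH6769.ELEVATOR_EVENTS
        WHERE DEVICEID = E.DEVICEID
    );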

    Next, create the views using the IBM DB2 Warehouse on Cloud console:

    1. Select the IBM DB2 Warehouse on Cloud console in your browser.
    2. Select the Run SQL button at the top of the console.
      03.2-dashDB-Run-SQL
    3. Copy the content of the file “Create SQL Views.txt” into the SQL Editor as shown above.
    4. Click Run All.
    5. Check that all SQL statements have succeeded.
      03.3-dashDB-Job-Status
    6. Copy one of the SELECT statements of the file “Query and Delete SQL Views” to the SQL editor and click Run All again.
    7. Validate that the view delivers appropriate results:
      03.4-dashDB-SQL-Query
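
    If you want to write your own check, a simple query like the following (shown as an example – the exact statements are in the downloaded file) lists the daily aggregates in chronological order:

    SELECT DEVICEID, DATE, MINMOTORTEMP, AVGMOTORTEMP, MAXMOTORTEMP
    FROM DASH6769.VW_ELEVATOR_EVENTS_BY_DAY
    ORDER BY DATE, DEVICEID;
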
  5. Create data source connections

    Now that you have successfully created the views with aggregated and filtered data, time has come to connect to the database from within IBM Cognos Analytics and then continue from there. For that purpose you will need the credentials for the IBM DB2 Warehouse on Cloud service that you retrieved in section “Getting Started”.

    Go through the following steps to create the connection:

    1. In the bottom left toolbar, click Manage.
      04.1-Manage-Connections
    2. Select Data server connections.
      04.2-Data-Server-Connections
    3. Select + to create a new connection.
      04.3-Data-Server-Connections
    4. Select dashDB as the type of the connection.
      04.4-Select-Connection-Type
    5. Observe that IBM Cognos Analytics does indeed support quite a variety of other data sources.
    6. In the next window copy and paste the JDBC connection URL from the DB2 Warehouse on Cloud service credentials to the JDBC URL text field:
      04.5-JDBC-Connection-String
    7. For authentication methods, select Use the following sign on as shown in the screenshot above.
    8. Click the + button as indicated to open up the editor where you can enter the credentials.
    9. Copy and paste the user name and the password from the DB2 Warehouse on Cloud service credentials that you saved in section “Getting Started” to the text fields of the dialog below:
      04.6-Credentials
    10. Check that the system gets back saying that the passwords match.
    11. Provide a name for the connection, e.g. “Elevator DB“.
    12. Test the connection by clicking the Test button to the left of the screen.
      04.7-Test-and-Save-Connection
    13. Save the Connection.
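
    The jdbcurl entry in the saved credentials typically has the shape below (the host name is a placeholder – copy the exact value from your own credentials):

    jdbc:db2://<your-dashdb-host>:50000/BLUDB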

    The connection has now been established. However, you will also need to select the schema and load the relevant tables before work on the reporting data modules can start.

    1. Click Manage in the lower left corner again.
    2. Select the Data Server Connection that you just created named “Elevator DB” (see screenshot below).
    3. In the next tab, click the Connection tab.
    4. Select the connection named “Elevator DB”.
      04.8-Manage-Connection
    5. In the next tab, select the Schemas tab.
    6. Select the schema named DASH6769 and invoke load metadata from the popup menu.
    7. When loading of the schema for the database has finished, invoke load options from the popup menu.
      04.9-Load-Tables
    8. Select the Tables tab in the upper part of the screen.
    9. Now make sure that only the tables VW_ELEVATOR_ALERTS, VW_ELEVATOR_EVENTS_BY_DAY and VW_ELEVATOR_STATUS are selected.
    10. Click Load.
    11. Wait until IBM Cognos Analytics gets back with the message that the schemas have been loaded.
  6. Create data modules

    Having defined the views and established the connection to the DB2 Warehouse on Cloud data warehouse, time has come to create the data module (i.e. model) needed for reporting. You can create a data module by fusing together many sources of data, including relational databases, Hadoop-based technologies, Microsoft Excel spreadsheets and text files. Star schemas are the ideal database structure for data modules, but transactional schemas are equally well supported.

    In this and the next section you shall create a data module to report on the following tables:

    1. Elevator Events by Day, which contains aggregated information for each elevator on a day-by-day basis (based on the view VW_ELEVATOR_EVENTS_BY_DAY).
    2. Elevator Alerts, which contains information on the times the elevators were put into maintenance mode (based on the view VW_ELEVATOR_ALERTS).
    3. Elevator Status, which contains information regarding the current state of each elevator (based on the view VW_ELEVATOR_STATUS).
    4. Elevator Locations, which unlike the other 3 tables will be created by importing an Excel file rather than on the basis of a database view (in the next section).

    This part of the recipe will take you through the following steps:

    1. Create the data module and associate it with the Elevator DB server.
    2. Add a table to the data model and refine its definition.
    3. Add join relationships between the tables.

    The instructions for adding and refining the first table will be fairly detailed. For the remaining tables you will just need to repeat the process.

    To create the data module named Elevator Data Model do the following:

    1. Click the + New button in the lower left corner.
      05.1-Create-Data-Module
    2. Select the Data Module menu item.
    3. In the next screen, select Data servers, which will cause IBM Cognos Analytics to list the available servers.
    4. Select the database server named Elevator DB that you created in the previous section.
      05.2-Select-Database
    5. Select the schema named DASH<nnnn>, where <nnnn> is the number of your DB2 Warehouse on Cloud instance.
      05.3-Select-Database
    6. Click the Start button in the left part of the screen to render the data sources available for the database server.
    7. IBM Cognos Analytics will display the data sources as a result.
    8. Click the icon for the Save operation at the top left of the screen (the Disk icon).
      05.6-Save-Model-1
    9. In the next dialog, name the data module “Elevator Data Model”.
    10. Select My content as the folder.
    11. Click Save.

    You have now created the data module. We shall now add the data source for Elevator Events by Day to the model.

    1. Click the triangle in the left part of the screen under the name DASH<nnnn> to expand its definition.
      05.4-Data-Sources
    2. This will show you the tables/views available for creating the data model in the DASH database server.
    3. Drag and drop the data source named Vw Elevator Events by Day to the right part of the screen.
      05.5-Initial-Data-Module
    4. This will show you the query and its individual query items such as Deviceid, Devicetype, Minmotortemp and Date.
    5. Select the Vw Elevator Events by Day table in the Data Module tab and click the 3 dots to the right to open the context menu.
    6. Invoke Rename and delete the Vw prefix from the name (it is not needed).
    7. Save the changes to the model.

    IBM Cognos Analytics has identified that some of the properties are identifiers and that others are measures – as indicated by the symbol 5.7-Measure. You will consequently need to refine the data model before it is ready to be used.

    1. Select the property named Deviceid, click the 3 dots to the right and select Properties from the popup menu.
    2. Check that the property has been correctly classified as an Identifier.
    3. Select the property named Minmotortemp.
      05.8-Query-Item-Properties
    4. Change Usage to Measure and Aggregate to None.
    5. Repeat the same for the properties Avgmotortemp and Maxmotortemp.

    Moreover, you will also need to define a derived computation Event Count that can be used to compute the count of events in the various charts.

    1. Select the property named Deviceid, click the 3 dots to open the Popup Menu.
    2. Select the menu item Create custom calculation.
    3. Name the property “Event Count”.
      05.13-Computation-1
    4. Drag and drop the property Deviceid onto the text field as shown.
    5. Click Validate to check if the expression is well formed.
    6. Click OK to save the changes.
    7. Open the properties for Event Count.
    8. Set Usage to Measure and Aggregate function to Count.
    9. Save the changes to the data model.
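
    The calculation itself can be as simple as a reference to the device id column – the Count aggregate then turns it into a row count. As a sketch (the exact qualifiers depend on the names in your module):

    [Elevator Events by Day].[Deviceid]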

    This completes the data model for Elevator Events by Day. It should now look like this:

    05.15-Elevator-Events-by-Day

    Repeat the process for the view named Vw Elevator Alerts. When you are finished, the table should look like this:

    05.14-Elevator-Alerts


    Then define the Event Count computed item with exactly the same definition as before. Save the Elevator Data Model.

    Follow similar steps for the view named Vw Elevator Status. At the end the table should look like this:

    05.16-Elevator-Status

    For Elevator Status, set the following properties:

    • Deviceid and Devicetype have Usage set to Identifier and Aggregate set to Count.
    • Motortemp, Cabinspeed, Cabintemp and Load have Usage set to Measure and Aggregate set to None.
    • Currentfloor, Direction, Dooropen, Maintenancereason, Maintenancestop, Numberoffloors, State and Timestamp have Usage set to Attribute and Aggregate set to Count.

    For this data source you will not need to define an Event Count custom computation, since we know that there will be exactly one record for each elevator in the system. However, we need two other custom computations:

    • MaxAllowedMotorTemp: an integer constant with the value 200. You can see the definition in the screenshot above.
    • Running: a computation that defines whether an elevator is running or not.

    The computation Running is defined by the expression:

    IF ( [MAINTENANCESTOP] = 1 ) THEN (0) ELSE (1)

    Validate the expression to be certain that it works:

    05.18-Elevator-Status-Running
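
    For comparison, the same logic written in SQL would be a simple CASE expression (for illustration only – in the data module you enter the Cognos expression above):

    CASE WHEN MAINTENANCESTOP = 1 THEN 0 ELSE 1 END AS RUNNING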

    You have now added all 3 database tables needed for defining the dashboards to the data module. There is one last step needed: to define join relationships between the tables.

    1. Refresh the browser screen. The right part of the screen should render the 3 tables as shown below.
      05.19-Elevator-Data-Model-and-Data
    2. You can select Elevator Status and drag the symbol marked above upwards, and IBM Cognos Analytics will show you the values in the table.
    3. Once Elevator Status has been selected, do a right click to get the popup menu.
      05.20-Join-Command
    4. Select the Join menu item.
      05.21-Create-Join-Relationship
    5. In the resulting dialog, select Elevator Alerts as the second table.
    6. Select Deviceid as the attribute to be used for the join.
      05.22-Cardinalities
    7. Click the arrows in the lower part of the dialog and set Join type and Cardinality as shown above.
    8. Click OK to save the definition.
    9. Save the changes to the data module.

    When the dialog closes you will see the relationship between the two tables.

    There is one more data source of relevance: an Excel spreadsheet defining properties such as the location of each elevator, the type of the elevator and the year it was produced. You will upload that Excel spreadsheet next, add it to the data model and create the missing join relationships as shown below.

    05.23-Final-Result


  7. Upload file with elevator locations

    In this section you will upload the Excel spreadsheet and define a table for it. The Excel spreadsheet was part of the GitHub repository that you downloaded at the very beginning of section 3. You can take it from there or from the local download.

    Then continue by following these steps:

    1. Click the Upload File command in the bottom left corner of the screen:
      05.19-Upload-File
    2. In the resulting Open file dialog, navigate to the location of the Excel file, select it and click Open.
    3. In the resulting window view the columns of the Excel file.
      05.20-Uploaded-File
    4. You can in principle discard individual columns should they not be needed by invoking the Discard command in the upper right corner.
    5. However, we will leave the file as it is, so click OK to accept the upload.

    Next you will need to add the new source to the Elevator Data Model and ensure that the added data source is appropriately defined:

    1. Back in the editor for the Elevator Data Model, click the + button in the left corner.
      06.3-Add-File-Model
    2. In the resulting window select Uploaded files as data source type.
      06.4-Add-File-Model
    3. Select the file Elevator Locations.xlsx.
    4. Click the Done button in the left part of the screen.
    5. Drag and drop the new data source to the Data module definitions at the right part of the screen.
      06.5-Elevator-Locations
    6. Rename the data table to “Elevator Locations” (we do not need the .xlsx suffix).
    7. Save the data module.
    8. Expand the new table and select the Year item.
    9. Invoke the Properties menu item from the popup menu.
      06.6-Elevator-Locations-Properties
    10. Set Usage to Attribute and Aggregate to Count. Observe that the import has correctly identified that Year represents a Time dimension.
    11. Select Zipcode and check that the property Represents is set to Geographic Location and Postal Code.
    12. Close the Properties tab by clicking the X button in the upper right corner.
    13. Save the data model.

    To complete the definition, create join relationships between Elevator Locations on the one hand and all 3 database tables on the other.

    1. Select Elevator Locations, either in the Data module tab or in graphical editor.
    2. Invoke Join from the popup menu.
    3. Define the join relationship between Elevator Locations and Elevator Alerts as shown below:
      06.7-Join-Relationship
    4. Repeat the process for the two other database tables.
    5. Save the data module.
  8. Create elevator health dashboard

    You are now – after quite a few preparation steps – ready to define your first dashboard. In many cases the preparation steps are faster than demonstrated in this recipe. It can simply boil down to creating the connection to the data warehouse, creating the data module and then immediately starting reporting using default settings for the data model. However, we have walked through a number of steps that demonstrate a more refined and structured workflow for setting the scene for creating proficient dashboards and reports.

    You will go through the following steps in this section:

    1. Create a dashboard for elevator health using a dashboard template.
    2. Assign the required data source to the dashboard.
    3. Create and name 5 empty visualizations for the dashboard:
      1. A map visualization that shows the geographic location of the elevators.
      2. A heat map visualization showing – for each elevator – the current elevator status, i.e. whether the elevator is up and running or not.
      3. A line and column visualization that shows the current motor temperature for each elevator as well as the maximum allowed temperature.
      4. A pie visualization showing the total number of maintenance stops over time for each elevator.
      5. A clustered column visualization that shows the maintenance stops by date.
    4. Add data model elements to the visualizations and specify properties of the visualizations such as sorting and aggregation.
    5. Test the dashboard.

    To create the dashboard perform the following steps:

    1. Select the +New Button in the bottom left corner of the screen.
    2. Select Dashboard.
      07.1-Dashboard-Template
    3. Select the dashboard template as shown above and click OK.
    4. Select the tab at the top of the screen, then click the Edit the title button (the pencil).
      07.2-Edit-Title
    5. Name the tab “Elevator Health”.
    6. Save the dashboard to the folder My content and give it the name “Elevator Dashboard”.

    To assign data sources to the dashboard perform the following steps next:

    1. Select the icon for the data sources in the toolbar to the left of the screen:
      07.3-Select-Source
    2. Click + to select a new data source.
    3. Select the folder My content, then select the data module named “Elevator Data Model” that you created previously.
    4. Click Open. The following data model should then show up:
      07.5-Data-Sources

    Using Cognos Analytics you can add Media, Shapes, Text and Visualizations to the dashboard as indicated by the icons in the toolbar to the left of the screen.

    Using the last option (rendered as a bar chart) you can now add visualizations to the dashboard as follows:

    1. Select the icon for adding visualizations:
      07.6-Add-Map
    2. Select the Map visualization as shown.
    3. In the map, click the Collapse button at the top right.
      07.7-Inserted-Map
    4. Resize the map using the markers at the corners of the visualization so that the map fills the left half of the screen.
    5. Click on the map. In the popup menu, select the menu item Edit the title.
      07.8-Edit-the-title
    6. Enter “Elevator Locations” for the title.
    7. Save the dashboard.
    8. Now add a heat map, a line and column chart, a pie chart and a clustered column chart to the dashboard.
    9. Resize and rename the visualizations until the dashboard has the following appearance:
      07.10-Dashboard-Outline
    10. Save the dashboard.

    You will now complete the visualizations one by one by associating data model elements with the properties of the visualizations.

    Define the map first:

    1. Select the Sources button in the upper right corner of the toolbar to render the data sources of the dashboard (see screenshot below).
    2. Expand the table Elevator Locations so that the columns of the table are shown.
    3. Select the map that you created. This will change the style of the map so that it renders the icons for resizing the map.
    4. Select the Expand icon (an arrow) in the upper right corner of the map.
      07.11-Geopraphic-map
    5. Drag and drop the column Country to the Regions property of the map.
    6. Collapse the map.
    7. Save the dashboard.

    Next, define the details of the Elevator Status heatmap:

    1. Expand the Elevator Status heatmap that you created.
    2. Expand the table Elevator Status.
    3. Drag and drop the column Deviceid to the X Axis of the heat map.
    4. Drag and drop the column Running to the Y Axis of the heat map.
    5. Drag and drop the column Motortemp to the Color field of the heat map.
      07.12-Elevator-Status-heatmap
    6. Collapse the heat map visualization.
    7. Save the dashboard.

    Continue by defining the details of the Motor Temperature bar and line chart:

    1. Expand the Motor Temperature visualization.
    2. Expand the Elevator Status table if not already done.
    3. Drag and drop the column Deviceid to the X Axis property of the visualization.
    4. Drag and drop MaxAllowedMotorTemp to the Line Value property.
    5. Drag and drop Motortemp to the Column Value property.
      07.13-Motor-Temperature-bar-and-line-chart
    6. Collapse the visualization.
    7. Save the dashboard.

    It is now time to complete the definition of the Maintenance Stops pie chart visualization:

    1. Expand the Maintenance Stops visualization.
    2. Expand the table Elevator Alerts.
    3. Drag and drop the column Deviceid to the Categories property of the visualization.
    4. Drag and drop the column Event Count to the Value property.
      07.14-Maintenance-Stops-pie-chart
    5. Collapse the visualization.
    6. Save the dashboard.

    Finally, provide the semantics of the Maintenance Stops over Time visualization:

    1. Expand the Maintenance Stops over Time visualization.
    2. Expand the table Elevator Alerts.
    3. Drag and drop the column Date to the X Axis property.
    4. Drag and drop the column Deviceid to the Color property.
    5. Drag and drop the column Event Count to the Value property.
      07.15-Maintenance-Stops-over-Time-chart
    6. Select the X Axis.
      07.16-Sort-Column
    7. Select the Sort icon.
      07.17-Sort-Ascending
    8. Select Sort ascending so that the dates are rendered in the correct order with the X axis as the timeline.
    9. Collapse the visualization.
    10. Save the dashboard.

    You have now completed the first part of the dashboard and can test it as a whole. For example, try selecting Elevator01 in the pie chart, and notice how all the other visualizations on the dashboard change their rendering to focus on Elevator01:

    07.18-Test-Map

    To get back to a state where there are no filters, click the filter icon in the visualization, then click the X button in the popup as shown above. This will remove the filter that you set by clicking on the slice in the pie chart. Filtering works for the other visualizations as well, of course.

    Most of the elements of a visualization have properties that can be set, such as sorting, filtering and aggregation. You can also change the color palette and other properties of the visualization. For example, by selecting the Elevator Status visualization and then the Properties button in the toolbar, you get access to the properties of the visualization:

    07.19-Visualization-Properties

    Enable Refresh automatically and keep the Minimum seconds interval as set by default. Do the same for the Motor Temperature visualization. Then save the dashboard.

  9. Create motor temperature dashboard

    You will create the dashboard for the motor temperatures in this section. Since you should now be familiar with using IBM Cognos Analytics to create tabs and visualizations, the instructions will be kept concise.

    As in the previous section, create the layout of the dashboard first:

    1. Click the + Button at the top of the screen to create a new Tab.
      08.01-Add-new-Tab
    2. Select the same template as in the previous section (the one that divides the screen into four equally sized areas).
    3. Click the Use button.
    4. Name the tab Motor Temperatures.
    5. Save the dashboard.
    6. Select the Visualizations icon in the left part of the screen.
      08.2-Visualizations
    7. Insert a clustered bar chart.
    8. Collapse it and resize it to fill the left half of the dashboard.
    9. Insert two Smooth Line Charts and resize them to fit the right half of the screen:
      08.3-Motor-Temperatures-Layout
    10. Name the visualizations as shown above.
    11. Save the dashboard.

    Next, define the detailed semantics of each visualization, as in the previous section:

    1. Expand the Minimum and Maximum Motor Temperatures clustered bar chart.
    2. Select the Sources icon in the toolbar to the left.
    3. Expand the Elevator Events by Day Query.
    4. Drag and drop the column Event Count to the X Axis property of the visualization.
    5. Drag and drop Deviceid to the Color property.
    6. Drag and drop Minmotortemp to the Value property.
    7. Drag and drop Maxmotortemp to the next Value property (which is made available dynamically).
      08.4-Min-and-Max-Motor-Temps

    Notice that if the data is not aggregated correctly, you can change it now. If, for example, Minmotortemp were aggregated with the wrong function, you would simply select the property and change the aggregation using the popup menu (similar to the way you changed the sorting in the previous section).

    Next, continue with the Average Temperature over Time visualization:

    1. Expand the Average Temperature over Time visualization.
    2. Drag and drop Date to the X Axis.
    3. Select the property.
    4. Set Sort ascending for the property using the popup menu.
    5. Drag and drop Deviceid to the Color property.
    6. Drag and drop Avgmotortemp to the Y-Axis.
      08.5-Average-Motor-Temp
    7. Collapse the visualization.
    8. Save the dashboard.

    Repeat the same set of steps for the Maximum Motor Temperature over Time visualization, but use the column Maxmotortemp rather than Avgmotortemp.

    The dashboard should now look like this:

    08.6-Max-Motor-Temp

    You can change the layout at this point. The visualization on the left does not look that good. To change the layout, do the following:

    1. Select the Minimum and Maximum Motor Temperature visualization.
    2. From the popup menu select the last menu item named Change Visualization.
    3. Select Stacked Column.

    The dashboard now changes to:

    08.7-Max-Motor-Temp-New-Viz

    However, aggregation is now performed incorrectly on the data. Simply expand the visualization, change the aggregation of Maxmotortemp to Maximum and the aggregation of Minmotortemp to Minimum, and the chart renders correctly again.
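    Why the wrong aggregation function distorts the chart can be sketched in a few lines of plain Python. The sample temperatures below are hypothetical, standing in for the per-day Maxmotortemp values in the Elevator Events by Day query:

```python
# Hypothetical daily maximum motor temperatures for one elevator.
daily_max_temps = [180.0, 195.5, 210.0]

# A default "Total" (sum) aggregation adds the daily maxima together,
# inflating the bar far beyond any temperature that ever occurred.
wrong = sum(daily_max_temps)   # 585.5

# The "Maximum" aggregation keeps the true peak temperature,
# which is what the stacked column chart should show.
right = max(daily_max_temps)   # 210.0

print(wrong, right)
```

    The same reasoning explains why Minmotortemp needs the Minimum aggregation: summing or averaging minima no longer represents the lowest observed temperature.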

  10. Conclusion

    In this recipe you have gone through some basic steps required to create dashboards for monitoring IoT event data using IBM Cognos Analytics and IBM DB2 Warehouse on Cloud. Some of the steps, such as connecting to the data warehouse and creating the database views and the data modules, require technical skills typically mastered by Business Intelligence professionals. However, IBM Cognos Analytics also offers self-service capabilities so that business users can quickly and confidently create and personalize dashboards and reports on their own. Creating new dashboards with visualizations was indeed straightforward once the data modules had been prepared.

    In this recipe, reporting was essentially done on data at rest, using time series information that was ingested by the Watson IoT Service and stored in IBM DB2 Warehouse on Cloud. This data was at a very detailed level of granularity, in that each elevator generates an event every 5 seconds. The dashboards, however, required just the last set of event data for each elevator to determine its current status, as well as data aggregated on a day-by-day basis. In this recipe we took a fast path by simply prototyping the dashboards using views over the time series data. In real life, however, we would need to optimize the solution in order to save storage. There is no need to store information in a data warehouse if it is not needed for reporting; such information just adds unnecessarily to the amount of data being stored and thus, in a Cloud solution, to the costs of storing and retrieving the data.
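    The kind of day-level aggregation view used in this prototype can be sketched with a small, self-contained example. The snippet below uses an in-memory SQLite database merely as a stand-in for Db2 Warehouse on Cloud, and the table and column names (EVENTS, DEVICEID, EVENTTIME, MOTORTEMP) are illustrative assumptions, not the recipe's actual schema:

```python
import sqlite3

# In-memory SQLite database as a stand-in for Db2 Warehouse on Cloud.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical raw event table; in the recipe, the warehouse receives
# roughly one row every 5 seconds per elevator from the Watson IoT Service.
cur.execute("CREATE TABLE EVENTS (DEVICEID TEXT, EVENTTIME TEXT, MOTORTEMP REAL)")
cur.executemany(
    "INSERT INTO EVENTS VALUES (?, ?, ?)",
    [
        ("Elevator01", "2017-06-01 10:00:00", 150.0),
        ("Elevator01", "2017-06-01 10:00:05", 205.0),
        ("Elevator01", "2017-06-02 09:00:00", 180.0),
    ],
)

# A day-level aggregation view, analogous in spirit to
# VW_ELEVATOR_EVENTS_BY_DAY: one row per device and day, with the
# event count and the minimum, average and maximum motor temperature.
cur.execute("""
    CREATE VIEW VW_ELEVATOR_EVENTS_BY_DAY AS
    SELECT DEVICEID,
           DATE(EVENTTIME) AS EVENTDATE,
           COUNT(*)        AS EVENTCOUNT,
           MIN(MOTORTEMP)  AS MINMOTORTEMP,
           AVG(MOTORTEMP)  AS AVGMOTORTEMP,
           MAX(MOTORTEMP)  AS MAXMOTORTEMP
    FROM EVENTS
    GROUP BY DEVICEID, DATE(EVENTTIME)
""")

for row in cur.execute("SELECT * FROM VW_ELEVATOR_EVENTS_BY_DAY ORDER BY EVENTDATE"):
    print(row)
# ('Elevator01', '2017-06-01', 2, 150.0, 177.5, 205.0)
# ('Elevator01', '2017-06-02', 1, 180.0, 180.0, 180.0)
```

    Because the view recomputes the aggregates over all raw rows on every query, the optimized solution described below replaces it with a table of pre-aggregated daily rows produced by an ETL job.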

    In other recipes we shall therefore look into two aspects in order to optimize the solution shown in this recipe:

    • Extract-Transform-Load (ETL) jobs that read information from the IBM Cloudant NoSQL database once a day, transform the data by calculating the minimum, average and maximum motor temperature for that day, and store the result in a table with aggregated data. The view VW_ELEVATOR_EVENTS_BY_DAY will be changed to take data directly from that table, with no further aggregation needed. Such a solution works very well with the Historic Data Storage Extension of the IBM Watson IoT Service, in particular if the bucket interval is set to 1 day. Once the ETL job has run, the bucket can in fact be archived unless it is needed for other purposes. The recipe ‘Aggregate Elevator event data using IBM Data Science Experience‘ shows how this can be achieved using IBM Data Science Experience and its underlying technology in the form of Apache Spark, Jupyter Notebooks and Python.
    • Streaming Analytics for performing near real-time analytics on the current state of the elevators and on elevator alerts. This requires IoT events to be sent from the Watson IoT Service to the IBM Message Hub server, as well as consumer agents to process the events. The consumer agents get the data from the Message Hub server and store it in the relevant tables. The database view definitions will need to be changed to take the information from the new base tables, but the definitions of the data modules, dashboards and visualizations will remain unchanged. A first step in this direction can be found in the recipe ‘Processing Elevator device data using the Watson IoT Platform and Node.js Consumers‘, which shows how to develop a Node.js consumer that reads event data directly from the Watson IoT Platform and stores it in a dedicated database table containing information about the most recent status of the elevators.
  11. Acknowledgement

    I would like to thank Martin Kramer for providing valuable input on best practices for using Cognos Analytics, as well as for tips and tricks relevant to defining the data modules and the dashboards.
