Analyze IoT sensor data with machine learning and advanced analytics

This code pattern is part of the Db2 Event Store learning path.

  • Level 100: Achieve real time analytics, IoT, and fast data to gather meaningful insights (blog)
  • Level 101: Understand customer interests with clickstream analysis (code pattern)
  • Level 102: Analyze IoT sensor data with machine learning and advanced analytics (code pattern)
  • Level 103: Stream and store retail order data for analysis (code pattern)
  • Level 104: Stream data with Apache Kafka into the IBM Db2 Event Store (tutorial)


In this code pattern, we’ll use Jupyter notebooks and IBM Streams to load IoT sensor data into IBM Db2 Event Store. From there, we’ll query and analyze the data in Jupyter notebooks with Spark SQL and Matplotlib. Finally, we’ll use the Spark Machine Learning library (MLlib) to create a model that predicts the sensor temperature given the power consumption and ambient temperature.


This code pattern demonstrates the use of Jupyter notebooks to interact with IBM Db2 Event Store — from the creation of database objects to advanced analytics and machine learning model development and deployment.

The sample data used in this code pattern simulates data collected by real industry IoT sensors. The IoT sample data includes sensor temperature, ambient temperature, power consumption, and timestamp for a group of sensors identified with unique sensor IDs and device IDs. A simple IBM Streams flow is used to stream the sample data from a CSV file to an Event Store table.
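To make the shape of that sample data concrete, here is a minimal sketch that generates rows with the fields described above. The column names, ordering, and value ranges are illustrative assumptions, not the pattern's actual CSV schema.

```python
import csv
import random
import time

# Field names mirror the fields described above; the exact header names and
# ordering in the pattern's CSV file are assumptions for illustration.
FIELDS = ["deviceID", "sensorID", "ts", "ambient_temp", "power", "temperature"]

def generate_sample_rows(n_rows, n_devices=3, sensors_per_device=10, seed=42):
    """Generate rows shaped like the simulated IoT sensor data."""
    rng = random.Random(seed)
    now_ms = int(time.time() * 1000)
    rows = []
    for i in range(n_rows):
        ambient = rng.uniform(15.0, 35.0)   # ambient temperature
        power = rng.uniform(5.0, 15.0)      # power consumption
        # Sensor temperature loosely tracks ambient temperature and power draw.
        temperature = ambient + 0.5 * power + rng.gauss(0, 1)
        rows.append([
            rng.randint(1, n_devices),
            rng.randint(1, n_devices * sensors_per_device),
            now_ms + i * 1000,              # one reading per second
            round(ambient, 2),
            round(power, 2),
            round(temperature, 2),
        ])
    return rows

def write_csv(path, rows):
    """Write the rows to a CSV file like the one IBM Streams reads from."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS)
        writer.writerows(rows)
```

A file produced this way can stand in for the pattern's sample CSV when experimenting with the Streams flow locally.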

Db2 Event Store is an in-memory database designed for massive structured data volumes and real-time analytics, built on Apache Spark and the Apache Parquet data format. The solution is optimized for event-driven data processing and analysis, and it can support emerging event-driven applications such as IoT solutions, payments, logistics, and web commerce. It’s flexible and scalable, and it can adapt quickly to your changing business needs over time.

After completing this code pattern, you’ll understand how to:

  • Interact with Db2 Event Store using Python and a Jupyter notebook.
  • Use IBM Streams to feed data into Db2 Event Store.
  • Visualize data using Matplotlib charts.
  • Build and test a machine learning model.
  • Deploy and use the model with Watson Machine Learning.
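As a taste of the visualization step, the sketch below plots synthetic readings in the style of the notebook's Matplotlib charts. The data, axis labels, and output filename are illustrative assumptions; the notebook renders charts inline rather than saving them to a file.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; a notebook would display inline
import matplotlib.pyplot as plt
import random

# Synthetic stand-in for the sensor readings plotted in the notebook.
rng = random.Random(0)
power = [rng.uniform(5, 15) for _ in range(200)]
temperature = [20 + 0.5 * p + rng.gauss(0, 1) for p in power]

fig, ax = plt.subplots(figsize=(6, 4))
ax.scatter(power, temperature, s=8, alpha=0.6)
ax.set_xlabel("Power consumption")
ax.set_ylabel("Sensor temperature")
ax.set_title("Sensor temperature vs. power consumption")
fig.savefig("temp_vs_power.png", dpi=100)
```

A scatter plot like this makes the roughly linear relationship between power consumption and temperature visible before any model is fit.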



  1. Create the Db2 Event Store database and table.
  2. Feed the sample IoT data set into Db2 Event Store.
  3. Query the table using Spark SQL.
  4. Analyze the data with Matplotlib charts.
  5. Create and deploy a machine learning model.
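The query step above does not require a Spark cluster to understand. The sketch below uses Python's built-in sqlite3 module as a stand-in for Spark SQL; the table name (IOT_TEMP), column names, and sample values are illustrative assumptions, not the pattern's actual schema.

```python
import sqlite3

# Stand-in for the Event Store table; schema and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IOT_TEMP (deviceID INTEGER, sensorID INTEGER, ts INTEGER,"
    " ambient_temp REAL, power REAL, temperature REAL)"
)
conn.executemany(
    "INSERT INTO IOT_TEMP VALUES (?, ?, ?, ?, ?, ?)",
    [
        (1, 10, 1000, 20.0, 8.0, 24.1),
        (1, 11, 2000, 21.0, 9.0, 25.7),
        (2, 20, 3000, 25.0, 12.0, 31.2),
    ],
)

# The same kind of aggregation the notebook would issue through Spark SQL:
# average sensor temperature per device.
query = """
    SELECT deviceID, AVG(temperature) AS avg_temp
    FROM IOT_TEMP
    GROUP BY deviceID
    ORDER BY deviceID
"""
for device_id, avg_temp in conn.execute(query):
    print(device_id, round(avg_temp, 2))  # prints 1 24.9, then 2 31.2
```

In the notebook, the same SQL would be submitted against the Event Store table through a Spark session instead of sqlite3.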


Find the detailed steps for this pattern in the readme file. The steps will show you how to:

  1. Clone the repo.
  2. Install IBM Db2 Event Store and IBM Streams.
  3. Create an IBM Db2 Event Store database and table.
  4. Stream the IoT data into Event Store.
  5. Query the table.
  6. Analyze the data.
  7. Create and deploy a machine learning model.
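The modeling idea behind step 7 can be sketched without Spark MLlib: the pattern fits a linear model that predicts sensor temperature from power consumption and ambient temperature. Below is a plain-Python ordinary least squares fit on synthetic data as a stand-in; the feature names, data, and true coefficients are assumptions for illustration.

```python
import random

def fit_linear_model(X, y):
    """Ordinary least squares for y ~ b0 + b1*x1 + b2*x2 via the normal
    equations (A^T A) w = A^T y, solved with Gaussian elimination."""
    A = [[1.0] + list(row) for row in X]  # prepend an intercept column
    n, k = len(A), len(A[0])
    # Build the k x k normal-equation system.
    M = [[sum(A[r][i] * A[r][j] for r in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(A[r][i] * y[r] for r in range(n)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, k):
            f = M[r][col] / M[col][col]
            for c in range(col, k):
                M[r][c] -= f * M[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    w = [0.0] * k
    for i in reversed(range(k)):
        w[i] = (b[i] - sum(M[i][j] * w[j] for j in range(i + 1, k))) / M[i][i]
    return w  # [intercept, power coefficient, ambient coefficient]

def predict(w, power, ambient):
    return w[0] + w[1] * power + w[2] * ambient

# Synthetic training data with a known relationship:
# temperature = 2 + 0.5 * power + 1.0 * ambient + noise
rng = random.Random(1)
X, y = [], []
for _ in range(500):
    p, a = rng.uniform(5, 15), rng.uniform(15, 35)
    X.append((p, a))
    y.append(2 + 0.5 * p + 1.0 * a + rng.gauss(0, 0.1))

w = fit_linear_model(X, y)
print([round(v, 2) for v in w])  # recovers roughly [2, 0.5, 1.0]
```

In the pattern itself, an equivalent model is trained with Spark MLlib on the Event Store data and then deployed with Watson Machine Learning.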

To run Db2 Event Store in Cloud Pak for Data, please follow the steps in the Cloud Pak readme file.


This code pattern showed you how to create a temperature prediction model with Spark MLlib, Db2 Event Store, and Jupyter notebooks. The code pattern is part of the Db2 Event Store learning path series. To continue the series and learn about more Db2 Event Store features, take a look at the next code pattern, Stream and store retail order data for analysis.