Analyze streaming data with Python and Cloud Pak for Data

By Natasha D'Silva, posted Thu October 08, 2020 09:58 PM

Imagine you are a developer at an energy company and need to continuously ingest and analyze data from thousands of sensors in the field. You want to compute statistics for each sensor, such as a rolling average, and then store the results or analyze them further by applying a machine learning model.

Streams in Cloud Pak for Data is well suited for these tasks, and you can create such an application in Python using the Streams Python API.
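Here is a minimal sketch of such an application built with the streamsx package that implements the Streams Python API. The simulated source, field names, and window sizes are illustrative assumptions, and per-sensor grouping is omitted for brevity; submitting the application from a notebook is sketched below.

```python
# A minimal sketch of a rolling-average application with the streamsx package.
# The simulated source, field names, and window sizes are assumptions.
import random
import time

from streamsx.topology.topology import Topology

def sensor_readings():
    """Simulate an endless feed of readings from many sensors."""
    while True:
        time.sleep(0.1)
        yield {"sensor_id": random.randint(1, 1000),
               "value": random.gauss(50.0, 5.0)}

def rolling_average(readings):
    """Average the values in one window of readings."""
    values = [r["value"] for r in readings]
    return {"count": len(values), "avg": sum(values) / len(values)}

topo = Topology("SensorAverages")
readings = topo.source(sensor_readings)

# Keep the last 50 tuples and recompute the average after every 10 new tuples.
averages = readings.last(50).trigger(10).aggregate(rolling_average)
averages.print()
```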

You can create and deploy such an application from any development environment, including Jupyter Notebooks.
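For example, a notebook in a Cloud Pak for Data project can retrieve the connection details of a Streams instance with the icpd_core utility and submit the topology to it. The following is a hedged sketch; the instance name is a placeholder.

```python
# A sketch of submitting the topology from a notebook in a Cloud Pak for Data
# project. "my-streams-instance" is a placeholder for your instance name.
from icpd_core import icpd_util
from streamsx.topology import context
from streamsx.topology.context import ConfigParams

# instance_type may not be required on older Cloud Pak for Data versions.
cfg = icpd_util.get_service_instance_details(name="my-streams-instance",
                                             instance_type="streams")
cfg[ConfigParams.SSL_VERIFY] = False  # if the instance uses a self-signed certificate

# Submit the Topology built earlier to the Streams instance.
submission_result = context.submit("DISTRIBUTED", topo, config=cfg)
print("Submitted job:", submission_result.job.id)
```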

The following video walks through creating such an application from a Jupyter notebook in a Cloud Pak for Data project.
It demonstrates:

  • importing the sample notebook
  • creating the application and deploying it
  • accessing the live result feed from the notebook (a sketch of this step follows the list)
  • additional ways to monitor the application
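
For the live result feed, the pattern is to declare a view on a stream before submitting the job and then display it from the notebook. A rough sketch, continuing from the earlier example and using an assumed view name:

```python
# A sketch of viewing live results from the notebook. Declare the view on the
# stream before submission; the view name is an assumption.
averages_view = averages.view(name="RollingAverages",
                              description="Rolling averages per window")

# After the job is submitted, display a periodically refreshed sample of tuples.
averages_view.display(duration=30)
```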

Streaming analytics with Python in Cloud Pak for Data

Try the sample

Download the sample notebook from GitHub

Next Steps

After you run the introductory sample, try these additional tutorials for using the Python API in Cloud Pak for Data.

Ingest streaming data from IBM Event Streams or Apache Kafka

The introductory sample used simulated data, but you can also ingest data into Streams from IBM Event Streams or Apache Kafka.
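
As a rough sketch, assuming the streamsx.kafka package and placeholder broker and topic names, subscribing to a Kafka topic could look like this:

```python
# A hedged sketch of ingesting from Apache Kafka with the streamsx.kafka
# package. The broker address and topic name are placeholders.
from streamsx.topology.topology import Topology
from streamsx.topology.schema import CommonSchema
import streamsx.kafka as kafka

topo = Topology("KafkaIngest")
consumer_props = {"bootstrap.servers": "kafka-broker:9092"}

# Each Kafka message arrives on the stream as a JSON tuple.
readings = kafka.subscribe(topo, topic="sensor-readings",
                           kafka_properties=consumer_props,
                           schema=CommonSchema.Json)
readings.print()
```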


Score a model from Watson Machine Learning on streaming data

You can extract additional insights by scoring a Watson Machine Learning model on the streaming data.
Try the notebook to see how this is done.
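
As a rough illustration of the pattern, each tuple can be mapped through a callable that calls a Watson Machine Learning deployment. All credentials, IDs, and field names below are placeholders, and the client calls reflect the ibm_watson_machine_learning package rather than the notebook's exact code.

```python
# A hedged sketch of scoring each tuple against a WML deployment. The client
# is created when the operator starts on the Streams instance (in __enter__).
from ibm_watson_machine_learning import APIClient

class ScoreModel:
    def __init__(self, credentials, space_id, deployment_id):
        self.credentials = credentials
        self.space_id = space_id
        self.deployment_id = deployment_id
        self.client = None

    def __enter__(self):
        self.client = APIClient(self.credentials)
        self.client.set.default_space(self.space_id)

    def __exit__(self, exc_type, exc_value, traceback):
        pass

    def __call__(self, tpl):
        payload = {self.client.deployments.ScoringMetaNames.INPUT_DATA:
                   [{"fields": ["avg"], "values": [[tpl["avg"]]]}]}
        result = self.client.deployments.score(self.deployment_id, payload)
        tpl["prediction"] = result["predictions"][0]["values"][0]
        return tpl

# Applied to the stream of averages from the earlier sketch:
# scored = averages.map(ScoreModel(wml_credentials, space_id, deployment_id))
```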


Useful Links

See other sample notebooks
Streams Python API development guide

#CloudPakforDataGroup
#Python
#streaming-analytics
#Streams