Introduction to Streams Flows in Watson Studio

Watson Studio Streams Flows is a web-based IDE for quickly creating streaming analytics applications. Applications are created in the browser and run on IBM Cloud. This post is a collection of short videos and articles that introduce you to the canvas and show you how to create and monitor an application.

Table of Contents

Watson Studio Streams Flows overview

Why should you use Watson Studio Streams Flows? This video provides an overview as well as an introduction to the canvas.


Sign up for a free trial

Streams Flows is part of IBM Watson Studio, so you’ll first need to create an account or log in. Watch this short video to see how to sign up for a free trial of IBM Watson Studio.

Set up the IBM Cloud Services

IBM Watson Studio Streams Flows runs on the Streaming Analytics service in IBM Cloud, so after creating an account, follow along in this video to set up the required services.

Now that your setup is complete, a great way to try out Streams Flows is by running an example.

Create a Data Historian Example Flow

In this video, you will learn how you can deploy the Data Historian example flow that is available in Watson Studio Streams Flows. This flow ingests data from simulated weather stations and uses the Aggregation operator in Watson Studio Streams Flows to compute statistics like average temperature and humidity.


Learn more about this example

Monitor the Running Flow

The next video in the series demonstrates how to monitor a running application using the Metrics page. You can observe the application’s performance, watch the data as it moves between operators, and download application logs.

Create Your Own Streams Flow


After running an example flow and learning how to interact with a running flow, you’re now ready to create your own applications.

Create a Streams Flow with the Canvas

Extend the Data Historian Example to Use Event Streams as a Data Source

You’ve successfully run a flow in Watson Studio Streams Flows, so you’re probably ready to start creating your own applications. Logically, the first step in creating your own flow is connecting to a data source. Currently, the supported data sources are the Watson IoT Platform and Event Streams (formerly Message Hub), so the next step is learning how to send data to one of those services.

Follow along in this notebook to see how to modify the Data Historian flow to use data from Event Streams, IBM’s Apache Kafka offering. You will learn how to:

1. Send data to Event Streams using Python.
2. Ingest and analyze that data in a Streams flow.
3. Send results from the flow back to Event Streams.
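As a taste of step 1, here is a minimal sketch of producing weather events to a Kafka topic with the `kafka-python` client. The field names, topic name, and `make_weather_event` helper are illustrative assumptions, not the notebook’s exact code; the notebook has the full, credential-aware version for Event Streams.

```python
import json
import time

def make_weather_event(station_id, temperature, humidity):
    """Build a JSON-encoded reading, loosely mirroring the simulated
    weather-station data used by the Data Historian example.
    (Hypothetical schema for illustration.)"""
    return json.dumps({
        "id": station_id,
        "ts": int(time.time() * 1000),  # event time in milliseconds
        "temperature": temperature,
        "humidity": humidity,
    })

def send_events(producer, topic, events):
    """Send each JSON event to the given topic.

    `producer` is any object with a Kafka-style send(topic, value)
    method, e.g. kafka.KafkaProducer configured with your Event
    Streams bootstrap servers and SASL credentials."""
    for event in events:
        producer.send(topic, value=event.encode("utf-8"))
    producer.flush()  # block until all buffered records are delivered
```

In a real run you would construct `kafka.KafkaProducer(bootstrap_servers=..., security_protocol="SASL_SSL", ...)` with the credentials from your Event Streams service instance and pass it as `producer`.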

Open the notebook in Watson Studio (formerly Data Science Experience), and after logging in, click Copy to import the notebook into a project.

Use data from IoT devices with the WatsonIoT operator

Another common data source is the Internet of Things (IoT): device data is ingested in Watson Studio Streams Flows using the WatsonIoT operator.
Watch this video to learn how to use it.

Download the complete application from GitHub

Computing moving averages and running totals with the Aggregation operator

You may have noticed in the example flow that the Aggregation operator was used to compute general statistics like averages, max/min, totals, and so on. Learn more about the Aggregation operator and how to use it in this post.
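To make the idea concrete, here is a plain-Python sketch of the kind of computation the Aggregation operator performs over a count-based sliding window: keep the last `size` tuples and report their running total and moving average. This is an illustration of the concept, not the operator’s implementation.

```python
from collections import deque

class SlidingAggregator:
    """Maintain a sliding window of the most recent `size` values and
    report aggregate statistics over that window."""

    def __init__(self, size):
        # deque with maxlen automatically evicts the oldest value
        # once the window is full
        self.window = deque(maxlen=size)

    def add(self, value):
        """Add a new value and return the current window aggregates."""
        self.window.append(value)
        total = sum(self.window)
        return {"total": total, "average": total / len(self.window)}
```

For example, with a window of 3, feeding in temperatures 10, 20, 30, 40 yields averages 10, 15, 20, and then 30, since the value 10 falls out of the window when 40 arrives.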

Add custom code using the Python operator

Your application might require custom logic for tuple processing, or you might want to connect to a database that isn’t currently supported as a source or target operator, such as Cassandra.
You can do this with the Python code operator, and this video shows how.
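As a sketch of what such custom code can look like, the example below assumes the common `init(state)` / `process(event, state)` shape, where `state` is a dictionary that persists across tuples and returning `None` drops a tuple. Treat the exact entry-point names as an assumption to check against your release’s operator template; the field names are illustrative.

```python
def init(state):
    # Called once before the first tuple arrives; `state` persists
    # across all subsequent calls to process().
    state["count"] = 0

def process(event, state):
    # Called once per incoming tuple. Return the (possibly modified)
    # event to submit it downstream, or None to drop it.
    state["count"] += 1
    if event.get("temperature") is None:
        return None  # drop malformed readings
    # Enrich the tuple: add a Fahrenheit reading and a sequence number.
    event["temperature_f"] = event["temperature"] * 9 / 5 + 32
    event["seq"] = state["count"]
    return event
```

The same pattern (open a connection in `init`, use it per-tuple in `process`) is how you would talk to an unsupported database such as Cassandra from inside the operator.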

Download the complete application from GitHub

Useful Links

We are still working on more content, so stay tuned!
