(by Scott Chapman)

Recently I’ve been doing a lot of work with IBM Cloud Functions to show how easy it is to build Watson Work applications without ever having to worry about how you are going to run them in the cloud. Truth be told, I’ve also been working on an IBM Cloud Functions framework that simplifies much of the Watson Work application development. With everything in place, I wanted to demonstrate how easy it is to leverage the cognitive capabilities of Watson Work in a Serverless architecture.

For my cognitive application I decided to implement a Weather-bot. The idea is pretty simple: provide accurate, detailed weather information when the user wants it. To do this we need to:

  • Understand when the user wants to know about the weather (intent recognition).
  • Understand the location the user is interested in (location entity recognition).
  • Optionally understand the time frame the user is interested in (date entity recognition).

I’ve made this code available as a Sample in my IBM Cloud Function Template for Watson Work.

What are you saying?

So we can’t do anything cognitive unless we understand what the user wants. To get an idea of what the user is talking about, we can leverage Watson Work’s ability to integrate with Watson Assistant. With this integration we can teach Watson to recognize what we are looking for. Let me explain a little bit about what is going on here…

With Watson Assistant you can teach Watson how to recognize intents (what users are saying) by providing some training data (examples of those intents). So for this example I have a single intent, #weather, and a bunch of different examples of how a user might express that they want to know about the weather. While it is important that you provide realistic examples, you don’t need to exhaustively define all possible ways to request weather data. This is because Watson Assistant will generalize and learn from your examples.
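To give a concrete feel for that training data, here is a small, illustrative set of example utterances for the #weather intent (these utterances are placeholders of my own, not the sample’s actual training set):

```typescript
// Hypothetical training examples for the #weather intent.
// Watson Assistant generalizes from a handful of these; it does not need
// every possible phrasing spelled out.
const weatherIntentExamples: string[] = [
  "What's the weather like in Boston?",
  "Will it rain in Austin tomorrow?",
  "Give me the forecast for London this weekend",
  "Is it going to snow in Denver next week?",
  "How warm will it be in Rome on Friday?",
];
```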

Watson Work has the concept of Focuses. Focuses are messages that are of significance to the user, and they allow the user to focus on them. For example, out of the box Watson Work recognizes Questions, Actions, and Commitments as Focuses. Focuses manifest in the Moments view of Watson Work in an effort to provide a concise summary of significant messages within a larger cluster of messages. As a developer you can extend what Watson Work recognizes as significant messages by making your Watson Work app Cognitive. By registering your Watson Assistant, trained to recognize requests for weather information, with your Watson Work application, Watson Work will “understand” those requests in messages sent by users. They will manifest as Focuses, and also be visible in Moments.

That’s super cool, and there is no limit to the kinds of information you can recognize in text messages! But there’s more…

When an intent is recognized you often need to know some details about it. In our example we need to know if the user provided a location (we can’t get a forecast without knowing where!), and whether the user wants current conditions or a forecast for some time in the future. Watson Work will provide these details in the form of Entities. You will see in my example that I have asked Watson Assistant to include the @sys-location and @sys-date system entities.

Putting this all together, Watson Work will attach Focus annotations to weather requests along with location and date entities (including ranges like “next week” and “this weekend”) as extracted information. You can see the structure of that annotation in the documentation. All our application needs to do is listen for message-annotation-added events, check whether the annotation is a focus, verify that it is our #weather focus and that the necessary entities exist (location required, date/date range optional), and generate a response back to the space. The resulting flow looks like this:

forecast sequence
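As a rough sketch of that flow (not the sample’s actual code), a Cloud Functions action handling the webhook might look something like this. The field names annotationType, annotationPayload, lens, and extractedInfo reflect my reading of the Watson Work webhook documentation, so treat them as assumptions and check the docs for the exact structure:

```typescript
// Minimal sketch of a Cloud Functions action reacting to weather focuses.
// Field names here are assumptions based on the Watson Work webhook docs.
interface WebhookEvent {
  type: string;              // e.g. "message-annotation-added"
  annotationType: string;    // e.g. "message-focus"
  annotationPayload: string; // JSON string: lens, phrase, extractedInfo, ...
  spaceId: string;
}

function main(event: WebhookEvent): object {
  // Only react to newly added focus annotations.
  if (event.type !== 'message-annotation-added') return {};
  if (event.annotationType !== 'message-focus') return {};

  const payload = JSON.parse(event.annotationPayload);

  // The lens name is an assumption; it should match the focus you registered.
  if (payload.lens !== 'WeatherForecast') return {};

  // Pull out the entities Watson Assistant extracted for us.
  const entities: Array<{ type: string; text: string }> =
    (payload.extractedInfo && payload.extractedInfo.entities) || [];

  const location = entities.find(e => e.type === 'sys-location');
  const date = entities.find(e => e.type === 'sys-date');

  // Hand off to the response logic sketched further below.
  return { spaceId: event.spaceId, location, date };
}
```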

You can see this manifested in the only code that I needed to write for this application. What that code does is:

  1. Make sure the focus is a #WeatherForecast focus.
  2. Validate that it has credentials for the Weather Service.
  3. Look for Location Entities.
  4. Look for Date Entities.
  5. If no location, then respond that we need a location in order to provide weather information.
  6. If no date, then provide current conditions.
  7. If there is a date, get a weather forecast and respond with dates within the range requested.
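A condensed sketch of that decision logic could look like the following; getCurrentConditions, getForecastForRange, and postToSpace are hypothetical helpers standing in for the weather service and Watson Work calls, not functions from the sample:

```typescript
// Hypothetical helpers standing in for the weather service and
// Watson Work Services calls; these are not the sample's real APIs.
declare function getCurrentConditions(location: string): Promise<string>;
declare function getForecastForRange(location: string, date: string): Promise<string>;
declare function postToSpace(spaceId: string, text: string): Promise<void>;

async function respondToWeatherFocus(
  spaceId: string,
  location?: { text: string },
  date?: { text: string },
): Promise<void> {
  // Step 5: without a location we cannot look anything up.
  if (!location) {
    await postToSpace(spaceId, 'Please tell me which location you want the weather for.');
    return;
  }

  // Step 6: no date means "right now", so return current conditions.
  if (!date) {
    await postToSpace(spaceId, await getCurrentConditions(location.text));
    return;
  }

  // Step 7: a date or date range means a forecast filtered to that range.
  await postToSpace(spaceId, await getForecastForRange(location.text, date.text));
}
```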

That’s it! Much of the additional code is there to use the weather service geoLocation to map the requested location to latitude/longitude coordinates for the forecast, to determine the date range overlap, and to validate input.
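To give a feel for the date-range-overlap piece, here is a small, generic sketch (again, not the sample’s code) of checking whether a forecast day falls inside the requested range:

```typescript
// Returns true when a forecast day falls within the requested range.
// A single requested date is treated as a one-day range.
function inRequestedRange(forecastDay: Date, rangeStart: Date, rangeEnd?: Date): boolean {
  const end = rangeEnd ?? new Date(rangeStart.getTime() + 24 * 60 * 60 * 1000);
  return forecastDay >= rangeStart && forecastDay < end;
}

// Example: keep only the forecast entries for the requested range.
// forecastDays.filter(d => inRequestedRange(d.date, rangeStart, rangeEnd));
```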

This post was originally published in Scott’s blog at https://medium.com/@scottedchapman.
