Analyze Twitter handles and hashtags for sentiment and content  

Create charts and graphs of sentiment, emotional tone, and keywords for Twitter handles and hashtags

By Scott D’Angelo, Werner Vanzy

Description

Organizations are increasingly interested in their social media profile and can derive insights into how they are perceived through analysis and classification. This pattern subscribes to Twitter screen names or hashtags and analyzes the content with Watson Tone Analyzer and Natural Language Understanding (NLU), as well as the Watson Assistant API to classify the intents of the tweets. The enriched metadata is then saved to a Cloudant database, where MapReduce functions are used to provide high-level insight into the data.
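To give a feel for the Cloudant MapReduce piece, here is a minimal sketch of a design document that aggregates emotional-tone scores across stored tweets. It assumes each tweet document carries the Tone Analyzer result under a tone.document_tone.tones array; the database name, design-document name, and field names are illustrative, not the pattern's actual schema.

```javascript
// Illustrative Cloudant design document: aggregate tone scores per tone_id.
// Assumes tweet docs look like:
//   { text, tone: { document_tone: { tones: [{ tone_id, score }] } } }
const designDoc = {
  _id: '_design/analysis',
  views: {
    by_tone: {
      map: `function (doc) {
        if (doc.tone && doc.tone.document_tone && doc.tone.document_tone.tones) {
          doc.tone.document_tone.tones.forEach(function (t) {
            emit(t.tone_id, t.score);
          });
        }
      }`,
      reduce: '_stats' // built-in reducer: sum, count, min, max per tone_id
    }
  }
};

// Hypothetical usage with the @cloudant/cloudant client:
// const db = cloudant.db.use('tweets');
// await db.insert(designDoc);
// const summary = await db.view('analysis', 'by_tone', { group: true });
```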

Overview

In this pattern, our server application subscribes to a Twitter feed that is configured by the user. Each tweet received is analyzed for emotional tone and sentiment. The intent of the tweet is determined by the Watson Assistant service. All data is stored in a Cloudant database, with the opportunity to store historical data as well. The information is presented in a Web UI as a series of graphs and charts.
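The subscription step might look roughly like the sketch below. It is only an outline, assuming the widely used twit npm package and placeholder environment-variable credentials; the pattern's actual server.js may wire this up differently.

```javascript
// Subscribe to a Twitter stream filtered by handles or hashtags (illustrative).
const Twit = require('twit');

const twitter = new Twit({
  consumer_key: process.env.TWITTER_CONSUMER_KEY,
  consumer_secret: process.env.TWITTER_CONSUMER_SECRET,
  access_token: process.env.TWITTER_ACCESS_TOKEN,
  access_token_secret: process.env.TWITTER_ACCESS_TOKEN_SECRET
});

// The track list is user-configured: screen names and/or hashtags to follow.
const stream = twitter.stream('statuses/filter', { track: ['@myhandle', '#myhashtag'] });

stream.on('tweet', (tweet) => {
  // Hand each tweet to the analysis pipeline (Tone Analyzer, NLU, Assistant),
  // then persist the enriched result to Cloudant.
  console.log('Received tweet:', tweet.text);
});
```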

When you complete this pattern, you’ll learn how to:

  • Run an application that monitors a Twitter feed.
  • Send the tweets to Watson Tone Analyzer, Assistant, and Natural Language Understanding for processing and analysis (see the sketch after this list).
  • Store the information in a Cloudant database.
  • Present the information in a Node.js web UI.
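The per-tweet analysis could be expressed along these lines, assuming the ibm-watson Node.js SDK (v5 or later) with IAM authentication; the version dates, API keys, and workspace ID are placeholders, and a serviceUrl may also be required depending on the service region.

```javascript
// Analyze a tweet's text with Tone Analyzer, NLU, and Assistant (illustrative).
const ToneAnalyzerV3 = require('ibm-watson/tone-analyzer/v3');
const NaturalLanguageUnderstandingV1 = require('ibm-watson/natural-language-understanding/v1');
const AssistantV1 = require('ibm-watson/assistant/v1');
const { IamAuthenticator } = require('ibm-watson/auth');

const toneAnalyzer = new ToneAnalyzerV3({
  version: '2017-09-21',
  authenticator: new IamAuthenticator({ apikey: process.env.TONE_ANALYZER_APIKEY })
});
const nlu = new NaturalLanguageUnderstandingV1({
  version: '2021-08-01',
  authenticator: new IamAuthenticator({ apikey: process.env.NLU_APIKEY })
});
const assistant = new AssistantV1({
  version: '2021-06-14',
  authenticator: new IamAuthenticator({ apikey: process.env.ASSISTANT_APIKEY })
});

async function analyzeTweet(text) {
  // Run the three analyses in parallel; each resolves to the service's response.
  const [tone, enrichment, conversation] = await Promise.all([
    toneAnalyzer.tone({ toneInput: { text }, contentType: 'application/json' }),
    nlu.analyze({ text, features: { keywords: {}, entities: {} } }),
    assistant.message({ workspaceId: process.env.ASSISTANT_WORKSPACE_ID, input: { text } })
  ]);
  return {
    tone: tone.result,                    // sentiment and emotional tone
    keywords: enrichment.result.keywords, // NLU keywords
    entities: enrichment.result.entities, // NLU entities
    intents: conversation.result.intents  // Assistant intents
  };
}
```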

Flow

  1. Tweets are pushed out by Twitter.
  2. The Cognitive Social CRM app (server.js) processes the tweet.
  3. The Watson Tone Analyzer Service performs analysis of sentiment and emotional tone.
  4. The Watson Natural Language Understanding Service pulls out keywords and entities.
  5. The Watson Assistant Service extracts the intents (verbs) from the tweets.
  6. Tweets and metadata are stored in Cloudant (a storage sketch follows this list).
  7. The Web UI displays charts and graphs as well as the tweets.
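For step 6, the enriched tweet might be persisted roughly as shown below, assuming the @cloudant/cloudant client with its promises plugin; the database name and document fields are illustrative rather than the pattern's actual schema, and the analysis argument is the kind of merged object produced by the analyzeTweet sketch above.

```javascript
// Store a tweet plus its Watson analysis results in Cloudant (illustrative).
const Cloudant = require('@cloudant/cloudant');

const cloudant = Cloudant({ url: process.env.CLOUDANT_URL, plugins: 'promises' });
const db = cloudant.db.use('tweets');

async function saveEnrichedTweet(tweet, analysis) {
  // `analysis` is the merged output of Tone Analyzer, NLU, and Assistant.
  const doc = {
    tweet_id: tweet.id_str,
    screen_name: tweet.user.screen_name,
    created_at: tweet.created_at,
    text: tweet.text,
    tone: analysis.tone,
    keywords: analysis.keywords,
    entities: analysis.entities,
    intents: analysis.intents
  };
  return db.insert(doc); // the Web UI later reads this data back for its charts
}
```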

Related Blogs

Two “edgy” AI TensorFlow models for you!

The global Call for Code is well underway, and we want to share some visual recognition models that could help you. These AI models can operate on the edge, which could be particularly useful for this year’s theme: disaster preparedness. How could visual recognition help in relief work? From satellite and drone imagery analysis, to classifying...

Continue reading Two “edgy” AI TensorFlow models for you!

Leveraging the power of AI at Unite Berlin

Last week, from June 19 – 21, we were at Unity’s premiere event in Berlin: Unite 2018. This conference brought together Unity’s video game and development community. Unity touches 770 million gamers all over the world, is the market leader for consumer AR and VR use cases, and is also rapidly emerging as the market...

Continue reading Leveraging the power of AI at Unite Berlin

Related Links

Architecture center

Learn how this code pattern fits into the Cognitive Conversation Reference Architecture