ESPN and IBM have teamed up to bring fantasy football team owners a new level of insight that correlates millions of news articles with traditional football statistics. Watson is built on an enterprise-grade machine learning pipeline that reads and understands millions of documents and multimedia sources about fantasy football. The ESPN Fantasy Football with Watson system has been a significant undertaking with many components.

This article is the seventh in an eight-part series that takes you behind each component to show you how we used Watson to build a fair, world-class AI solution.

AI Fantasy Football deep visualization

Fantasy Football analysts, editors, and producers at ESPN use deep visualizations to create content that gives you insights for your team. The combination of human intelligence and machine intelligence from Watson provides unprecedented Fantasy Football advantages to help you win. Watson offers visualizations that show tradeoffs and ordinal rankings for each player, week over week. The evidence is derived from the context of millions of news, podcast, and video artifacts. IBM Watson insights can be found across content on ESPN.com, the ESPN Fantasy App, and The Fantasy Show with Matthew Berry. These visual, evidence-based foundations become comprehensive opinion pieces that help you select your roster spots. #WinWithWatson

The Watson machine learning pipeline reads and understands over 2 million news articles, videos, and podcasts per day. Over the first 9 weeks of the football season, Fantasy Football with Watson has provided over 12.2 billion insights for the millions of ESPN Fantasy players. This volume and variety of data is processed by three Python applications that are deployed on IBM Cloud as Cloud Foundry apps, each of which handles a specific modality of content, such as news. The insights that are produced are inserted into a highly available Db2 database on IBM Cloud. A Representational State Transfer (REST) service on a Node.js application is then called to begin the process of creating data for the visualizations.

Figure: Visual representation overview
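
To make the hand-off concrete, the following minimal sketch shows how one of the Python content applications might store an insight in Db2 and then notify the Node.js content generation service. It is an illustration only: the table, columns, DSN, and endpoint URL are placeholders, not the production schema.

# Hypothetical sketch: the table, columns, DSN, and endpoint URL are placeholders.
import ibm_db
import requests

DB2_DSN = "DATABASE=BLUDB;HOSTNAME=<host>;PORT=50000;PROTOCOL=TCPIP;UID=<user>;PWD=<password>;"
CONTENT_GEN_URL = "https://<content-generation-app>/analyticsUpdate"

def store_insight_and_notify(player_id, source_type, score):
    """Insert one AI insight into Db2, then ask the Node.js application to refresh."""
    conn = ibm_db.connect(DB2_DSN, "", "")
    try:
        stmt = ibm_db.prepare(
            conn,
            "INSERT INTO PLAYER_INSIGHTS (PLAYER_ID, SOURCE_TYPE, SCORE, CREATED_AT) "
            "VALUES (?, ?, ?, CURRENT TIMESTAMP)")
        ibm_db.execute(stmt, (player_id, source_type, score))
    finally:
        ibm_db.close(conn)

    # Trigger the downstream aggregation through the protected REST endpoint.
    response = requests.get(CONTENT_GEN_URL,
                            headers={"authorization": "<auth token>"},
                            timeout=30)
    response.raise_for_status()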

When a Python job runs to update the evidence for a player, the Node.js application is called. In addition, a cron job runs another IBM Cloud Function that calls the Node.js content generation application to update the dashboard data. The content generation application executes an IBM Cloud Function with a GET request. The analytics update action is an IBM Cloud Function that runs Python logic using pandas and NumPy to aggregate the AI insights into a viewable Db2 table. That Db2 table is the data source for the visualizations presented in Cognos Dashboards, which have a live connection to their data sources to provide users with timely data.
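
The production aggregation logic is not published, but a minimal sketch of an analytics_update-style Python action might look like the following. The table names, columns, and statistics are assumptions used only to illustrate the pandas and NumPy aggregation pattern.

# Hedged sketch: PLAYER_INSIGHTS, PLAYER_INSIGHT_SUMMARY, and their columns are
# illustrative names, not the production schema.
import ibm_db_dbi
import numpy as np
import pandas as pd

def main(params):
    conn = ibm_db_dbi.connect(params["db2_dsn"], "", "")  # DSN passed as an action parameter
    try:
        cur = conn.cursor()

        # Pull the raw AI insights written by the Python content applications.
        cur.execute("SELECT PLAYER_ID, SCORE FROM PLAYER_INSIGHTS")
        raw = pd.DataFrame(cur.fetchall(), columns=["PLAYER_ID", "SCORE"])

        # Aggregate per player into the shape the dashboards expect.
        summary = (raw.groupby("PLAYER_ID")["SCORE"]
                      .agg(MEAN_SCORE=np.mean, SCORE_STDDEV=np.std, EVIDENCE_COUNT="count")
                      .reset_index())

        # Refresh the viewable table that the Cognos Dashboards read from.
        cur.execute("DELETE FROM PLAYER_INSIGHT_SUMMARY")
        cur.executemany(
            "INSERT INTO PLAYER_INSIGHT_SUMMARY "
            "(PLAYER_ID, MEAN_SCORE, SCORE_STDDEV, EVIDENCE_COUNT) VALUES (?, ?, ?, ?)",
            [(r.PLAYER_ID, float(r.MEAN_SCORE), float(r.SCORE_STDDEV), int(r.EVIDENCE_COUNT))
             for r in summary.itertuples(index=False)])
        conn.commit()
        return {"players_aggregated": int(len(summary))}
    finally:
        conn.close()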

ESPN Fantasy Football with Watson dashboards are designed in Watson Studio. The visual displays are customized to include risk versus reward, boom versus bust, high and low trends, season summaries, and player comparisons for each player in ESPN's top 350 weekly rankings. The data source and the Cognos Dashboard Embedded server are associated with each new visualization page. Each page is included in an iframe on a Node.js application, and access to the visualization pages is controlled by federated IBM internal and external identity management.

Throughout the week, ESPN analysts and editorial staff use the visualizations to create stories. For example, the article “Fantasy football insights with Watson for Week 12” provides a synopsis of defense decisions as well as the players most likely to boom and least likely to bust. The combination of the ESPN analyst's domain knowledge with the insights from ESPN Fantasy Football with Watson provides additional evidence around each quantitative prediction.

Figure: Weekly view

Each Functions as a Service (FaaS) action on IBM Cloud runs its code in response to incoming events. The function that computes derived information from the AI insights takes 1.11 minutes to execute, while the lightweight Node.js function has a 439 ms activation time. Over a span of 32 hours, the functions were called 100 times. A 6-hour period each day blocks any function calls while every player is being updated for the following day.
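
The article does not say where that blackout is enforced, but one simple way to implement such a guard is a time check at the top of the Python action, as in this hypothetical sketch (the window hours are made up):

from datetime import datetime, timezone

# Hypothetical blackout window in UTC; the real schedule is not published.
BLACKOUT_START_HOUR = 2
BLACKOUT_END_HOUR = 8

def in_blackout_window(now=None):
    """Return True while the nightly player refresh is assumed to be running."""
    now = now or datetime.now(timezone.utc)
    return BLACKOUT_START_HOUR <= now.hour < BLACKOUT_END_HOUR

def main(params):
    if in_blackout_window():
        # Skip the aggregation while player data is being rebuilt for the next day.
        return {"skipped": True, "reason": "player update window"}
    # ... normal analytics aggregation runs here ...
    return {"skipped": False}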

To create the analytics function with the IBM Cloud Command Line Interface (CLI), first create an action.

bx wsk action create analytics_update Archive.zip --kind python-jessie:3

The action can be viewed in the functions console on IBM Cloud.

Figure: Actions in the IBM Cloud Functions console

Next, a trigger can be created to run a specific function on a schedule. We created a cron job that runs every hour.

bx wsk trigger create hourly --feed /whisk.system/alarms/alarm --param cron "0 * * * *"

The trigger and action are associated with each other through a rule. The rule indicates which action runs when the scheduled trigger fires.

bx wsk rule create hourly_copy hourly analytics_update
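
Before relying on the hourly schedule, the wiring can be checked by invoking the action directly and then listing its recent activations (standard wsk CLI commands):

bx wsk action invoke analytics_update --blocking --result
bx wsk activation list analytics_update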

The Node.js Content Generation Application has a RESTful endpoint called analyticsUpdate. The service is called from the three Python applications that process news and multimedia. In addition, the IBM Cloud Function written in Node.js calls the Node.js Content Generation Application.

app.get("/analyticsUpdate", auth, function(req, res) {
  callAnalyticsWsk(req, res, sendResponseCallback);
});

The analyticsUpdate path then calls a function that ensures the IBM Cloud Function is executed only once at a time to avoid any data locks. A GET request with authorization credentials protects the function from attack.

// The 'request' HTTP client, the storedprocedure_running flag, and
// switchoffSQLprocessflag() are defined elsewhere in the application.
function callAnalyticsWsk(req, res, callback) {
  const authHeader = req.get('authorization'); // credentials from the incoming request
  const options = {
    method: 'GET',
    url: "<url>",
    headers: {
      'authorization': <auth>,
      'env': <env>
    }
  };

  if (storedprocedure_running == 0) {
    // Mark the analytics function as running so that concurrent calls are rejected.
    storedprocedure_running = 1;
    request(options, (error, response, body) => {
      if (error) {
        console.error('Error sending request to analytics_update wsk:', error);
        switchoffSQLprocessflag();
      } else {
        console.log('analytics_update wsk response body', body);
        if (JSON.parse(body).error == 'Response not yet ready.') {
          // The action is still running; clear the flag after five minutes.
          setTimeout(() => switchoffSQLprocessflag(), 300000);
        } else {
          switchoffSQLprocessflag();
        }
      }
    });
    const attemptMsg = process.env.NODE_ENV + " - Attempted request to analytics_update wsk";
    console.log(attemptMsg);
    sendResponseCallback(res, {'msg': attemptMsg}, "complete");
  } else {
    const runningMsg = process.env.NODE_ENV + " - analytics_update wsk already running";
    console.log(runningMsg);
    sendResponseCallback(res, {'msg': runningMsg}, "complete");
  }
}

The IBM Cloud Function written in Node.js is a simple example of an event-driven function that runs from a timer. The function code determines which environments (development, production, or both) the analytics update should run in and sends a request to the corresponding Node.js Content Generation Application to update the visualization AI insights data.

// envs, envsArr, options_dev, and options_prod are defined earlier in the action.
function main(params) {
    if (envs['dev']) {
        envsArr.push('dev');
        // Refresh the development content generation application.
        request(options_dev, (error, response, body) => {
            if (error) {
                console.log(error);
            } else {
                console.log(body);
            }
        });
    }

    if (envs['prod']) {
        envsArr.push('prod');
        // Refresh the production content generation application.
        request(options_prod, (error, response, body) => {
            if (error) {
                console.log(error);
            } else {
                console.log(body);
            }
        });
    }
}

The Node.js Embedded Dashboard application uses the pug template engine to include an iframe for the visualizations. First, the Node.js server is set up to use the pug engine and is pointed to the views directory that contains the pug files. The application uses the dashboards.js file to route requests to the appropriate dashboard.

var express = require('express');
var path = require('path');

var app = express();

// view engine setup
app.set('views', path.join(__dirname, '../', 'views'));
app.set('view engine', 'pug');

// route dashboard requests to routes/dashboards.js
var dashboards = require('./routes/dashboards.js');
app.use('/dashboards', dashboards);

User traffic is sent to the fantasy-football route within the dashboards.js file. The URL for the Watson visualization is set within the renderer.

router.get('/fantasy-football', authSetup.ensureAuthenticated, authSetup.checkUserAccess, track, function(req, res, next) {
  res.render('dashboards', { title: 'Fantasy Dashboards',
                             dashboard_url: <url> });
});

The dashboards.pug file is resolved from the views directory. The template accepts the parameters and creates an HTML page with the dashboard_url for ESPN Fantasy Football.

doctype html
html(style={margin: 0, padding: 0, height: '100%', overflow: 'hidden'})
  head
    title #{title}
  body(style={margin: 0, padding: 0, height: '100%', overflow: 'hidden'})
    iframe(width='100%' height='100%' frameborder=0 src=dashboard_url)

This blog has shown how ESPN Fantasy Football with Watson provides deep visualizations for ESPN broadcasters and editors. The player predictions and score distributions are highly precise so that you can select your best roster week over week. #WinWithWatson

Check back next time as I discuss the real-world use of ESPN Fantasy Football with Watson. To find out more, follow Aaron Baughman on Twitter: @BaughmanAaron.

The ESPN Fantasy Football logo is a trademark of ESPN, Inc. Used with permission of ESPN, Inc.