IBM iX and Corporate team project members include: Brian Adams, Aaron Baughman, Karen Eickemeyer, Monica Ellingson, Stephen Hammer, Eythan Holladay, John Kent, William Padget, David Provan, Karl Schaffer, Andy Wismar


+ This content is part 4 of the 4-part series that describes AI Highlights at the Masters Golf Tournament.

Patrons from around the world have an insatiable demand for content from the iconic Augusta National Golf Club. This year’s tournament will draw particular interest as Tiger Woods attempts a comeback while Sergio Garcia defends his title. Perhaps Rickie Fowler will win his first major, or Justin Rose will rebound from his playoff loss in 2017. The personal stories around each player demand their own fanfare.

With the AI Highlights and Ranking system, patrons can follow custom video streams and content that tell the story of each player throughout the tournament. Automatic highlights, ranked across components of excitement such as commentator tone, player gesture, and crowd roar, are discoverable through our system. The Content Management System is the central repository for managing highlights, with support from Cloudant and Object Storage. Patrons who access the tournament’s digital platforms, such as the Masters website or mobile app, can customize the delivery of highlight-based content. They can view the best shots from their favorite highlights through the “My Experience” Application Programming Interface.

Video Highlights in the Content Management System

Our onsite video ingest application can poll another system for updates or receive content directly. We use JAX-RS to create a RESTful API implementation on Liberty 17.0.0.4. The application receives a JSON POST from the AI Highlights system when the crowd noise, commentator tone, speech-to-text, and gesture recognition analyses are complete. The content of the highlight is saved to a Cloudant database.


import numpy as np
import requests

# Average each recognition metric across the highlight's segments.
overall_excitement_avg = np.average(np.array(overall_excitement_metrics))
gesture_avg = np.average(np.array(gesture_metrics))
crowd_avg = np.average(np.array(crowd_metrics))
commentator_text_avg = np.average(np.array(commentator_text_metrics))
commentator_tones_avg = np.average(np.array(commentator_tones_metrics))
doc_retrieved["cognitiveInsights"] = {"overallMetric": overall_excitement_avg, "gestureMetric": gesture_avg, "scoringMetric": 0, "crowdMetric": crowd_avg, "commentatorText": commentator_text_avg, "commentatorTone": commentator_tones_avg}

# Post the highlight to the CMS, then persist it in Cloudant.
r = requests.post(self.cms_link, json=doc_retrieved)
self.cloudant_fifo_highlights.insert_document(doc_retrieved)
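
On the receiving side, a minimal JAX-RS resource for this ingest endpoint might look like the following sketch. The path, the string-to-JSON parsing, and the HighlightStore abstraction are illustrative assumptions, not the production code.


import java.io.StringReader;
import javax.json.Json;
import javax.json.JsonObject;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// A minimal sketch of the ingest endpoint; the path and store are illustrative.
@Path("/highlights")
public class HighlightIngestResource {

    // Hypothetical abstraction over the Cloudant highlight database.
    public interface HighlightStore {
        void save(JsonObject doc);
    }

    private final HighlightStore store;

    public HighlightIngestResource(HighlightStore store) {
        this.store = store;
    }

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response ingest(String body) {
        // The AI Highlights system posts this JSON once crowd noise,
        // commentator tone, speech-to-text, and gesture recognition finish.
        JsonObject highlight = Json.createReader(new StringReader(body)).readObject();
        store.save(highlight);
        return Response.status(Response.Status.CREATED).build();
    }
}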

The system was designed to auto-ingest content. During this process, we use a regex pattern on filenames to identify and categorize the videos stored in Cloudant. We use reflection within Liberty to instantiate the correct parser class, which determines the processing flow for the video, as sketched below. If the video routes as a ranked highlight clip, we add the content to our review queue on the Content Management System (CMS).
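
A minimal sketch of that routing step, assuming a hypothetical filename convention and parser class names (the real patterns and classes are not published in this article):


import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VideoRouter {

    // Hypothetical filename convention, e.g. "R2_H16_P1234_highlight.mp4".
    private static final Pattern FILENAME =
            Pattern.compile("R(\\d)_H(\\d{1,2})_P(\\d+)_(\\w+)\\.mp4");

    // Hypothetical map from filename category to parser class name.
    private static final Map<String, String> PARSERS = new HashMap<>();
    static {
        PARSERS.put("highlight", "com.example.ingest.RankedHighlightParser");
        PARSERS.put("interview", "com.example.ingest.InterviewParser");
    }

    // Categorize the file with the regex, then reflectively load its parser.
    public Object parserFor(String filename) throws Exception {
        Matcher m = FILENAME.matcher(filename);
        if (!m.matches()) {
            throw new IllegalArgumentException("Unrecognized filename: " + filename);
        }
        String parserClass = PARSERS.get(m.group(4));
        return Class.forName(parserClass).getDeclaredConstructor().newInstance();
    }
}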

Human Review and Approval

A human reviewer examines a queue of candidate highlights in the IBM AI Highlights approval queue. The queue can be sorted by player, round, or hole to expedite the approval of a particular group of highlights. If the highlight does not contain any fades, such as dissolves or cross cuts, then the clip was cut correctly. Next, the reviewer determines whether the highlight is exciting. A combination of domain expertise and the AI Highlight scores provides enough evidence for the final approval. When approved, the highlight should be relevant, exciting, and visually appealing.



We created several views within the database to allow our review team to quickly capture new ranked highlight clips.


// Cloudant map function: emit ranked highlight clips awaiting review.
function (doc) {
  if ((doc.clipType == 'cognitive' || doc.clipType == 'shot_highlight') &&
      doc.approved == "false" && doc.displayState != 'unapproved') {
    emit(doc._id, 1);
  }
}


Another view allows reviewers to access clips by player, round, or hole so that they can prioritize which highlight clips to examine.


// Cloudant map function: key each 2018 ranked highlight clip by [player, round, hole].
function (doc) {
  if ((doc.clipType == 'cognitive' || doc.clipType == 'shot_highlight') &&
      doc.sortDate.substring(0, 4) == "2018") {
    if (doc.tags && doc.tags.tag) {
      var playerId = '';
      var roundId = '';
      var holeId = '';
      // Pull the player, round, and hole identifiers from the tag list.
      for (var i = 0; i < doc.tags.tag.length; i++) {
        if (doc.tags.tag[i].type == 'hole') {
          holeId = doc.tags.tag[i].key;
        }
        if (doc.tags.tag[i].type == 'round') {
          roundId = doc.tags.tag[i].key;
        }
        if (doc.tags.tag[i].type == 'players') {
          playerId = doc.tags.tag[i].key;
        }
      }
      emit([playerId, roundId, holeId], 1);
    }
  }
}

After approval, the highlight is pushed to the My Experience API along with an overall excitement score. The overall excitement score is computed by a linear equation that combines all of the recognition scores; the weights are set before each event.


// Weighted linear combination of the recognition scores.
double overall = (gestureVal * gestureWeight) + (crowdVal * crowdWeight) + (commentatorVal * commentatorWeight) + (scoringVal * scoringWeight);

As a result, any query through the My Experience RESTful API will be able to access the approved clip for patron consumption.

My Experience API

The API behind My Moments receives three parameters that help the system retrieve ranked highlights: an epoch time representing the last time the user called the API, a list of favorite player identifiers, and the number of ranked highlights to return. With these three parameters, the system queries the database for the day’s top overall and favorite-player highlights that precede the previous search epoch time. The highlights are sorted by overall excitement level, using a comparator like the one below.


// Sort highlights in descending order of overall excitement.
private class HighlightSorter implements Comparator<CognitiveVideo> {
    @Override
    public int compare(CognitiveVideo o1, CognitiveVideo o2) {
        // Clips without a score fall back to a neutral 0.5.
        double o1Excitement = o1.getExcitementScore() != null ? o1.getExcitementScore() : 0.5;
        double o2Excitement = o2.getExcitementScore() != null ? o2.getExcitementScore() : 0.5;
        return Double.compare(o2Excitement, o1Excitement);
    }
}
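
Putting the three parameters together, the resource signature might look like the following sketch. The path, the parameter names, and the HighlightQuery abstraction are assumptions for illustration, not the production interface.


import java.util.List;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;

// Hypothetical resource signature for the My Experience highlight query.
@Path("/myexperience/highlights")
public class MyExperienceResource {

    // Assumed abstraction over the Cloudant highlight query.
    public interface HighlightQuery {
        List<CognitiveVideo> topHighlights(long sinceEpoch, List<String> playerIds, int count);
    }

    private final HighlightQuery query;

    public MyExperienceResource(HighlightQuery query) {
        this.query = query;
    }

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<CognitiveVideo> highlights(
            @QueryParam("since") long lastCallEpoch,       // epoch of the user's previous call
            @QueryParam("players") List<String> playerIds, // favorite player identifiers
            @QueryParam("count") int count) {              // number of ranked highlights to return
        // Fetch the day's top overall and favorite-player highlights,
        // already sorted by overall excitement (see HighlightSorter above).
        return query.topHighlights(lastCallEpoch, playerIds, count);
    }
}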

To help our users find the most relevant highlights, our API excludes clips below a predetermined excitement level and lets us weight favorite-player highlights above the daily highlights, as sketched below. We have been able to monitor and improve the API’s results and relevance during live play at the Masters.
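
A minimal sketch of that relevance pass, reusing the article’s CognitiveVideo type but assuming a getPlayerId() accessor and illustrative threshold and boost values (the real cutoff and weighting are set per event and not published here):


import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class HighlightRelevance {

    private static final double MIN_EXCITEMENT = 0.45; // assumed cutoff
    private static final double FAVORITE_BOOST = 1.25; // assumed multiplier

    // Drop low-excitement clips, boost favorite players, return the top N.
    public List<CognitiveVideo> rank(List<CognitiveVideo> clips,
                                     List<String> favoritePlayerIds,
                                     int limit) {
        return clips.stream()
                .filter(c -> c.getExcitementScore() != null
                        && c.getExcitementScore() >= MIN_EXCITEMENT)
                .sorted(Comparator.comparingDouble((CognitiveVideo c) ->
                        favoritePlayerIds.contains(c.getPlayerId()) // getPlayerId() is assumed
                                ? c.getExcitementScore() * FAVORITE_BOOST
                                : c.getExcitementScore()).reversed())
                .limit(limit)
                .collect(Collectors.toList());
    }
}


A multiplicative boost like this keeps favorite-player clips proportional to their underlying excitement instead of letting a dull clip from a favorite player outrank a roar-inducing shot from the daily highlights.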

Editor’s Dashboard

Video editors and producers create content based on the most dramatic and climactic moments of the Masters golf tournament. When they produce a story, the content producers have access to the AI Highlights through our web-enabled dashboard that runs on the IBM Cloud. Highlights can be found by player name, hole number, and round through the My Experience API. The user experience shows highlights sorted by overall excitement within a multimedia mosaic explorer. When a producer selects a highlight to view, the cut highlight is pulled from Object Storage and played in a web browser. Along with the video, the AI Highlight rankings from the CMS are displayed in concentric circles. The depicted features from the AI Highlights system include crowd roar, commentator excitement, and player gestures. For example, if an editor is looking for an exciting golf shot with a player pumping their fist, they can look at the player gesture score.

