Behind every great event is some pretty great code. And that is no different for the Masters golf tournament at Augusta National Golf Club.

IBM and Augusta National wanted a dedicated space to better tell the story of innovation at the Masters over the years. So they built a 12’ x 20’ room, with a 7-foot, 270-degree high-definition video screen that virtually puts you on the golf course, with the sounds of birds in your ears and the feel of turf on your feet.

But this is not your ordinary immersive experience. This is a cognitive experience.

Machine learning. Visual recognition. Speech-to-text. Cognitive computing. All in action at the Masters tournament in Augusta, Georgia.

Watson recognizes people when they walk in. It listens to the conversation in the room. And it surrounds you with information that educates, informs, and enhances the overall experience.

Bluemix
Full documentation on getting started with each Watson service is available in Bluemix.

With Watson, the room itself becomes an active participant in the action, tailoring the lighting, sound and experience to suit the audience’s needs. It can even contribute relevant information that advances the conversation.

The technology used to create this cognitive space at the Masters is beyond cutting edge; some of it is downright experimental. In particular, Project Intu, an openly available but experimental service on IBM Bluemix, has been key to orchestrating this cognitive experience. Its agent-based architecture takes raw, unstructured data from sensors such as microphones and cameras and routes it to the appropriate Watson service, allowing the cognitive space to interact with the user.
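To make that pattern concrete, here is a minimal, hypothetical sketch of agent-based routing in Python. Intu itself is a C++ framework, so none of these class or method names come from the actual Intu SDK; the sketch only illustrates the idea of sensors publishing raw data to a shared blackboard and agents forwarding it to the right service.

```python
# Hypothetical sketch of the sensor-to-service routing pattern described above.
# These class and method names are illustrative, not the real Intu SDK API.
from collections import defaultdict

class Blackboard:
    """Shared message board: sensors publish raw data, agents subscribe by type."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, data_type, handler):
        self.subscribers[data_type].append(handler)

    def publish(self, data_type, payload):
        for handler in self.subscribers[data_type]:
            handler(payload)

class SpeechAgent:
    """Routes raw microphone audio toward a speech-to-text service (stubbed here)."""
    def __init__(self, blackboard):
        self.blackboard = blackboard
        blackboard.subscribe("audio", self.on_audio)

    def on_audio(self, audio_bytes):
        text = self.transcribe(audio_bytes)        # would call Watson Speech to Text
        self.blackboard.publish("text", text)      # downstream agents pick this up

    def transcribe(self, audio_bytes):
        return "where is phil mickelson right now" # stub for illustration

board = Blackboard()
SpeechAgent(board)
board.subscribe("text", lambda t: print("heard:", t))
board.publish("audio", b"\x00\x01")                # a microphone sensor would do this
```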

Here is an overview of Project Intu, and a link to the Intu SDK on GitHub. And here are three video blogs on working with Intu, including a piece on setting up Watson Conversation on Intu, creating a microphone sensor, and building emotional behaviors into the Intu platform.

Project Intu
Project Intu: embed cognitive functions in various form factors such as spaces, avatars, or other IoT devices.

There were four Watson APIs used in building the cognitive room experience at the Masters. First, the room is trained to recognize certain tour guides the moment they enter: a camera hidden in the ceiling captures each arrival, and Watson is fed labeled training images so it can properly identify the guides. The link to Watson Visual Recognition APIs, SDKs, and documentation is here.
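As an illustration, here is a rough sketch of how a custom classifier could be trained and queried against the Bluemix-era Visual Recognition v3 REST API. The API key, the zip archives of labeled face images, and the classifier name are all placeholders; this is not the code IBM used in the room.

```python
# A minimal sketch of training and calling a custom Visual Recognition classifier
# via the Bluemix-era v3 REST API. Key, file names, and classifier name are placeholders.
import json
import requests

API_KEY = "your-visual-recognition-api-key"   # placeholder
BASE = "https://gateway-a.watsonplatform.net/visual-recognition/api/v3"
PARAMS = {"api_key": API_KEY, "version": "2016-05-20"}

# Train: zips of labeled images (positive = tour guides, negative = other people).
with open("guides_positive.zip", "rb") as pos, open("not_guides.zip", "rb") as neg:
    resp = requests.post(
        f"{BASE}/classifiers", params=PARAMS,
        files={"guide_positive_examples": pos, "negative_examples": neg},
        data={"name": "tour_guides"},
    )
classifier_id = resp.json()["classifier_id"]
# In practice the classifier trains for several minutes; poll its status first.

# Classify: send a frame from the ceiling camera against the trained classifier.
with open("camera_frame.jpg", "rb") as frame:
    result = requests.post(
        f"{BASE}/classify", params=PARAMS,
        files={"images_file": frame},
        data={"parameters": json.dumps({"classifier_ids": [classifier_id]})},
    )
print(result.json())
```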

In addition, the tour guide can interact with the room using the spoken word. He or she can ask for a particular part of the experience to run, or even ask questions about the golf tournament in progress, like “Where is Phil Mickelson right now?”, to which Watson responds with detailed information.

To handle these requests, Intu receives the incoming communication and shepherds it through three different Watson services. First it translates the speech into text so that it can be analyzed. Then Watson discerns the meaning of the text and determines the appropriate response. And finally, that text response is translated back into speech and spoken to the room.

There are three Watson APIs used for this interaction: Speech to Text, Conversation, and Text to Speech.
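Intu handles this orchestration for you, but the underlying calls look roughly like the following sketch against the Bluemix-era REST endpoints. The credentials, Conversation workspace ID, and audio file names are placeholders.

```python
# A minimal sketch of the three-step pipeline: Speech to Text -> Conversation ->
# Text to Speech. Credentials and the workspace ID are placeholders.
import requests

STT_AUTH = ("stt-username", "stt-password")        # placeholder credentials
CONV_AUTH = ("conv-username", "conv-password")
TTS_AUTH = ("tts-username", "tts-password")
WORKSPACE_ID = "your-conversation-workspace-id"    # placeholder

# 1. Speech to Text: transcribe the spoken question.
with open("question.wav", "rb") as audio:
    stt = requests.post(
        "https://stream.watsonplatform.net/speech-to-text/api/v1/recognize",
        auth=STT_AUTH, headers={"Content-Type": "audio/wav"}, data=audio,
    ).json()
question = stt["results"][0]["alternatives"][0]["transcript"]

# 2. Conversation: discern the meaning and pick a response.
conv = requests.post(
    "https://gateway.watsonplatform.net/conversation/api/v1/"
    f"workspaces/{WORKSPACE_ID}/message",
    auth=CONV_AUTH, params={"version": "2017-02-03"},
    json={"input": {"text": question}},
).json()
answer = " ".join(conv["output"]["text"])

# 3. Text to Speech: synthesize the answer and play it back into the room.
speech = requests.post(
    "https://stream.watsonplatform.net/text-to-speech/api/v1/synthesize",
    auth=TTS_AUTH, params={"voice": "en-US_AllisonVoice"},
    headers={"Accept": "audio/wav"}, json={"text": answer},
)
with open("answer.wav", "wb") as out:
    out.write(speech.content)
```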

The applications of this technology are nearly limitless. For example, you could imagine Watson participating in a board-level discussion of an acquisition or merger, using Tradeoff Analytics to weigh the pros and cons of the transaction. Or offering the latest research to doctors in the process of diagnosing a patient’s symptoms. Or even analyzing statistics and cueing up film during halftime of an NFL game.

Augusta National is a special place. And the Masters is a one-of-a-kind event. So it’s only fitting that IBM would build a first-of-a-kind experience to communicate the uniqueness of the tournament to members and their guests, and, in so doing, introduce the power of Watson and cognitive rooms to the world at large.
