Skill Level: Beginner

Demystify the jargon and fears surrounding AI, cognitive computing, and other relevant technical terms!

Created by IBM's GWC Interns


  1. Introduction

    Did you know that 12 million American patients are misdiagnosed each year? Our medical practices are highly advanced, but also very far from perfect.

    What if we could use technology to change that?

  2. What if…?

    Say that there was a system out there that could collect not only a patient’s health data, but also the data of those who showed similar symptoms. Imagine how much more information doctors would have at their fingertips!

    A system like this exists and runs on what’s called “machine learning,” or ML. ML is unique in that it allows a system to learn from data without being explicitly programmed. This means it not only saves time but also generates more accurate outcomes, purely because it can access and analyze more data than a human ever could.
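    To make “learning from data without explicit programming” concrete, here is a minimal sketch in Python. It uses a tiny, made-up set of symptom records (invented purely for illustration) and a nearest-neighbor rule: instead of hand-coding diagnosis rules, the program copies the label of the most similar past patient.

    ```python
    # Minimal sketch of "learning from data": a 1-nearest-neighbor
    # classifier over made-up patient symptom records. The symptom
    # flags and diagnoses below are invented for illustration only.

    def hamming_distance(a, b):
        """Count how many symptom flags differ between two patients."""
        return sum(x != y for x, y in zip(a, b))

    def predict(records, new_patient):
        """Label a new patient by copying the diagnosis of the most
        similar past patient -- no diagnosis rules are hand-coded."""
        nearest = min(records, key=lambda r: hamming_distance(r[0], new_patient))
        return nearest[1]

    # Each record: ([fever, cough, rash, fatigue], diagnosis)
    records = [
        ([1, 1, 0, 1], "flu"),
        ([0, 0, 1, 0], "allergy"),
        ([1, 0, 0, 1], "flu"),
        ([0, 1, 1, 0], "allergy"),
    ]

    print(predict(records, [1, 1, 0, 0]))
    ```

    Notice that adding more records changes the predictions without changing a single line of logic; that is the sense in which the system “learns” from data rather than from programming.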

    And believe it or not, this is only one way that ML can be utilized in health care! With all of this data, the possibilities are endless: we can help prevent diseases and even reduce hospital readmissions.

    If you’re still unsure about what ML is, watch this video!

  3. Can you dig it?

    Machine learning means that computers can digest data like online text, but what about conversations? If a teenager uses slang and euphemisms, would the machine be able to interpret that the way it would a formal document? What about jargon?

    In short, yes. Computers with natural language processing capabilities can understand not only the nuances of language, but also entirely different languages! Whether you’re discussing the law with your friends or explaining how ‘sick’ a new song is, it’s all the same. Natural language processing is like reading comprehension: the system recognizes the message and sifts through its given body of information to identify the most likely meaning before responding.
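    A real NLP system is far more sophisticated, but the “sift through information to find the most likely meaning” idea can be sketched in a few lines of Python. This toy example (the sense dictionary and sentences are invented for illustration) decides whether ‘sick’ means “ill” or “awesome” by counting how many context words match each sense.

    ```python
    # Toy sketch of word-sense disambiguation: pick the most likely
    # meaning of "sick" by scoring overlap with context words.
    # The sense clues and sentences are made up for illustration.

    SENSES = {
        "ill":     {"doctor", "fever", "hospital", "flu", "bed"},
        "awesome": {"song", "beat", "concert", "album", "track"},
    }

    def disambiguate(sentence):
        """Score each sense of 'sick' by counting context-word overlaps,
        then return the sense with the highest score."""
        words = set(sentence.lower().replace(",", " ").replace(".", " ").split())
        return max(SENSES, key=lambda sense: len(SENSES[sense] & words))

    print(disambiguate("That new song is sick, the beat is amazing"))
    print(disambiguate("I feel sick, I should see a doctor"))
    ```

    Real systems replace the hand-written sense clues with statistics learned from huge bodies of text, but the underlying move is the same: compare the message against known information and choose the most likely interpretation.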

    Visit this infographic to see how it works!

  4. Hi, AI!

    Now, before you click away in fear of robots taking over the world, let’s get educated about AI. AI commonly stands for artificial intelligence, but let’s think of it as augmented intelligence instead. The term augmented intelligence reframes the way we think about machine learning, focusing on developing computers that assist or enhance professionals rather than replace humans.

    AI isn’t actually that new. The concept of a machine with neural networks like those of humans was first published in 1943. Since then, AI has come a long way. From algorithms that predict your estimated time of arrival on the way to school, to smart email categorization that separates promotional emails from important ones, to online shopping recommendations and social media facial recognition filters, AI has found its place in many industries and in our lives.

  5. AI Applied

    Now that we’re thinking in terms of augmented intelligence rather than artificial intelligence, we can introduce one more term: cognitive computing, which refers to the application of AI within a complete system.

    Here’s an example of how this all comes together in cancer diagnosis:

    “Leveraging machine learning to extend the ability of the oncologist to diagnose cancer efficiently by helping him discover diagnosis patterns that he may not observe all by himself is cognitive computing. Cognitive computing refers to this end-end ecosystem that has machine learning as a part of it.”

  6. Elementary, My Dear Watson!

    Watson is a perfect example of how this all comes together. Watson is IBM’s cognitive computing platform that allows users to create with preset APIs and tools. To learn more about Watson, click here or visit this video!

  7. Harness Tech's Power

    Let’s recap with this quick video!

    We’ve covered cognitive computing, machine learning, augmented intelligence, and Watson. Now that we understand the tech, how will you harness its power to change the world?
