In this article, I will share my experience of building a chatbot with Watson for a live banking use case.
Chatbots and AI systems
Chatbots, or virtual digital assistants, have become very popular over the last year or so, and are a typical use case of an AI-based system. A chatbot tries to mimic a human agent in understanding and responding in natural language, often within a particular domain.
The field of natural language understanding in AI is yet to reach the human level of grasping the nuances of language, culture and thinking.
Moreover, machine learning and AI systems are probabilistic, inference-based and dependent on available data, so their accuracy comes only with a certain level of confidence.
However, many enterprises have embarked on an AI journey by deploying chatbots first, as they offer an enhanced user experience, a reduction in call-center queries, self-service and a feel of human touch.
Also, unlike traditional software projects, AI system development follows an iterative process of continuous training, testing, feedback, design and deployment.
An AI system is never 100% accurate; it only evolves in its accuracy of understanding.
Development Journey (Approach and learnings)
The entire development can be divided into phases which may not correspond to a traditional software development model.
The journey started with a proof of concept stage, then a detailed design and development, followed by iterative testing/training and finally integration/deployment to Production.
Scope: The scope was to implement a chatbot capable of answering 700+ FAQs on the Bank’s products and services. Including the variations of these questions, this translates to a bot capable of answering more than 3,000 questions.
Watson Conversation is the API at the heart of building chatbots. It has three main components – Intents, Entities and Dialog – which are used to understand and respond to natural language input.
- Intents: An intent maps the intention behind a user question – what a user “asks” to what a user “means”. There could be different ways in which a user asks the same thing; all such variations are classified under one single intent.
- Entities: An entity refers to specific keywords within the domain or a company’s products/offerings/services. These are usually nouns and their synonyms. Entities are used to extract or look up certain words within a user question, to understand the question beyond its intent.
- Dialog: The dialog is a flow chart of nodes which are executed for a user input. The nodes are conditional blocks which check for intents and/or entities; if the condition is satisfied, the response is given. For every expected or defined response, a dialog node is created which contains that response.
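To illustrate how these three components fit together, here is a minimal sketch in plain Python. The intent classifier is mocked with keyword checks (Watson learns this from examples instead), and the intent/entity names are illustrative:

```python
# Minimal sketch of intent + entity + dialog working together.
# classify() stands in for Watson's trained classifier.

def classify(text):
    """Return (intent, entities) for a user utterance (mocked)."""
    entities = [w for w in ("debit", "credit") if w in text.lower()]
    intent = "lostcard" if "lost" in text.lower() else "general"
    return intent, entities

def dialog_response(text):
    intent, entities = classify(text)
    # Each dialog node is a conditional block on intent and/or entities.
    if intent == "lostcard" and entities:
        return f"Blocking your {entities[0]} card and issuing a replacement."
    if intent == "lostcard":
        return "Which card did you lose - debit or credit?"
    return "Sorry, I did not understand. Could you rephrase?"

print(dialog_response("I lost my debit card"))
```

Note how the second node asks a clarifying question when the intent matches but no entity is found – a pattern that matters again in the failure scenarios discussed later.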
Watson Discovery is an API for fetching information from documents (PDF/Word/HTML) using either a natural language query or a structured, parameter-based query (passing entities/keywords as filters).
For more information, refer to the IBM Watson API documentation.
The architecture depicted above comprises three layers: the front-end systems (web browser/mobile app), a middleware orchestrator (REST API) and the backend (Watson Cloud APIs).
Front-End Systems – These represent the customer’s existing portals and mobile apps, which render the chat conversation and are hosted in the enterprise data center. They contain the chat interface through which users interact with the Watson APIs: it takes the user’s input, displays the response, and calls the orchestrator for every request/response.
Middleware (Orchestrator) – The orchestrator is a REST API which contains the logic to invoke the Watson APIs. This component handles requests/responses from/to the front-end systems, calls the backend Watson APIs in the IBM Cloud, and handles any communication with enterprise or third-party APIs. The orchestrator resides in the enterprise data center.
Backend (Watson APIs) – These are the Watson APIs hosted on the IBM Cloud and invoked by the orchestrator. Intents, entities and dialogs are configured using these APIs. Watson Conversation and Watson Discovery are the APIs used.
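The orchestrator's core job can be sketched as wrapping the front-end input into the payload shape the Conversation message endpoint expects and echoing back the conversation context between turns. The field names below follow the Conversation v1 API; no real network call is made here:

```python
# Sketch of the orchestrator's payload handling for one chat turn.
# The "input"/"context" structure mirrors the Watson Conversation v1
# message body; the helper itself is a hypothetical simplification.

def build_message_payload(user_text, context=None):
    return {
        "input": {"text": user_text},
        # The context returned by the previous turn is echoed back so
        # the service can keep track of the dialog state across turns.
        "context": context or {},
    }

payload = build_message_payload("What is my balance?",
                                {"conversation_id": "abc-123"})
print(payload)
```

In the real deployment this payload would be POSTed to the Conversation workspace, and the returned context stored for the next turn.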
Development Process: (An agile approach)
- Initial content collation and training
- User feedback and Continuous training
- Accuracy measurement and improvement
- Changes and continuous evolution of FAQ content are part of the process.
The bot was trained initially with the FAQ content available on the Bank’s websites. Based on this content, the intents, entities and dialog flow for the Conversation API were designed. The Discovery API was trained by uploading the FAQ documents to it and using ranked training of the questions and answers.
As part of the development process, a user-feedback-based testing approach was used: the business/testing teams check every question, along with its variations, to see if the bot provides the correct response. The testing results captured by these teams were then used to further train and improve the bot. This approach provides a lot of real user samples and additional training data.
The accuracy of the bot can be measured as correct responses / total test cases. Initially the chatbot’s accuracy was around 60%; with the continuous testing, feedback and retraining mechanism, more than 80% was achieved.
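The accuracy calculation is simple enough to show directly. The test log below is illustrative data, not the project's actual results:

```python
# Accuracy = correct responses / total test cases, computed from the
# business team's test log (entries here are made-up examples).

test_log = [
    {"question": "How do I reset my password?", "correct": True},
    {"question": "What is the FD interest rate?", "correct": True},
    {"question": "I lost my bag", "correct": False},
    {"question": "Open a savings account", "correct": True},
    {"question": "Card not working abroad", "correct": False},
]

accuracy = sum(r["correct"] for r in test_log) / len(test_log)
print(f"{accuracy:.0%}")  # 3 of 5 correct -> 60%
```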
Best practices for Watson APIs
- Initial design: For every user question/answer, the intents and entities should be identified before configuring them in the Conversation API. It is good to create a list of probable intents/entities keeping the enterprise use in mind. Intents are typically actions/processes, while entities are mostly products/services.
- Intent training criteria: An intent should be defined such that it clearly brings out the purpose/goal of the question. Intents should be neither too narrow (very specific) nor too broad (generic). While training an intent, a minimum of 5-10 variations (examples) should be provided – the more the better.
- Natural language vs. keyword search: Intents are learned from natural language examples, as opposed to keyword matching. This means that if we train an intent with a few examples, it is capable of understanding additional examples it has not seen before.
- Entities design: Entities are typically nouns or sets of possible values, e.g. products, lists of actions, services, lists of responses (positive/negative), bank names etc. Use as many synonyms as possible for a given word, and enable the fuzzy matching feature, now available, to take care of small spelling mistakes.
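The synonym-plus-fuzzy-matching idea can be sketched as follows. Watson's fuzzy matching is a built-in toggle; here `difflib` approximates the effect of tolerating small spelling mistakes, and the entity/synonym lists are examples:

```python
# Entity lookup over synonym lists, with simple fuzzy matching to
# absorb typos (approximating Watson's fuzzy-matching feature).

import difflib

ENTITIES = {
    "savings_account": ["savings account", "savings", "sb account"],
    "credit_card": ["credit card", "visa", "mastercard"],
}

def find_entities(text, cutoff=0.8):
    lowered = text.lower()
    found = []
    for entity, synonyms in ENTITIES.items():
        for syn in synonyms:
            # exact substring match, or a close fuzzy match per word
            if syn in lowered or difflib.get_close_matches(
                syn, lowered.split(), n=1, cutoff=cutoff
            ):
                found.append(entity)
                break
    return found

print(find_entities("How do I open a savigns account?"))
```

The misspelled "savigns" still resolves to the savings-account entity because its similarity to the synonym "savings" is above the cutoff.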
- Context variables: Use context variables to maintain the context of the conversation, e.g. the product/service being talked about, the customer currently logged in, user preferences, or the source of the request (portal/mobile). This helps provide an appropriate response based on the context of the conversation.
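A small sketch of what carrying context across turns buys you: a follow-up question that never names the product can still be answered. The product list and responses are illustrative:

```python
# Context carried across turns: the product mentioned earlier is
# remembered so a follow-up like "what are the charges?" resolves.

def handle_turn(text, context):
    context = dict(context)  # copy, so each turn's context is explicit
    for product in ("credit card", "home loan"):
        if product in text.lower():
            context["product"] = product  # remember what we're discussing
    if "product" not in context:
        return "Which product are you interested in?", context
    if "charges" in text.lower():
        return f"Here are the charges for your {context['product']}.", context
    return f"Sure, let's talk about your {context['product']}.", context

reply1, ctx = handle_turn("I want a home loan", {})
reply2, ctx = handle_turn("What are the charges?", ctx)
print(reply2)
```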
- Handling failure scenarios: Even with the correct intent identified from an input sentence, the bot can give a wrong response if entities and dialog are not used appropriately.
- Wrong entity: Let’s say we have an intent #lostcard and someone types “I lost my bag”. The intent is matched, so the bot could give a wrong response. To handle this, the response should only be given if the expected entities (e.g. debit/credit card) are found in the input, with an appropriate fallback response for any other value.
- Outside entity: Someone could type “need to open an account with XYZ bank”, where XYZ is a bank other than the enterprise. This can be handled by keeping a list of possible banks/enterprises, capturing them in the input and providing an appropriate response.
- False positive: The intent is #accountopening, but someone types “I don’t want to open an account”. This can be handled by having an entity which captures negation (not, don’t etc.) and using it to provide an appropriate response.
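The negation guard for the false-positive case can be sketched like this; the word list and responses are examples, and in Watson this would be a negation entity checked in the dialog node condition:

```python
# Guarding #accountopening against negated inputs: even when the
# classifier fires, a negation token flips the response.

NEGATIONS = {"not", "don't", "dont", "no", "never"}

def respond_account_opening(text):
    tokens = text.lower().split()
    if NEGATIONS.intersection(tokens):
        # intent matched, but the user is declining - don't proceed
        return "No problem - is there anything else I can help you with?"
    return "Great! Let me guide you through opening an account."

print(respond_account_opening("I don't want to open an account"))
```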
- Intelligent chat through dialog flow: The Conversation API is the best fit for transactional, dialog-based flows and short-tail questions with clearly identified intents. With customer profile and transaction data available at the backend, and by using machine learning and analytics capabilities, the dialog flow can be configured to ask the user relevant questions and make product recommendations and offers, besides the usual transactions like payments, balance enquiries and password resets. This gives the end user the experience of a genuinely intelligent agent.
- Example Use cases: Recommending products, making offers, making payments, getting balances, password resets, fund transfers.
- The Discovery API is best suited as an information retrieval system where content needs to be fetched from user manuals/documents.
- Questions whose intent is not very clear, or which are long with multiple intents, are trained in the Discovery API. It can also act as a fallback mechanism when the confidence of the Conversation API’s response is not high.
- The Discovery query language allows querying by passing entities/keywords as filters. This is a powerful feature for retrieving only the relevant content from a document.
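Composing such a query can be sketched as pairing the natural language question with a filter built from the entities found in it. The `enriched_text.entities.text` field follows Discovery's enrichment schema; the helper itself is hypothetical:

```python
# Sketch of a Discovery query: natural language question plus an
# entity-based filter, so only relevant passages come back.

def build_discovery_query(question, entities):
    # Discovery filter syntax: field:"value" pairs joined by "," (AND)
    filt = ",".join(f'enriched_text.entities.text:"{e}"' for e in entities)
    return {"natural_language_query": question, "filter": filt}

q = build_discovery_query("How do I close a fixed deposit?",
                          ["fixed deposit"])
print(q)
```

In the live system, the entities would come from the Conversation API's entity extraction on the same user question.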
Other considerations (User interface, feedback and Security)
- Feedback on each chat response is captured through thumbs-up and thumbs-down buttons. This feedback is then used to further improve the responses/accuracy of the bot.
- All user conversations are logged in a NoSQL Cloudant database for feedback/intent correction and for generating analytics/dashboards on chatbot usage.
- In order to comply with regulatory requirements regarding sending data to the cloud, customers’ personal information can be masked at the application layer, using regex patterns to identify PII.
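The masking step can be sketched as a set of regex substitutions applied before the text leaves the data center. The two patterns below (16-digit card numbers, email addresses) are examples only; a real deployment needs a much fuller, audited set:

```python
# PII masking at the application layer before text is sent to the
# cloud. Patterns are illustrative, not an exhaustive PII catalogue.

import re

PATTERNS = [
    # 16-digit card numbers, optionally space- or dash-separated
    (re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"), "[CARD]"),
    # simple email address pattern
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
]

def mask_pii(text):
    for pattern, token in PATTERNS:
        text = pattern.sub(token, text)
    return text

print(mask_pii("My card 4111 1111 1111 1111 is blocked, mail me at jo@x.com"))
```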
The true power of virtual assistants will be realised when they are fully capable of understanding the nuances of natural language, powered by machine learning, and of holding a personalised, contextual conversation with a customer. It will be a continuous journey until they reach this advanced stage, improving all the way.