Note: This pattern is part of a composite pattern — a set of code patterns that can run as stand-alone applications or continue on from one another. This composite pattern consists of:
- Extend Watson Text Classification
- Correlate documents from different sources
- Client network banking (this pattern)
Knowing your client is an essential best practice because it is the foundation for all subsequent steps in the credit risk management process. To be successful, you must operate on pertinent, accurate, and timely information. However, client network information is scattered across many sources. This pattern collates real-time information about a client and its surrounding network, known as a client network, in a single place. It is targeted at relationship managers at banks who handle customer investments.
Relationship managers at banks handle client investments, and one of a financial advisor's most important considerations when investing a client's money is assessing the risk posed by changes in the client's environment. Investments are also affected by events in the wider ecosystem or client network, such as management changes, management defaults, share price deviations, credit rating changes, and strikes.
The pattern focuses on the most important events impacting an organization: it pulls real-time news from popular news sites and uses Watson Natural Language Understanding to extract the clients affected by each story. The application demonstrates a methodology for deriving customer insights with IBM Cloud, Watson services, Python Flask, and the Python NLTK.
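To make the entity-extraction step concrete, the sketch below parses an NLU-style `analyze` response and keeps only the organizations that are likely clients. The response shape mirrors the `entities` output Watson NLU returns; the helper name `extract_clients` and the relevance threshold are assumptions for illustration, not part of the pattern's code.

```python
# Illustrative sketch: pick likely client organizations out of a
# Watson NLU "analyze" response. The dict shape below follows NLU's
# documented entities output; the helper and threshold are assumed.

def extract_clients(nlu_response, min_relevance=0.5):
    """Return organization names NLU found with sufficient relevance."""
    clients = []
    for entity in nlu_response.get("entities", []):
        if (entity.get("type") == "Organization"
                and entity.get("relevance", 0) >= min_relevance):
            clients.append(entity["text"])
    return clients

# Example NLU-style response for a scraped news article
sample = {
    "entities": [
        {"type": "Organization", "text": "Acme Corp", "relevance": 0.91},
        {"type": "Person", "text": "Jane Doe", "relevance": 0.88},
        {"type": "Organization", "text": "Example Bank", "relevance": 0.32},
    ]
}

print(extract_clients(sample))  # ['Acme Corp']
```

Lowering `min_relevance` would also admit `"Example Bank"`; in practice the cutoff would be tuned against the configuration file the pattern uses for pruning.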
- The user interacts with the app UI to request relevant information corresponding to an event or a client.
- The web app UI interacts with the Python-Flask server to receive the required information from the appropriate API.
- The Flask APIs scrape real-time news from popular online news portals.
- The scraped data is sent to Watson Natural Language Understanding (NLU) to extract important entities.
- A configuration JSON file is read by the Flask app to further prune the results obtained from NLU.
- All the collected information is pushed back into the interactive UI.
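The flow above can be sketched as a single Flask endpoint. The route name, the stub helpers `scrape_news` and `prune`, and the config shape are placeholders invented for illustration — the real pattern's scraping and NLU calls live in its repository.

```python
# Minimal sketch of the request flow, assuming placeholder helpers for
# the scraping and NLU steps; names and config shape are illustrative.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Events of interest would normally come from a configuration JSON file.
CONFIG = {"events": ["Management Change", "Strike", "Credit Rating"]}

def scrape_news(client):
    # Placeholder: real code would scrape online news portals and run
    # the text through Watson NLU to tag the client and event.
    return [
        {"client": client, "event": "Strike",
         "headline": "Workers strike at plant"},
        {"client": client, "event": "Merger",
         "headline": "Merger talks begin"},
    ]

def prune(articles, config):
    # Keep only articles whose event appears in the configuration.
    return [a for a in articles if a["event"] in config["events"]]

@app.route("/client-news")
def client_news():
    client = request.args.get("client", "")
    return jsonify(prune(scrape_news(client), CONFIG))

# Exercise the endpoint with Flask's built-in test client
with app.test_client() as c:
    print(c.get("/client-news?client=Acme").get_json())
```

Only the "Strike" article survives pruning here, since "Merger" is not listed in `CONFIG["events"]`; the pruned list is what the UI would render.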
Get the detailed instructions in the README file. These steps show you how to:
- Clone the repo.
- Create the IBM Cloud services.
- Set up the application on IBM Cloud.
- Set up the application on your localhost.
- Run the Python application.