## Overview
Given a body of text (context) about a subject and questions about that subject, the model will answer questions based on the given context.
The model is based on the BERT model.
## Model Metadata
| Domain | Application | Industry | Framework | Training Data | Input Data Format |
|---|---|---|---|---|---|
| Natural Language Processing (NLP) | Question and Answer | General | TensorFlow | SQuAD 1.1 | Text |
## References
- J. Devlin, M. Chang, K. Lee, K. Toutanova, “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding”, arXiv, 2018.
- Google BERT
- SQuAD Dataset and version 1.1 on the Google BERT repo
## Licenses
| Component | License | Link |
|---|---|---|
| Model GitHub repository | Apache 2.0 | LICENSE |
| Fine-tuned Model Weights | Apache 2.0 | LICENSE |
| Pre-trained Model Weights | Apache 2.0 | LICENSE |
| Model Code (3rd party) | Apache 2.0 | LICENSE |
## Options available for deploying this model
This model can be deployed using the following mechanisms:
- Deploy from Docker Hub:

  ```
  docker run -it -p 5000:5000 codait/max-question-answering
  ```
- Deploy on Kubernetes:

  ```
  kubectl apply -f https://raw.githubusercontent.com/IBM/MAX-Question-Answering/master/max-question-answering.yaml
  ```

  A more detailed tutorial on deploying this MAX model to production on IBM Cloud can be found here.
- Locally: follow the instructions in the model README on GitHub
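Whichever deployment option you choose, you can verify that the instance is up before sending predictions. A minimal sketch in Python, assuming the default port 5000 and the standard MAX `/model/metadata` route (adjust the URL if your deployment differs):

```python
import json
from urllib.request import urlopen


def is_up(url="http://localhost:5000/model/metadata"):
    """Return True if a model instance responds with metadata at `url`."""
    try:
        with urlopen(url, timeout=5) as resp:
            return "id" in json.load(resp)
    except (OSError, ValueError):
        # connection refused, timeout, or non-JSON response
        return False
```

Once the container has finished starting, `is_up()` should return `True`; until then it returns `False` instead of raising.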
## Example Usage
You can test or use this model in the following ways:
### Test the model using cURL
Once deployed, you can test the model from the command line. For example, if running locally:

```
curl -X POST "http://localhost:5000/model/predict" -H "accept: application/json" -H "Content-Type: application/json" -d "{\"paragraphs\": [{ \"context\": \"John lives in Brussels and works for the EU\", \"questions\": [\"Where does John live?\",\"What does John do?\",\"What is his name?\" ]},{ \"context\": \"Jane lives in Paris and works for the UN\", \"questions\": [\"Where does Jane live?\",\"What does Jane do?\" ]}]}"
```
You should see a JSON response like the following:

```json
{
  "status": "ok",
  "predictions": [
    [
      "Brussels",
      "works for the EU",
      "John"
    ],
    [
      "Paris",
      "works for the UN"
    ]
  ]
}
```
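The same request can be issued from Python. A sketch assuming a locally running instance and the third-party `requests` package; the `pair_answers` helper is hypothetical, added here only to line each answer up with its question:

```python
# Payload matching the cURL example above.
payload = {
    "paragraphs": [
        {
            "context": "John lives in Brussels and works for the EU",
            "questions": ["Where does John live?", "What does John do?", "What is his name?"],
        },
        {
            "context": "Jane lives in Paris and works for the UN",
            "questions": ["Where does Jane live?", "What does Jane do?"],
        },
    ]
}


def predict(payload, url="http://localhost:5000/model/predict"):
    """POST the payload to a running instance and return the predictions list."""
    import requests  # third-party: pip install requests

    resp = requests.post(url, json=payload)
    resp.raise_for_status()
    return resp.json()["predictions"]


def pair_answers(payload, predictions):
    """Zip each paragraph's questions with the corresponding answers."""
    return [
        list(zip(p["questions"], answers))
        for p, answers in zip(payload["paragraphs"], predictions)
    ]
```

With the response shown above, `pair_answers(payload, [["Brussels", "works for the EU", "John"], ["Paris", "works for the UN"]])[0][0]` gives `("Where does John live?", "Brussels")`.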
### Test the model in a notebook
The demo notebook walks through how to use the model to answer questions on a given corpus of text. By default, the notebook uses the hosted demo instance, but you can use a locally running instance.
Run the following command from the model repo base folder, in a new terminal window:

```
jupyter notebook
```

This will start the notebook server. You can launch the demo notebook by clicking on `samples/demo.ipynb`.
## Options available for training this model
This model can be trained using the following mechanisms:
- Train on IBM Cloud – Watson Machine Learning: follow the instructions in the model training README on GitHub.
## Resources and Contributions
If you are interested in contributing to the Model Asset Exchange project or have any queries, please follow the instructions here.