haystack/tutorials/Tutorial5_Evaluation.ipynb

{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Evalutaion\n",
"To be able to make a statement about the performance of a question-asnwering system, it is important to evalute it. Furthermore, evaluation allows to determine which parts of the system can be improved."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Start an Elasticsearch server\n",
"You can start Elasticsearch on your local machine instance using Docker. If Docker is not readily available in your environment (eg., in Colab notebooks), then you can manually download and execute Elasticsearch from source."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"a844e3ec4f41b5d2b24fe3d562e8302896baea1d0a761295998434c2de490714\r\n"
]
}
],
"source": [
"# Recommended: Start Elasticsearch using Docker\n",
"! docker run -d -p 9200:9200 -e \"discovery.type=single-node\" elasticsearch:7.6.2"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# In Colab / No Docker environments: Start Elasticsearch from source\n",
"#! wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.6.2-linux-x86_64.tar.gz -q\n",
"#! tar -xzf elasticsearch-7.6.2-linux-x86_64.tar.gz\n",
"#! chown -R daemon:daemon elasticsearch-7.6.2\n",
"\n",
"#import os\n",
"#from subprocess import Popen, PIPE, STDOUT\n",
"#es_server = Popen(['elasticsearch-7.6.2/bin/elasticsearch'],\n",
" stdout=PIPE, stderr=STDOUT,\n",
" preexec_fn=lambda: os.setuid(1) # as daemon\n",
" )\n",
"# wait until ES has started\n",
"#! sleep 30"
]
},
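{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before moving on, you can poll the REST API until Elasticsearch is ready instead of relying on a fixed `sleep`. This is an optional sketch using the standard `requests` library; adjust the host and port if yours differ."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional: poll until Elasticsearch responds instead of sleeping a fixed time\n",
"import time\n",
"import requests\n",
"\n",
"for _ in range(30):\n",
"    try:\n",
"        if requests.get(\"http://localhost:9200\").status_code == 200:\n",
"            print(\"Elasticsearch is up\")\n",
"            break\n",
"    except requests.exceptions.ConnectionError:\n",
"        pass\n",
"    time.sleep(2)"
]
},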
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"05/19/2020 09:03:25 - INFO - farm.utils - device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\n"
]
}
],
"source": [
"from farm.utils import initialize_device_settings\n",
"\n",
"device, n_gpu = initialize_device_settings(use_cuda=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"outputs": [],
"source": [
"\n",
"from haystack.indexing.io import fetch_archive_from_http\n",
"\n",
"# Download evaluation data, which is a subset of Natural Questions development set containing 50 documents\n",
"doc_dir = \"../data/nq\"\n",
"s3_url = \"https://s3.eu-central-1.amazonaws.com/deepset.ai-farm-qa/datasets/nq_dev_subset.json.zip\"\n",
"fetch_archive_from_http(url=s3_url, output_dir=doc_dir)"
],
"metadata": {
"collapsed": false,
"pycharm": {
"name": "#%%\n"
}
}
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"# Connect to Elasticsearch\n",
"from haystack.database.elasticsearch import ElasticsearchDocumentStore\n",
"\n",
"document_store = ElasticsearchDocumentStore(host=\"localhost\", username=\"\", password=\"\", create_index=False)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"05/19/2020 09:03:37 - INFO - elasticsearch - POST http://localhost:9200/_bulk [status:200 request:0.796s]\n",
"05/19/2020 09:03:38 - INFO - elasticsearch - POST http://localhost:9200/_bulk [status:200 request:0.222s]\n"
]
}
],
"source": [
"# Add evaluation data to Elasticsearch database\n",
"document_store.add_eval_data(\"../data/natural_questions/dev_subset.json\")"
]
},
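{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sanity check, you can query Elasticsearch's `_count` endpoint directly to confirm that the evaluation documents and labels were indexed. This optional sketch assumes the default `eval_document` and `feedback` indices used above and the standard `requests` library."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sanity check: count the indexed evaluation documents and labels via the ES REST API\n",
"import requests\n",
"\n",
"for index in [\"eval_document\", \"feedback\"]:\n",
"    count = requests.get(f\"http://localhost:9200/{index}/_count\").json()[\"count\"]\n",
"    print(f\"{index}: {count} entries\")"
]
},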
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize components of QA-System"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"# Initialize Retriever\n",
"from haystack.retriever.elasticsearch import ElasticsearchRetriever\n",
"\n",
"retriever = ElasticsearchRetriever(document_store=document_store)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"05/19/2020 09:03:46 - INFO - farm.utils - device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\n",
"05/19/2020 09:03:46 - INFO - farm.infer - Could not find `deepset/roberta-base-squad2` locally. Try to download from model hub ...\n",
"05/19/2020 09:03:50 - WARNING - farm.modeling.language_model - Could not automatically detect from language model name what language it is. \n",
"\t We guess it's an *ENGLISH* model ... \n",
"\t If not: Init the language model by supplying the 'language' param.\n",
"05/19/2020 09:03:56 - WARNING - farm.modeling.prediction_head - Some unused parameters are passed to the QuestionAnsweringHead. Might not be a problem. Params: {\"loss_ignore_index\": -1}\n",
"05/19/2020 09:04:02 - INFO - farm.utils - device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\n",
"05/19/2020 09:04:02 - INFO - farm.infer - Got ya 7 parallel workers to do inference ...\n",
"05/19/2020 09:04:02 - INFO - farm.infer - 0 0 0 0 0 0 0 \n",
"05/19/2020 09:04:02 - INFO - farm.infer - /w\\ /w\\ /w\\ /w\\ /w\\ /w\\ /w\\\n",
"05/19/2020 09:04:02 - INFO - farm.infer - /'\\ / \\ /'\\ /'\\ / \\ / \\ /'\\\n",
"05/19/2020 09:04:02 - INFO - farm.infer - \n"
]
}
],
"source": [
"# Initialize Reader\n",
"from haystack.reader.farm import FARMReader\n",
"\n",
"reader = FARMReader(\"deepset/roberta-base-squad2\")"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"# Initialize Finder which sticks together Reader and Retriever\n",
"from haystack.finder import Finder\n",
"\n",
"finder = Finder(reader, retriever)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Evaluation of Retriever"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/feedback/_search?scroll=5m&size=1000 [status:200 request:0.090s]\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.051s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.013s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.012s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.012s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.013s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.010s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.010s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.009s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.011s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.010s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.010s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.009s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.009s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:11 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.010s]\n",
"05/19/2020 09:04:11 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.009s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - GET http://localhost:9200/_search/scroll [status:200 request:0.011s]\n",
"05/19/2020 09:04:12 - INFO - elasticsearch - DELETE http://localhost:9200/_search/scroll [status:200 request:0.005s]\n",
"05/19/2020 09:04:12 - INFO - haystack.retriever.elasticsearch - For 59 out of 59 questions (100.00%), the answer was in the top-10 candidate passages selected by the retriever.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Retriever Recall: 1.0\n",
"Retriever Mean Avg Precision: 0.984934086629002\n"
]
}
],
"source": [
"# Evaluate Retriever on its own\n",
"retriever_eval_results = retriever.eval()\n",
"\n",
"## Retriever Recall is the proportion of questions for which the correct document containing the answer is\n",
"## among the correct documents\n",
"print(\"Retriever Recall:\", retriever_eval_results[\"recall\"])\n",
"## Retriever Mean Avg Precision rewards retrievers that give relevant documents a higher rank\n",
"print(\"Retriever Mean Avg Precision:\", retriever_eval_results[\"mean avg precision\"])"
]
},
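{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make these two metrics concrete, here is a minimal, self-contained sketch of how recall and mean average precision can be computed from ranked results. It uses toy data and plain Python; it is an illustration, not Haystack's internal implementation."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Toy illustration of retriever recall and mean average precision (MAP).\n",
"# Each entry: rank position (1-based) of the correct document for one question,\n",
"# or None if it was not retrieved at all.\n",
"ranks = [1, 3, None, 1, 2]\n",
"\n",
"# Recall: fraction of questions where the correct document was retrieved\n",
"recall = sum(r is not None for r in ranks) / len(ranks)\n",
"\n",
"# With a single relevant document per question, average precision is 1/rank;\n",
"# MAP averages this over all questions (0 if the document was missed)\n",
"mean_avg_precision = sum(1 / r for r in ranks if r is not None) / len(ranks)\n",
"\n",
"print(\"Recall:\", recall)  # 0.8\n",
"print(\"Mean Avg Precision:\", mean_avg_precision)  # (1 + 1/3 + 0 + 1 + 1/2) / 5"
]
},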
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Evaluation of Reader"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"05/19/2020 09:04:22 - INFO - elasticsearch - GET http://localhost:9200/feedback/_search?scroll=5m&size=1000 [status:200 request:0.007s]\n",
"05/19/2020 09:04:22 - INFO - elasticsearch - GET http://localhost:9200/_search/scroll [status:200 request:0.003s]\n",
"05/19/2020 09:04:22 - INFO - elasticsearch - DELETE http://localhost:9200/_search/scroll [status:200 request:0.001s]\n",
"05/19/2020 09:04:22 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search?scroll=5m&size=1000 [status:200 request:0.014s]\n",
"05/19/2020 09:04:22 - INFO - elasticsearch - GET http://localhost:9200/_search/scroll [status:200 request:0.002s]\n",
"05/19/2020 09:04:22 - INFO - elasticsearch - DELETE http://localhost:9200/_search/scroll [status:200 request:0.002s]\n",
"Evaluating: 100%|██████████| 64/64 [00:14<00:00, 4.28it/s]"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Reader Top-N-Recall: 0.5084745762711864\n",
"Reader Exact Match: 0.23728813559322035\n",
"Reader F1-Score: 0.23728813559322035\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\n"
]
}
],
"source": [
"# Evaluate Reader on its own\n",
"reader_eval_results = reader.eval(document_store=document_store, device=device)\n",
"\n",
"# Evaluation of Reader can also be done directly on a SQuAD-formatted file \n",
"# without passing the data to Elasticsearch\n",
"#reader_eval_results = reader.eval_on_file(\"../data/natural_questions\", \"dev_subset.json\", device=device)\n",
"\n",
"## Reader Top-N-Recall is the proportion of predicted answers that overlap with their corresponding correct answer\n",
"print(\"Reader Top-N-Recall:\", reader_eval_results[\"top_n_recall\"])\n",
"## Reader Exact Match is the proportion of questions where the predicted answer is exactly the same as the correct answer\n",
"print(\"Reader Exact Match:\", reader_eval_results[\"EM\"])\n",
"## Reader F1-Score is the average overlap between the predicted answers and the correct answers\n",
"print(\"Reader F1-Score:\", reader_eval_results[\"f1\"])"
]
},
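{
"cell_type": "markdown",
"metadata": {},
"source": [
"The sketch below illustrates how SQuAD-style Exact Match and F1 can be computed for a single prediction. It is a simplified illustration (no lowercasing or punctuation stripping), not the exact normalization that FARM applies."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Simplified SQuAD-style metrics for one question\n",
"from collections import Counter\n",
"\n",
"def exact_match(prediction, gold):\n",
"    # 1.0 only if the strings are identical\n",
"    return float(prediction == gold)\n",
"\n",
"def f1_score(prediction, gold):\n",
"    # Token-level overlap between prediction and gold answer\n",
"    pred_tokens, gold_tokens = prediction.split(), gold.split()\n",
"    common = Counter(pred_tokens) & Counter(gold_tokens)\n",
"    num_same = sum(common.values())\n",
"    if num_same == 0:\n",
"        return 0.0\n",
"    precision = num_same / len(pred_tokens)\n",
"    recall = num_same / len(gold_tokens)\n",
"    return 2 * precision * recall / (precision + recall)\n",
"\n",
"print(exact_match(\"the eiffel tower\", \"eiffel tower\"))  # 0.0\n",
"print(f1_score(\"the eiffel tower\", \"eiffel tower\"))  # 0.8"
]
},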
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Evaluation of Finder"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/feedback/_search?scroll=5m&size=1000 [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:57 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:57 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.008s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.007s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.004s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.004s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.005s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/eval_document/_search [status:200 request:0.006s]\n",
"05/19/2020 09:04:58 - INFO - haystack.retriever.elasticsearch - Got 10 candidates from retriever\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - GET http://localhost:9200/_search/scroll [status:200 request:0.003s]\n",
"05/19/2020 09:04:58 - INFO - elasticsearch - DELETE http://localhost:9200/_search/scroll [status:200 request:0.001s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.40 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.83 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.91 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.53 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.48 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.86 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.23 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.68 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.95 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.43 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.19 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.77 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.51 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.69 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.33 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.68 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.97 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.78 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.52 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 2.90 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.82 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.47 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.83 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.26 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.95 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.54 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.17 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.29 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:01<00:00, 2.64 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.22 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.79 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 3.17 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.56 Batches/s]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.98 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 19.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.17 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.89 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:01<00:00, 2.80 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.64 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 2.79 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:01<00:00, 1.50 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.58 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:01<00:00, 2.79 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.94 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 3.99 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.75 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.39 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.56 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.13 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.91 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.67 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.88 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.82 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 3.18 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:01<00:00, 2.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.16 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.77 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:01<00:00, 2.06 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 3.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.08 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.88 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.56 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.86 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.91 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 8.07 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.69 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 9.20 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.31 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.66 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.79 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.14 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.29 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.00 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.00 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.96 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.51 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.42 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.97 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.11 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.54 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.89 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 9.43 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.22 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.52 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.20 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.98 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.98 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 3.75 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.45 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.96 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.58 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.07 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.37 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.84 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.63 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 18.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.54 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.58 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.06 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.73 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 11.17 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.86 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.85 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.90 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.63 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.33 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.18 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.08 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.23 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.16 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.64 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.49 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.80 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.66 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.21 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.16 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.74 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.31 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.27 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.36 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.96 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.34 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.59 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.82 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.25 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.49 Batches/s]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.01 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 11.02 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.27 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.53 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.87 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.84 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.74 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.36 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.73 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 2.39 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.39 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.61 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:01<00:00, 1.85 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 9.27 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.63 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.02 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.35 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.28 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.29 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.59 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.94 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.11 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.52 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.06 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.74 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.03 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.49 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.56 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.53 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.97 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.96 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.80 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.26 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 62.85 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.86 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 11.94 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.64 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.22 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 24.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.13 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.90 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.03 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.63 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.45 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.61 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.13 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.64 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.66 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.90 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.69 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.01 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.11 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.79 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.00 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.01 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.68 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.45 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.99 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.91 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.73 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.33 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.96 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.70 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.78 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.59 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.95 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.83 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.78 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.39 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.59 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.42 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.95 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.59 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.64 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.00 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.79 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.03 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.59 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.48 Batches/s]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.83 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.67 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 10.14 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.77 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.95 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.58 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.76 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 29.85 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.61 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.83 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.28 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.24 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 27.26 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.80 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.23 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.66 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 10.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.42 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.08 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 9.75 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.73 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.32 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.07 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.06 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.50 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 11.30 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.33 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 26.64 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.75 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.80 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.02 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.44 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.26 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.23 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.77 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.85 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.21 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.89 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.69 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.14 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.32 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 3.80 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.02 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 23.44 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 2.37 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.01 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.34 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.02 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.23 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.17 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.83 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.80 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.43 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.48 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.65 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 58.54 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.75 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.28 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.85 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.78 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.18 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.82 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.51 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.42 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.67 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.99 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.07 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.89 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 3.22 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.67 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.63 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.23 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.68 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.73 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.18 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.86 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.43 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.95 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.61 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.17 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.97 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.13 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.86 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.03 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.74 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.41 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.00 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 26.84 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.11 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.17 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.00 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.73 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.87 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.94 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.00 Batches/s]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.15 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.08 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.66 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.15 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.22 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.98 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.23 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.77 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.43 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.56 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.86 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.24 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.99 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.80 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.98 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.21 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 10.65 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.03 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.39 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.16 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 9.06 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.37 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.27 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 9.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.98 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.76 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.82 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.94 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 10.95 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.06 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.01 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.28 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.53 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.95 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.78 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.69 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.17 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.49 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.74 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.74 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.59 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.54 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 17.25 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 26.22 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.35 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.02 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.76 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.68 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.56 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.07 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.24 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.76 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 19.85 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.01 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.19 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.37 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.08 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.71 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.11 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.66 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.16 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.53 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.54 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.58 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.76 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.36 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.03 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.98 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.01 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 6.69 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.98 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.48 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.51 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.24 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.42 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.85 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.49 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 2.42 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 2.20 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:02<00:00, 1.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 2.15 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:01<00:00, 1.83 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:01<00:00, 1.48 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:01<00:00, 1.64 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.78 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:01<00:00, 1.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.97 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.26 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.21 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.25 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.89 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.23 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.93 Batches/s]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.07 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.50 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.18 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.75 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.62 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.68 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.49 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.52 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.59 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 17.37 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.15 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.56 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.11 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.89 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.04 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.90 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.43 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.88 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.65 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.39 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.28 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.30 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.87 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.75 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.42 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.12 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.70 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.50 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 12.28 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.68 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.30 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.21 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.45 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.94 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.15 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.91 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 3.46 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 2.29 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.68 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:01<00:00, 2.58 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.16 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.97 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 7.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.43 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.71 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.35 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.86 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 4.26 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.53 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.51 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.37 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 24.83 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 3.79 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.57 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.05 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 13.77 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.43 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.48 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.87 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.56 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.40 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.10 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.66 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 5.13 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.79 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 2/2 [00:00<00:00, 7.65 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 6.07 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 8.36 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.24 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.56 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.92 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 1.84 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 5.00 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 6.91 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.99 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.99 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.60 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 14.55 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.67 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 9.03 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 4.28 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.34 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 11.72 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 16.78 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 3/3 [00:00<00:00, 3.81 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 7/7 [00:01<00:00, 4.28 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 4.20 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 5.09 Batches/s]\n",
"Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 2.27 Batches/s]\n",
"05/19/2020 09:09:54 - INFO - haystack.finder - 57 out of 59 questions were correctly answered (96.61%).\n",
"05/19/2020 09:09:54 - INFO - haystack.finder - 0 questions could not be answered due to the retriever.\n",
"05/19/2020 09:09:54 - INFO - haystack.finder - 2 questions could not be answered due to the reader.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Retriever Recall in Finder: 1.0\n",
"Retriever Mean Avg Precision in Finder: 0.984934086629002\n",
"Reader Recall in Finder: 0.9661016949152542\n",
"Reader Mean Avg Precision in Finder: 0.44187516814635447\n",
"Reader Exact Match in Finder: 0.9661016949152542\n",
"Reader F1-Score in Finder: 0.9661016949152542\n"
]
}
],
"source": [
"# Evaluate combination of Reader and Retriever through Finder\n",
"finder_eval_results = finder.eval()\n",
"\n",
"print(\"Retriever Recall in Finder:\", finder_eval_results[\"retriever_recall\"])\n",
"print(\"Retriever Mean Avg Precision in Finder:\", finder_eval_results[\"retriever_map\"])\n",
"\n",
"# Reader is only evaluated with those questions, where the correct document is among the retrieved ones\n",
"print(\"Reader Recall in Finder:\", finder_eval_results[\"reader_recall\"])\n",
"print(\"Reader Mean Avg Precision in Finder:\", finder_eval_results[\"reader_map\"])\n",
"print(\"Reader Exact Match in Finder:\", finder_eval_results[\"reader_em\"])\n",
"print(\"Reader F1-Score in Finder:\", finder_eval_results[\"reader_f1\"])"
]
},
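  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To inspect every metric returned by `finder.eval()` at once, you can iterate over the results dictionary. This is a minimal sketch, assuming `finder_eval_results` is a flat dict mapping metric names to numeric values:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Minimal sketch: list every metric returned by finder.eval().\n",
    "# Assumes finder_eval_results is a flat dict of metric name -> numeric value.\n",
    "for metric, value in finder_eval_results.items():\n",
    "    print(f\"{metric}: {value}\")"
   ]
  },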
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"name": "haystack",
"language": "python",
"display_name": "haystack"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}