Add debugging example to tutorial (#1731)

* Add debugging example to tutorial

* Add latest docstring and tutorial changes

* Remove Objects suffix

* Add latest docstring and tutorial changes

* Revert "Remove Objects suffix"

This reverts commit 6681cb06510b080775994effe6a50bae42254be4.

* Revert unintentional commit

* Add third debugging option

* Add latest docstring and tutorial changes

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Branden Chan 2021-11-11 14:45:06 +01:00 committed by GitHub
parent 7059344d9e
commit 8082549663
3 changed files with 108 additions and 1 deletions


@@ -329,6 +329,35 @@ print_answers(res_2)
We have also designed a set of nodes that can be used to evaluate the performance of a system.
Have a look at our [tutorial](https://haystack.deepset.ai/docs/latest/tutorial5md) to get hands on with the code and learn more about Evaluation Nodes!
## Debugging Pipelines
You can print out debug information from nodes in your pipelines in a few different ways.
```python
# 1) You can set the `debug` attribute of a given node.
es_retriever.debug = True

# 2) You can provide `debug` as a parameter when running your pipeline
result = p_classifier.run(
    query="Who is the father of Arya Stark?",
    params={
        "ESRetriever": {
            "debug": True
        }
    }
)

# 3) You can provide the `debug` parameter to all nodes in your pipeline
result = p_classifier.run(
    query="Who is the father of Arya Stark?",
    params={
        "debug": True
    }
)

result["_debug"]
```
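Once you have the `_debug` payload, it can help to summarize it per node rather than dumping the whole dict. The sketch below assumes an illustrative schema in which each debug-enabled node maps to a dict with `"input"` and `"output"` entries; the `sample_debug` data and the `summarize_debug` helper are hypothetical, not part of the Haystack API.

```python
# Illustrative example: walking a `_debug`-style payload.
# ASSUMPTION: each debug-enabled node maps to a dict of recorded fields
# such as "input" and "output" (schema shown here is a stand-in).
sample_debug = {
    "ESRetriever": {
        "input": {"query": "Who is the father of Arya Stark?"},
        "output": {"documents": ["<Document: ...>"]},
    }
}

def summarize_debug(debug: dict) -> list:
    """Return one summary line per node, listing which fields were recorded."""
    lines = []
    for node_name, record in debug.items():
        recorded = ", ".join(sorted(record))
        lines.append(f"{node_name}: recorded {recorded}")
    return lines

for line in summarize_debug(sample_debug):
    print(line)
```

This keeps long retriever outputs out of your logs while still showing which nodes produced debug information.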
## YAML Configs


@@ -588,7 +588,7 @@
"## Evaluation Nodes\n",
"\n",
"We have also designed a set of nodes that can be used to evaluate the performance of a system.\n",
"Have a look at our [tutorial](https://haystack.deepset.ai/docs/latest/tutorial5md) to get hands on with the code and learn more about Evaluation Nodes!\n"
"Have a look at our [tutorial](https://haystack.deepset.ai/docs/latest/tutorial5md) to get hands on with the code and learn more about Evaluation Nodes!"
],
"metadata": {
"collapsed": false,
@@ -597,6 +597,55 @@
}
}
},
{
"cell_type": "markdown",
"source": [
"## Debugging Pipelines\n",
"\n",
"You can print out debug information from nodes in your pipelines in a few different ways."
],
"metadata": {
"collapsed": false,
"pycharm": {
"name": "#%% md\n"
}
}
},
{
"cell_type": "code",
"execution_count": null,
"outputs": [],
"source": [
"# 1) You can set the `debug` attribute of a given node.\n",
"es_retriever.debug = True\n",
"\n",
"# 2) You can provide `debug` as a parameter when running your pipeline\n",
"result = p_classifier.run(\n",
" query=\"Who is the father of Arya Stark?\",\n",
" params={\n",
" \"ESRetriever\": {\n",
" \"debug\": True\n",
" }\n",
" }\n",
")\n",
"\n",
"# 3) You can provide the `debug` parameter to all nodes in your pipeline\n",
"result = p_classifier.run(\n",
" query=\"Who is the father of Arya Stark?\",\n",
" params={\n",
" \"debug\": True\n",
" }\n",
")\n",
"\n",
"result[\"_debug\"]"
],
"metadata": {
"collapsed": false,
"pycharm": {
"name": "#%%\n"
}
}
},
{
"cell_type": "markdown",
"source": [


@@ -201,6 +201,35 @@ def tutorial11_pipelines():
    print_answers(res_2, details="minimum")

    print("#######################")
    print("# Debugging Pipelines #")
    print("#######################")

    # You can print out debug information from nodes in your pipelines in a few different ways.

    # 1) You can set the `debug` attribute of a given node.
    es_retriever.debug = True

    # 2) You can provide `debug` as a parameter when running your pipeline
    result = p_classifier.run(
        query="Who is the father of Arya Stark?",
        params={
            "ESRetriever": {
                "debug": True
            }
        }
    )

    # 3) You can provide the `debug` parameter to all nodes in your pipeline
    result = p_classifier.run(
        query="Who is the father of Arya Stark?",
        params={
            "debug": True
        }
    )

    pprint(result["_debug"])
if __name__ == "__main__":
    tutorial11_pipelines()