* Starting to implement first pass at run_batch
* Started to add _flatten_input function
* First pass at run_batch method.
* Fixed bug
* Adding tests for run_batch
* Update docstrings
* Pylint and mypy
* Pylint
* Fixing mypy
* Restructuring of run_batch tests
* Add minor language updates
* Adding more tests
* Update dev comments and call static method differently
* Fixed the setting of output_variable
* Set output_variable in __init__ of PromptNode
* Make a one-liner
---------
Co-authored-by: agnieszka-m <amarzec13@gmail.com>
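For readers following along, here is a minimal sketch of how the batched prompting described in the notes above might be exercised. It assumes the Haystack 1.x `PromptNode` API; the model name, queries, and the `output_variable` value are illustrative assumptions, not taken from these notes.

```python
# Minimal sketch, assuming the Haystack 1.x PromptNode API referenced above.
from haystack.nodes import PromptNode

node = PromptNode(
    model_name_or_path="google/flan-t5-base",  # illustrative local HF model
    output_variable="results",                 # set in __init__ per the notes above (assumed value)
)

# run_batch flattens the list of queries into individual prompt invocations
# (see the _flatten_input commit) and returns the usual (output_dict, edge) tuple.
output, _ = node.run_batch(queries=["What is Haystack?", "What is a PromptNode?"])
print(output)  # generated text is expected under the configured output_variable key
```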
* Update allowed models to be used with PromptNode
* Added a try/except block around the config to skip over OpenAI models.
* Fixing tests
* Adding warning message
* Adding test for different HF models that could be used in prompt node
* fix: Add a verbose option to PromptNode to let users understand the prompts being used #2
* Add comments and refactoring todo note
* Fix logging-fstring-interpolation pylint warning
* Update haystack/nodes/prompt/prompt_node.py
Co-authored-by: Massimiliano Pippi <mpippi@gmail.com>
---------
Co-authored-by: Vladimir Blagojevic <dovlex@gmail.com>
Co-authored-by: Massimiliano Pippi <mpippi@gmail.com>
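A similarly hedged sketch of the verbose option named in the fix above; whether `verbose` is a constructor argument or passed per call is an assumption here, not something these notes confirm, so check the released `PromptNode` API before relying on it.

```python
# Minimal sketch, assuming verbose is accepted by PromptNode.__init__;
# the flag's exact name and placement are assumptions to verify against the API.
from haystack.nodes import PromptNode

node = PromptNode(
    model_name_or_path="google/flan-t5-small",  # illustrative local HF model
    verbose=True,  # assumed flag: surfaces the fully rendered prompts sent to the model
)

# With the flag enabled, each resolved prompt is reported before generation,
# which is the "understand the prompts being used" behavior described above.
node("Summarize: Haystack is an open source framework for building NLP pipelines.")
```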