* Refactoring to remove duplicate code when using OpenAI API
* Adding docstrings
* Fix mypy issue
* Moved the retry mechanism into the openai_request function in openai_utils (see the request sketch after this commit list)
* Migrate OpenAI embedding encoder to use the openai_request util function.
* Adding docstrings.
* Fix pylint import errors
* Fix more pylint import errors
* Move construction of headers into openai_request and pass api_key as an input variable.
* Made _openai_text_completion_tokenization_details so it can be reused in PromptNode and OpenAIAnswerGenerator
* Add prompt truncation to the PromptNode (see the token-limit sketch after this commit list).
* Removed commented out test.
* Bump the version of tiktoken to 0.2.0 so we can use MODEL_TO_ENCODING to automatically determine the correct tokenizer for the requested model
* Change one method back to public
* Fixed a bug in token-length truncation: the answer length is now included in the truncation amount. Moved truncation higher up, to the PromptNode level.
* Fix pylint error
* Improved warning message
* Added _ensure_token_limit for HFLocalInvocationLayer. Removed max_length from the base PromptModelInvocationLayer so that max_length has a default value.
* Adding tests
* Expanded the docstrings
* Updated tests
* Update docstrings
* Update tests and go back to how USE_TIKTOKEN was used before.
* Update haystack/nodes/prompt/prompt_node.py
Co-authored-by: Agnieszka Marzec <97166305+agnieszka-m@users.noreply.github.com>
* Update haystack/nodes/prompt/prompt_node.py
Co-authored-by: Agnieszka Marzec <97166305+agnieszka-m@users.noreply.github.com>
* Update haystack/nodes/prompt/prompt_node.py
Co-authored-by: Agnieszka Marzec <97166305+agnieszka-m@users.noreply.github.com>
* Update haystack/nodes/retriever/_openai_encoder.py
Co-authored-by: Agnieszka Marzec <97166305+agnieszka-m@users.noreply.github.com>
* Update haystack/utils/openai_utils.py
Co-authored-by: Agnieszka Marzec <97166305+agnieszka-m@users.noreply.github.com>
* Update haystack/utils/openai_utils.py
Co-authored-by: Agnieszka Marzec <97166305+agnieszka-m@users.noreply.github.com>
* Updated docstrings, and added integration marks
* Remove comment
* Update test
* Fix test
* Update test
* Updated the openai_request function to work with the Azure API
* Fixed error in _openai_encoder.py
---------
Co-authored-by: Agnieszka Marzec <97166305+agnieszka-m@users.noreply.github.com>
Co-authored-by: Vladimir Blagojevic <dovlex@gmail.com>
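
The PR above centralizes OpenAI calls behind openai_request and adds token-aware prompt truncation. Below is a minimal sketch, assuming the requests and tenacity packages, of how a retry-wrapped request helper that builds its headers from an api_key argument could look; the signature and the OpenAIRateLimitError name are assumptions, not the actual haystack.utils.openai_utils code.

```python
# Minimal sketch, assuming requests and tenacity; openai_request and
# OpenAIRateLimitError are illustrative stand-ins, not the real implementation.
import requests
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential


class OpenAIRateLimitError(Exception):
    """Raised on HTTP 429 so that tenacity retries the request with backoff."""


@retry(
    retry=retry_if_exception_type(OpenAIRateLimitError),
    wait=wait_exponential(multiplier=1, min=1, max=10),
    stop=stop_after_attempt(5),
)
def openai_request(url: str, api_key: str, payload: dict, timeout: float = 30.0) -> dict:
    # Headers are built here, so callers only need to pass the api_key.
    headers = {"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"}
    response = requests.post(url, headers=headers, json=payload, timeout=timeout)
    if response.status_code == 429:
        raise OpenAIRateLimitError(response.text)
    response.raise_for_status()
    return response.json()
```

A companion sketch for the tokenizer lookup and truncation described in the commits, assuming tiktoken >= 0.2.0; tokenizer_for_model and ensure_token_limit are hypothetical stand-ins for _openai_text_completion_tokenization_details and _ensure_token_limit.

```python
# Minimal sketch of tokenizer selection and prompt truncation, assuming
# tiktoken >= 0.2.0; both helper names are hypothetical.
import tiktoken
from tiktoken.model import MODEL_TO_ENCODING


def tokenizer_for_model(model_name: str) -> tiktoken.Encoding:
    # MODEL_TO_ENCODING picks the encoding for the requested model,
    # falling back to the GPT-2 tokenizer for unknown names.
    return tiktoken.get_encoding(MODEL_TO_ENCODING.get(model_name, "gpt2"))


def ensure_token_limit(prompt: str, model_name: str, max_tokens_limit: int, answer_length: int) -> str:
    tokenizer = tokenizer_for_model(model_name)
    tokens = tokenizer.encode(prompt)
    # Reserve room for the answer so prompt plus completion stay within the limit.
    allowed_prompt_tokens = max_tokens_limit - answer_length
    if len(tokens) <= allowed_prompt_tokens:
        return prompt
    return tokenizer.decode(tokens[:allowed_prompt_tokens])
```

Keeping the retry, header, and tokenizer logic in shared helpers is what lets the PromptNode, OpenAIAnswerGenerator, and the embedding encoder reuse the same error handling and truncation behavior.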
* Starting to implement first pass at run_batch
* Started to add _flatten_input function
* First pass at the run_batch method (see the sketch after this commit list).
* Fixed bug
* Adding tests for run_batch
* Update docstrings
* Fix pylint and mypy issues
* Fix pylint issues
* Fix mypy issues
* Restructuring of run_batch tests
* Add minor language updates
* Adding more tests
* Update dev comments and call the static method differently
* Fixed the setting of the output variable
* Set output_variable in __init__ of PromptNode
* Make a one-liner
---------
Co-authored-by: agnieszka-m <amarzec13@gmail.com>
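
For the batching PR above, here is a minimal sketch of the input flattening plus the per-query loop; _flatten_input matches the commit name, but its body and the run_batch signature are illustrative only (the real method also handles documents and other pipeline inputs).

```python
# Minimal sketch of run_batch input flattening; only the query path is shown.
from typing import List, Union


def _flatten_input(queries: Union[str, List[Union[str, List[str]]]]) -> List[str]:
    """Normalize a single query, a flat list, or a list of lists into one flat list."""
    if isinstance(queries, str):
        return [queries]
    flat: List[str] = []
    for item in queries:
        flat.extend(item if isinstance(item, list) else [item])
    return flat


def run_batch(queries: Union[str, List[Union[str, List[str]]]]) -> List[str]:
    # Run the prompt once per flattened query and collect the results in order.
    return [f"answer for: {query}" for query in _flatten_input(queries)]


print(run_batch([["What is Haystack?", "What is a PromptNode?"], "How does run_batch work?"]))
```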
* Update the allowed models that can be used with PromptNode
* Added a try/except block around the config loading to skip over OpenAI models (see the sketch after this list).
* Fixing tests
* Adding a warning message
* Adding tests for different HF models that could be used in the PromptNode
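
A minimal sketch of the try/except guard mentioned above, assuming it wraps Hugging Face's AutoConfig and only warns when no config can be loaded (as happens for OpenAI model names); SUPPORTED_ARCHITECTURES and is_supported_model are hypothetical.

```python
# Minimal sketch of skipping the HF config check for OpenAI-style model names;
# SUPPORTED_ARCHITECTURES and is_supported_model are hypothetical names.
import logging

from transformers import AutoConfig

logger = logging.getLogger(__name__)

SUPPORTED_ARCHITECTURES = ("T5ForConditionalGeneration", "GPT2LMHeadModel")


def is_supported_model(model_name_or_path: str) -> bool:
    try:
        config = AutoConfig.from_pretrained(model_name_or_path)
    except OSError:
        # OpenAI names such as "text-davinci-003" have no Hugging Face config,
        # so loading raises and we skip the architecture check with a warning.
        logger.warning(
            "Could not load a Hugging Face config for %s; skipping the architecture check.",
            model_name_or_path,
        )
        return True
    return any(arch in SUPPORTED_ARCHITECTURES for arch in (config.architectures or []))
```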
* fix: Add a verbose option to PromptNode to let users understand the prompts being used #2 (see the sketch after this list)
* Add comments and a refactoring TODO note
* Fix the logging-fstring-interpolation pylint warning
* Update haystack/nodes/prompt/prompt_node.py
Co-authored-by: Massimiliano Pippi <mpippi@gmail.com>
---------
Co-authored-by: Vladimir Blagojevic <dovlex@gmail.com>
Co-authored-by: Massimiliano Pippi <mpippi@gmail.com>
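
Finally, a minimal sketch of the verbose option and the lazy %-style logging that satisfies pylint's logging-fstring-interpolation check; PromptSketch is a stand-in, not the real PromptNode, and its prompt rendering is simplified.

```python
# Minimal sketch of a verbose flag that logs the rendered prompt.
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class PromptSketch:
    def __init__(self, verbose: bool = False):
        self.verbose = verbose

    def prompt(self, template: str, **kwargs) -> str:
        rendered = template.format(**kwargs)
        if self.verbose:
            # %-style arguments keep pylint's logging-fstring-interpolation check quiet.
            logger.info("Prompt being sent to the model:\n%s", rendered)
        return rendered


node = PromptSketch(verbose=True)
node.prompt("Answer the question: {query}", query="What does the verbose option do?")
```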