Docs: Update OpenAIGen docstrings and add missing headers (#8105)

* update docstrings

* Update haystack/components/generators/openai.py

Co-authored-by: Daria Fokina <daria.fokina@deepset.ai>

---------

Co-authored-by: Daria Fokina <daria.fokina@deepset.ai>
Agnieszka Marzec 2024-07-30 11:06:17 +02:00 committed by GitHub
parent 92e2377eff
commit e8598befb6
3 changed files with 14 additions and 18 deletions

haystack/components/generators/openai.py

@@ -18,25 +18,21 @@ logger = logging.getLogger(__name__)
 @component
 class OpenAIGenerator:
     """
-    Text generation component using OpenAI's large language models (LLMs).
+    Generates text using OpenAI's large language models (LLMs).
-    Enables text generation using OpenAI's large language models (LLMs). It supports gpt-4 and gpt-3.5-turbo
-    family of models.
+    It works with the gpt-4 and gpt-3.5-turbo models and supports streaming responses
+    from OpenAI API. It uses strings as input and output.
-    Users can pass any text generation parameters valid for the `openai.ChatCompletion.create` method
-    directly to this component via the `**generation_kwargs` parameter in __init__ or the `**generation_kwargs`
-    parameter in `run` method.
+    You can customize how the text is generated by passing parameters to the
+    OpenAI API. Use the `**generation_kwargs` argument when you initialize
+    the component or when you run it. Any parameter that works with
+    `openai.ChatCompletion.create` will work here too.
-    For more details on the parameters supported by the OpenAI API, refer to the OpenAI
-    [documentation](https://platform.openai.com/docs/api-reference/chat).
-    Key Features and Compatibility:
-    - Primary Compatibility: Designed to work seamlessly with gpt-4, gpt-3.5-turbo family of models.
-    - Streaming Support: Supports streaming responses from the OpenAI API.
-    - Customizability: Supports all parameters supported by the OpenAI API.
+    For details on OpenAI API parameters, see
+    [OpenAI documentation](https://platform.openai.com/docs/api-reference/chat).
-    Input and Output Format:
-    - String Format: This component uses the strings for both input and output.
+    ### Usage example
     ```python
     from haystack.components.generators import OpenAIGenerator
@@ -65,12 +61,12 @@ class OpenAIGenerator:
         max_retries: Optional[int] = None,
     ):
         """
-        Creates an instance of OpenAIGenerator. Unless specified otherwise in the `model`, OpenAI's GPT-3.5 is used.
+        Creates an instance of OpenAIGenerator. Unless specified otherwise in `model`, uses OpenAI's GPT-3.5.
         By setting the 'OPENAI_TIMEOUT' and 'OPENAI_MAX_RETRIES' you can change the timeout and max_retries parameters
         in the OpenAI client.
-        :param api_key: The OpenAI API key.
+        :param api_key: The OpenAI API key to connect to OpenAI.
         :param model: The name of the model to use.
         :param streaming_callback: A callback function that is called when a new token is received from the stream.
             The callback function accepts StreamingChunk as an argument.
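
For context on the behaviour the updated docstring describes, here is a minimal, self-contained usage sketch (not part of the diff). It assumes the `OPENAI_API_KEY` environment variable is set; the model name and generation parameters are illustrative.

```python
# Minimal sketch of the usage described in the updated docstring (illustrative values).
from haystack.components.generators import OpenAIGenerator
from haystack.dataclasses import StreamingChunk


def print_chunk(chunk: StreamingChunk) -> None:
    # Streaming callback: each token arrives as a StreamingChunk with a `content` string.
    print(chunk.content, end="", flush=True)


# Any parameter accepted by `openai.ChatCompletion.create` can be passed via
# `generation_kwargs`, either at init time or later in `run`. The OPENAI_TIMEOUT and
# OPENAI_MAX_RETRIES environment variables adjust the underlying OpenAI client.
generator = OpenAIGenerator(
    model="gpt-3.5-turbo",
    generation_kwargs={"temperature": 0.2, "max_tokens": 256},
    streaming_callback=print_chunk,
)

# The component takes a string prompt and returns strings in `replies`.
result = generator.run(prompt="Summarize what a Haystack component is in one sentence.")
print(result["replies"][0])
```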

haystack/components/retrievers/in_memory/embedding_retriever.py

@@ -20,7 +20,7 @@ class InMemoryEmbeddingRetriever:
     In indexing pipelines, use a DocumentEmbedder to embed documents.
     In query pipelines, use a TextEmbedder to embed queries and send them to the retriever.
-    Usage example:
+    ### Usage example
     ```python
     from haystack import Document
     from haystack.components.embedders import SentenceTransformersDocumentEmbedder, SentenceTransformersTextEmbedder
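
The usage example in this hunk is cut off at the imports. A self-contained sketch of how it might continue (the embedding model name and sample document are illustrative assumptions, not taken from the commit):

```python
from haystack import Document
from haystack.components.embedders import (
    SentenceTransformersDocumentEmbedder,
    SentenceTransformersTextEmbedder,
)
from haystack.components.retrievers.in_memory import InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()

# Indexing: a DocumentEmbedder embeds the documents before they are written to the store.
doc_embedder = SentenceTransformersDocumentEmbedder(model="sentence-transformers/all-MiniLM-L6-v2")
doc_embedder.warm_up()
docs = [Document(content="Haystack is an open-source framework for building LLM applications.")]
document_store.write_documents(doc_embedder.run(documents=docs)["documents"])

# Querying: a TextEmbedder embeds the query, and the retriever compares it to stored embeddings.
text_embedder = SentenceTransformersTextEmbedder(model="sentence-transformers/all-MiniLM-L6-v2")
text_embedder.warm_up()
query_embedding = text_embedder.run(text="What is Haystack?")["embedding"]

retriever = InMemoryEmbeddingRetriever(document_store=document_store)
result = retriever.run(query_embedding=query_embedding)
print(result["documents"])
```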

haystack/components/writers/document_writer.py

@@ -16,7 +16,7 @@ class DocumentWriter:
     """
     Writes documents to a DocumentStore.
-    Usage example:
+    ### Usage example
     ```python
     from haystack import Document
     from haystack.components.writers import DocumentWriter
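
The DocumentWriter example is likewise truncated by the hunk. A minimal sketch of how it might continue, assuming an in-memory store and a sample document (both illustrative, not part of the commit):

```python
from haystack import Document
from haystack.components.writers import DocumentWriter
from haystack.document_stores.in_memory import InMemoryDocumentStore

# The writer needs a DocumentStore to write into; InMemoryDocumentStore keeps everything in memory.
document_store = InMemoryDocumentStore()
writer = DocumentWriter(document_store=document_store)

docs = [Document(content="This is a sample document.")]
result = writer.run(documents=docs)

# The component reports how many documents were written.
print(result["documents_written"])
```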