---
title: "CohereGenerator"
id: coheregenerator
slug: "/coheregenerator"
description: "`CohereGenerator` enables text generation using Cohere's large language models (LLMs)."
---

# CohereGenerator

`CohereGenerator` enables text generation using Cohere's large language models (LLMs).

| | |
| --- | --- |
| **Most common position in a pipeline** | After a [`PromptBuilder`](../builders/promptbuilder.mdx) |
| **Mandatory init variables** | "api_key": The Cohere API key. Can be set with the `COHERE_API_KEY` or `CO_API_KEY` env var. |
| **Mandatory run variables** | "prompt": A string containing the prompt for the LLM |
| **Output variables** | "replies": A list of strings with all the replies generated by the LLM <br /> <br />"meta": A list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on |
| **API reference** | [Cohere](/reference/integrations-cohere) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/cohere |

This integration supports Cohere models such as `command`, `command-r`, and `command-r-plus`. Check out the most recent full list in the [Cohere documentation](https://docs.cohere.com/reference/chat).

## Overview

`CohereGenerator` needs a Cohere API key to work. You can provide this key through:

- The `api_key` init parameter using the [Secret API](/docs/secret-management)
- The `COHERE_API_KEY` environment variable (recommended)

The component needs a prompt to operate. You can also pass any text generation parameters directly to this component using the `generation_kwargs` parameter at initialization. For more details on the parameters supported by the Cohere API, refer to the [Cohere documentation](https://docs.cohere.com/reference/chat).

### Streaming

This Generator supports [streaming](/docs/choosing-the-right-generator#streaming-support) the tokens from the LLM directly in the output. To do so, pass a function to the `streaming_callback` init parameter.

## Usage

You need to install the `cohere-haystack` package to use the `CohereGenerator`:

```shell
pip install cohere-haystack
```

### On its own

Basic usage:

```python
from haystack_integrations.components.generators.cohere import CohereGenerator

client = CohereGenerator()
response = client.run("Briefly explain what NLP is in one sentence.")
print(response)

# {'replies': [...], 'meta': [{'finish_reason': 'COMPLETE'}]}
```

With streaming:

```python
from haystack_integrations.components.generators.cohere import CohereGenerator

client = CohereGenerator(streaming_callback=lambda chunk: print(chunk.content, end="", flush=True))
response = client.run("Briefly explain what NLP is in one sentence.")
print(response)
```

### In a pipeline

In a RAG pipeline:

```python
from haystack import Document, Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack_integrations.components.generators.cohere import CohereGenerator

docstore = InMemoryDocumentStore()
docstore.write_documents([
    Document(content="Rome is the capital of Italy"),
    Document(content="Paris is the capital of France"),
])

query = "What is the capital of France?"

template = """
Given the following information, answer the question.

Context:
{% for document in documents %}
{{ document.content }}
{% endfor %}

Question: {{ query }}?
"""

pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=docstore))
pipe.add_component("prompt_builder", PromptBuilder(template=template))
pipe.add_component("llm", CohereGenerator())
pipe.connect("retriever", "prompt_builder.documents")
pipe.connect("prompt_builder", "llm")

res = pipe.run({
    "prompt_builder": {"query": query},
    "retriever": {"query": query},
})

print(res)
```