---
title: "CohereChatGenerator"
id: coherechatgenerator
slug: "/coherechatgenerator"
description: "CohereChatGenerator enables chat completions using Cohere's large language models (LLMs)."
---

# CohereChatGenerator

CohereChatGenerator enables chat completions using Cohere's large language models (LLMs).

<div className="key-value-table">

| | |
| --- | --- |
| **Most common position in a pipeline** | After a [ChatPromptBuilder](../builders/chatpromptbuilder.mdx) |
| **Mandatory init variables** | `api_key`: The Cohere API key. Can be set with the `COHERE_API_KEY` or `CO_API_KEY` env var. |
| **Mandatory run variables** | `messages`: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects |
| **Output variables** | `replies`: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects <br /> <br />`meta`: A list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on |
| **API reference** | [Cohere](/reference/integrations-cohere) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/cohere |

</div>

This integration supports Cohere `chat` models such as `command`, `command-r`, and `command-r-plus`. Check out the most recent full list in the [Cohere documentation](https://docs.cohere.com/reference/chat).

## Overview

`CohereChatGenerator` needs a Cohere API key to work. You can set this key in:

- The `api_key` init parameter, using the [Secret API](../../concepts/secret-management.mdx)
- The `COHERE_API_KEY` environment variable (recommended)

The component also needs a prompt to operate, and you can pass any text generation parameters valid for Cohere's Chat API directly to this component through the `generation_kwargs` parameter, both at initialization and in the `run()` method. For more details on the parameters supported by the Cohere API, refer to the [Cohere documentation](https://docs.cohere.com/reference/chat).
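
For illustration, here is a minimal sketch that combines these options: the key is read through the Secret API from the `COHERE_API_KEY` environment variable, a default generation parameter is set at initialization, and another is passed at run time. `temperature` and `max_tokens` are example Cohere Chat API parameters, not a fixed recommendation.

```python
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret
from haystack_integrations.components.generators.cohere import CohereChatGenerator

# Read the API key from the environment using the Secret API and
# set a default generation parameter at initialization
generator = CohereChatGenerator(
    api_key=Secret.from_env_var("COHERE_API_KEY"),
    generation_kwargs={"temperature": 0.3},
)

# Parameters passed to run() apply to this call only
result = generator.run(
    [ChatMessage.from_user("Explain retrieval-augmented generation in one sentence.")],
    generation_kwargs={"max_tokens": 100},
)
print(result["replies"][0].text)
```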
Finally, the component needs a list of `ChatMessage` objects to operate. `ChatMessage` is a data class that contains a message, a role (who generated the message, such as `user`, `assistant`, `system`, `function`), and optional metadata.
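
For example, a short conversation with a system and a user message could be built like this (the prompt text is purely illustrative):

```python
from haystack.dataclasses import ChatMessage

# Each ChatMessage carries a role (system, user, assistant, ...) and its content
messages = [
    ChatMessage.from_system("You are a concise technical assistant."),
    ChatMessage.from_user("Explain tokenization in one sentence."),
]
```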
### Tool Support

`CohereChatGenerator` supports function calling through the `tools` parameter, which accepts flexible tool configurations:

- **A list of Tool objects**: Pass individual tools as a list
- **A single Toolset**: Pass an entire Toolset directly
- **Mixed Tools and Toolsets**: Combine multiple Toolsets with standalone tools in a single list

This allows you to organize related tools into logical groups while also including standalone tools as needed.

```python
from haystack.tools import Tool, Toolset
from haystack_integrations.components.generators.cohere import CohereChatGenerator

# Create individual tools
weather_tool = Tool(name="weather", description="Get weather info", ...)
news_tool = Tool(name="news", description="Get latest news", ...)

# Group related tools into a toolset
# (add_tool, subtract_tool, and multiply_tool are assumed to be defined elsewhere)
math_toolset = Toolset([add_tool, subtract_tool, multiply_tool])

# Pass mixed tools and toolsets to the generator
generator = CohereChatGenerator(
    tools=[math_toolset, weather_tool, news_tool]  # Mix of Toolset and Tool objects
)
```

For more details on working with tools, see the [Tool](../../tools/tool.mdx) and [Toolset](../../tools/toolset.mdx) documentation.

### Streaming

This Generator supports [streaming](guides-to-generators/choosing-the-right-generator.mdx#streaming-support) the tokens from the LLM directly in the output. To do so, pass a function to the `streaming_callback` init parameter.
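
For example, here is a minimal sketch that streams tokens to standard output using Haystack's built-in `print_streaming_chunk` utility; any callable that accepts a `StreamingChunk` works as well.

```python
from haystack.components.generators.utils import print_streaming_chunk
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.cohere import CohereChatGenerator

# Tokens are printed to stdout as they arrive from the model
generator = CohereChatGenerator(streaming_callback=print_streaming_chunk)
generator.run([ChatMessage.from_user("Write a one-line haiku about the sea.")])
```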
## Usage

You need to install the `cohere-haystack` package to use the `CohereChatGenerator`:

```shell
pip install cohere-haystack
```
#### On its own

```python
from haystack_integrations.components.generators.cohere import CohereChatGenerator
from haystack.dataclasses import ChatMessage

# Expects the Cohere API key in the COHERE_API_KEY environment variable
generator = CohereChatGenerator()
message = ChatMessage.from_user("What's Natural Language Processing? Be brief.")
print(generator.run([message]))
```
With multimodal inputs:

```python
from haystack.dataclasses import ChatMessage, ImageContent
from haystack_integrations.components.generators.cohere import CohereChatGenerator

# Use a multimodal model like Command A Vision
llm = CohereChatGenerator(model="command-a-vision-07-2025")

image = ImageContent.from_file_path("apple.jpg")
user_message = ChatMessage.from_user(content_parts=[
    "What does the image show? Max 5 words.",
    image
])

response = llm.run([user_message])["replies"][0].text
print(response)

# Red apple on straw.
```
#### In a Pipeline

You can also use `CohereChatGenerator` with Cohere chat models in your pipelines.

```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.cohere import CohereChatGenerator

pipe = Pipeline()
pipe.add_component("prompt_builder", ChatPromptBuilder())
pipe.add_component("llm", CohereChatGenerator())
pipe.connect("prompt_builder", "llm")

country = "Germany"
system_message = ChatMessage.from_system("You are an assistant giving out valuable information to language learners.")
messages = [system_message, ChatMessage.from_user("What's the official language of {{ country }}?")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"country": country}, "template": messages}})
print(res)
```