---
title: "OpenRouterChatGenerator"
id: openrouterchatgenerator
slug: "/openrouterchatgenerator"
description: "This component enables chat completion with any model hosted on [OpenRouter](https://openrouter.ai/)."
---
# OpenRouterChatGenerator
This component enables chat completion with any model hosted on [OpenRouter](https://openrouter.ai/).
<div className="key-value-table">
| | |
| --- | --- |
| **Most common position in a pipeline** | After a [ChatPromptBuilder](../builders/chatpromptbuilder.mdx) |
| **Mandatory init variables** | `api_key`: An OpenRouter API key. Can be set with the `OPENROUTER_API_KEY` environment variable or passed to the `init()` method. |
| **Mandatory run variables** | `messages`: A list of [ChatMessage](../../concepts/data-classes/chatmessage.mdx) objects |
| **Output variables** | `replies`: A list of [ChatMessage](../../concepts/data-classes/chatmessage.mdx) objects |
| **API reference** | [OpenRouter](/reference/integrations-openrouter) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/openrouter |
</div>
## Overview
The `OpenRouterChatGenerator` enables you to use models from multiple providers (such as `openai/gpt-4o`, `anthropic/claude-3.5-sonnet`, and others) by making chat completion calls to the [OpenRouter API](https://openrouter.ai/docs/quickstart).
This generator also supports OpenRouter-specific features such as:
- Provider routing and model fallback, configurable with the `generation_kwargs` parameter at initialization or at runtime.
- Custom HTTP headers, supplied through the `extra_headers` parameter (both features are sketched in the example below).
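A minimal sketch of both features follows. The `provider` and `models` keys and the `HTTP-Referer`/`X-Title` headers follow OpenRouter's API conventions; the specific values here are illustrative assumptions, not recommendations:
```python
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

client = OpenRouterChatGenerator(
    model="openai/gpt-4o",
    generation_kwargs={
        "provider": {"order": ["openai", "azure"]},  # preferred provider order for routing
        "models": ["anthropic/claude-3.5-sonnet"],   # fallback models if the primary is unavailable
    },
    # Optional OpenRouter attribution headers; the values are placeholders
    extra_headers={"HTTP-Referer": "https://example.com", "X-Title": "My App"},
)
```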
This component uses the same `ChatMessage` format as other Haystack Chat Generators for structured input and output. For more information, see the [ChatMessage documentation](../../concepts/data-classes/chatmessage.mdx).
### Tool Support
`OpenRouterChatGenerator` supports function calling through the `tools` parameter, which accepts flexible tool configurations:
- **A list of Tool objects**: Pass individual tools as a list
- **A single Toolset**: Pass an entire Toolset directly
- **Mixed Tools and Toolsets**: Combine multiple Toolsets with standalone tools in a single list
This allows you to organize related tools into logical groups while also including standalone tools as needed.
```python
from haystack.tools import Tool, Toolset
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

# Minimal stub functions so the example is runnable
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

def add(a: float, b: float) -> float:
    return a + b

# Create individual tools; `parameters` is a JSON schema for the function's arguments
weather_tool = Tool(name="weather", description="Get weather info for a city", function=get_weather,
                    parameters={"type": "object", "properties": {"city": {"type": "string"}}, "required": ["city"]})
add_tool = Tool(name="add", description="Add two numbers", function=add,
                parameters={"type": "object", "properties": {"a": {"type": "number"}, "b": {"type": "number"}}, "required": ["a", "b"]})

# Group related tools into a toolset
math_toolset = Toolset([add_tool])

# Pass a mix of Toolset and standalone Tool objects to the generator
generator = OpenRouterChatGenerator(tools=[math_toolset, weather_tool])
```
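Continuing this sketch, a run may return tool calls instead of plain text; this hypothetical snippet inspects them on the reply `ChatMessage`:
```python
from haystack.dataclasses import ChatMessage

response = generator.run([ChatMessage.from_user("What is 2 + 3?")])
reply = response["replies"][0]
if reply.tool_calls:  # the model chose to call one of the configured tools
    print(reply.tool_calls[0].tool_name, reply.tool_calls[0].arguments)
else:
    print(reply.text)
```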
For more details on working with tools, see the [Tool](../../tools/tool.mdx) and [Toolset](../../tools/toolset.mdx) documentation.
### Initialization
To use this integration, you need an OpenRouter account with sufficient credits and an API key. You can provide the key with the `OPENROUTER_API_KEY` environment variable or through a [Secret](../../concepts/secret-management.mdx).
Then, install the `openrouter-haystack` integration:
```shell
pip install openrouter-haystack
```
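Both ways of providing the key look like this (a minimal sketch; `MY_OPENROUTER_KEY` is a placeholder variable name):
```python
from haystack.utils import Secret
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

# Default: the key is read from the OPENROUTER_API_KEY environment variable
llm = OpenRouterChatGenerator()

# Explicit: pass a Secret, for example one backed by a differently named variable
llm = OpenRouterChatGenerator(api_key=Secret.from_env_var("MY_OPENROUTER_KEY"))
```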
### Streaming
`OpenRouterChatGenerator` supports [streaming](guides-to-generators/choosing-the-right-generator.mdx#streaming-support) responses from the LLM, allowing tokens to be emitted as they are generated. To enable streaming, pass a callable to the `streaming_callback` parameter during initialization.
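For example, Haystack's built-in `print_streaming_chunk` utility can serve as the callback (a minimal sketch):
```python
from haystack.components.generators.utils import print_streaming_chunk
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

# Prints each token to stdout as the model generates it
llm = OpenRouterChatGenerator(streaming_callback=print_streaming_chunk)
```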
## Usage
### On its own
```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator
client = OpenRouterChatGenerator()
response = client.run(
    [ChatMessage.from_user("What are Agentic Pipelines? Be brief.")]
)
print(response["replies"][0].text)
```
With streaming and model routing:
```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator
client = OpenRouterChatGenerator(
    model="openrouter/auto",
    streaming_callback=lambda chunk: print(chunk.content, end="", flush=True),
)
response = client.run(
    [ChatMessage.from_user("What are Agentic Pipelines? Be brief.")]
)

# Check which model actually served the response
print("\n\nModel used:", response["replies"][0].meta["model"])
```
With multimodal inputs:
```python
from haystack.dataclasses import ChatMessage, ImageContent
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator
llm = OpenRouterChatGenerator(model="anthropic/claude-3.5-sonnet")
image = ImageContent.from_file_path("apple.jpg")
user_message = ChatMessage.from_user(content_parts=[
    "What does the image show? Max 5 words.",
    image,
])
response = llm.run([user_message])["replies"][0].text
print(response)
# Red apple on straw.
```
### In a pipeline
```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator
prompt_builder = ChatPromptBuilder()
llm = OpenRouterChatGenerator(model="openai/gpt-4o-mini")
pipe = Pipeline()
pipe.add_component("builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("builder.prompt", "llm.messages")
messages = [
    ChatMessage.from_system("Give brief answers."),
    ChatMessage.from_user("Tell me about {{city}}"),
]

response = pipe.run(
    data={"builder": {"template": messages, "template_variables": {"city": "Berlin"}}}
)
print(response)
```
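To read only the generated text, index into the generator's output by component name (assuming the output structure shown above):
```python
# Pipeline outputs are keyed by component name; the generator returns "replies"
print(response["llm"]["replies"][0].text)
```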