---
title: "OpenRouterChatGenerator"
id: openrouterchatgenerator
slug: "/openrouterchatgenerator"
description: "This component enables chat completion with any model hosted on [OpenRouter](https://openrouter.ai/)."
---

# OpenRouterChatGenerator

This component enables chat completion with any model hosted on [OpenRouter](https://openrouter.ai/).

| | |
| --- | --- |
| **Most common position in a pipeline** | After a [ChatPromptBuilder](../builders/chatpromptbuilder.mdx) |
| **Mandatory init variables** | `api_key`: An OpenRouter API key. Can be set with the `OPENROUTER_API_KEY` environment variable or passed to the `init()` method. |
| **Mandatory run variables** | `messages`: A list of [ChatMessage](../../concepts/data-classes/chatmessage.mdx) objects |
| **Output variables** | `replies`: A list of [ChatMessage](../../concepts/data-classes/chatmessage.mdx) objects |
| **API reference** | [OpenRouter](/reference/integrations-openrouter) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/openrouter |

## Overview

The `OpenRouterChatGenerator` enables you to use models from multiple providers (such as `openai/gpt-4o`, `anthropic/claude-3.5-sonnet`, and others) by making chat completion calls to the [OpenRouter API](https://openrouter.ai/docs/quickstart).

This generator also supports OpenRouter-specific features such as:

- Provider routing and model fallback, configurable through the `generation_kwargs` parameter at initialization or at runtime, as shown in the sketch below.
- Custom HTTP headers, supplied through the `extra_headers` parameter.

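
For example, a minimal sketch setting both at initialization. The `provider` and `models` keys follow the OpenRouter API schema for provider routing and model fallback; the specific providers, fallback model, and header value here are illustrative, not prescribed by the integration:

```python
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

client = OpenRouterChatGenerator(
    model="openai/gpt-4o-mini",
    generation_kwargs={
        # Preferred providers, tried in order (OpenRouter API schema)
        "provider": {"order": ["openai", "azure"]},
        # Fallback models if the primary model is unavailable
        "models": ["anthropic/claude-3.5-sonnet"],
    },
    # Optional custom HTTP header sent with every request (illustrative value)
    extra_headers={"HTTP-Referer": "https://example.com"},
)
```
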
This component uses the same `ChatMessage` format as other Haystack Chat Generators for structured input and output. For more information, see the [ChatMessage documentation](../../concepts/data-classes/chatmessage.mdx).

It is also fully compatible with Haystack [Tools](../../tools/tool.mdx) and [Toolsets](../../tools/toolset.mdx), which enable function calling with supported models.

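
As a minimal sketch of function calling, the tool below is hypothetical: the `get_weather` function, its schema, and the model choice are illustrative, not part of the integration.

```python
from haystack.dataclasses import ChatMessage
from haystack.tools import Tool
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

def get_weather(city: str) -> str:
    # Toy implementation for illustration only
    return f"Sunny in {city}"

weather_tool = Tool(
    name="get_weather",
    description="Get the current weather for a city.",
    parameters={
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
    function=get_weather,
)

client = OpenRouterChatGenerator(model="openai/gpt-4o-mini", tools=[weather_tool])
response = client.run([ChatMessage.from_user("What is the weather in Berlin?")])

# If the model opts to call the tool, the reply carries tool calls instead of text
print(response["replies"][0].tool_calls)
```
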
### Initialization

To use this integration, you must have an active OpenRouter subscription with sufficient credits and an API key. You can provide it with the `OPENROUTER_API_KEY` environment variable or by using a [Secret](../../concepts/secret-management.mdx).

Then, install the `openrouter-haystack` integration:

```shell
pip install openrouter-haystack
```
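
As a quick sketch of the `Secret` option (the environment-variable lookup shown here is also the component's default behavior):

```python
from haystack.utils import Secret
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

# Resolve the key from the OPENROUTER_API_KEY environment variable at runtime,
# so the key itself is never hard-coded or serialized with the component
client = OpenRouterChatGenerator(api_key=Secret.from_env_var("OPENROUTER_API_KEY"))
```
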
### Streaming

`OpenRouterChatGenerator` supports [streaming](guides-to-generators/choosing-the-right-generator.mdx#streaming-support) responses from the LLM, allowing tokens to be emitted as they are generated. To enable streaming, pass a callable to the `streaming_callback` parameter during initialization.
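
For example, a minimal sketch with a named callback (the `print_chunk` function is illustrative; each `StreamingChunk` carries the newly generated text in its `content` field):

```python
from haystack.dataclasses import StreamingChunk
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

def print_chunk(chunk: StreamingChunk) -> None:
    # Print each chunk of text as soon as it arrives, without buffering
    print(chunk.content, end="", flush=True)

client = OpenRouterChatGenerator(streaming_callback=print_chunk)
```
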
## Usage

### On its own

```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

client = OpenRouterChatGenerator()
response = client.run(
    [ChatMessage.from_user("What are Agentic Pipelines? Be brief.")]
)
print(response["replies"][0].text)
```

With streaming and model routing:

```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

client = OpenRouterChatGenerator(
    model="openrouter/auto",
    streaming_callback=lambda chunk: print(chunk.content, end="", flush=True),
)

response = client.run(
    [ChatMessage.from_user("What are Agentic Pipelines? Be brief.")]
)

# check the model used for the response
print("\n\n Model used: ", response["replies"][0].meta["model"])
```

### In a pipeline

```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.openrouter import OpenRouterChatGenerator

prompt_builder = ChatPromptBuilder()
llm = OpenRouterChatGenerator(model="openai/gpt-4o-mini")

pipe = Pipeline()
pipe.add_component("builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("builder.prompt", "llm.messages")

messages = [
    ChatMessage.from_system("Give brief answers."),
    ChatMessage.from_user("Tell me about {{city}}"),
]

response = pipe.run(
    data={"builder": {"template": messages,
                      "template_variables": {"city": "Berlin"}}}
)
print(response)
```