---
title: "MistralChatGenerator"
id: mistralchatgenerator
slug: "/mistralchatgenerator"
description: "This component enables chat completion using Mistral’s text generation models."
---

# MistralChatGenerator

This component enables chat completion using Mistral’s text generation models.

<div className="key-value-table">

| | |
| --- | --- |
| **Most common position in a pipeline** | After a [ChatPromptBuilder](../builders/chatpromptbuilder.mdx) |
| **Mandatory init variables** | `api_key`: The Mistral API key. Can be set with the `MISTRAL_API_KEY` env var. |
| **Mandatory run variables** | `messages`: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects |
| **Output variables** | `replies`: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects <br /> <br />`meta`: A list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on |
| **API reference** | [Mistral](/reference/integrations-mistral) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/mistral |

</div>

## Overview

This integration supports Mistral’s models provided through the generative endpoint. For a full list of available models, check out the [Mistral documentation](https://docs.mistral.ai/platform/endpoints/#generative-endpoints).

`MistralChatGenerator` needs a Mistral API key to work. You can provide it in either of two ways (see the sketch after this list):

- The `api_key` init parameter, using the [Secret API](../../concepts/secret-management.mdx)
- The `MISTRAL_API_KEY` environment variable (recommended)
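
For instance, here's a minimal sketch of both options (the environment-variable default follows from the table above):

```python
from haystack.utils import Secret
from haystack_integrations.components.generators.mistral import MistralChatGenerator

# Option 1: wire the key explicitly through the Secret API
generator = MistralChatGenerator(api_key=Secret.from_env_var("MISTRAL_API_KEY"))

# Option 2: set the MISTRAL_API_KEY environment variable and
# rely on the component's default, which reads it automatically
generator = MistralChatGenerator()
```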

Currently, the available models are:

- `mistral-tiny` (default)
- `mistral-small`
- `mistral-medium` (soon to be deprecated)
- `mistral-large-latest`
- `codestral-latest`
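
To use a model other than the default, pass its name with the `model` init parameter. A minimal sketch:

```python
from haystack_integrations.components.generators.mistral import MistralChatGenerator

# Override the default model (mistral-tiny)
generator = MistralChatGenerator(model="mistral-large-latest")
```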

This component needs a list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects to operate. `ChatMessage` is a data class that contains a message, a role (who generated the message, such as `user`, `assistant`, `system`, or `function`), and optional metadata.
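
For illustration, a minimal sketch of building such a list (the message contents are arbitrary):

```python
from haystack.dataclasses import ChatMessage

# A short conversation: a system instruction followed by a user question
messages = [
    ChatMessage.from_system("You are a concise technical assistant."),
    ChatMessage.from_user("What is retrieval-augmented generation?"),
]
```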

Refer to the [Mistral API documentation](https://docs.mistral.ai/api/#operation/createChatCompletion) for more details on the parameters supported by the Mistral API, which you can provide with `generation_kwargs` when running the component.
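
For example, a minimal sketch of overriding `temperature` and `max_tokens` for a single call (both are standard Mistral chat-completion parameters):

```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.mistral import MistralChatGenerator

generator = MistralChatGenerator()  # reads MISTRAL_API_KEY from the environment

result = generator.run(
    [ChatMessage.from_user("What's Natural Language Processing? Be brief.")],
    generation_kwargs={"temperature": 0.2, "max_tokens": 100},
)
print(result["replies"][0].text)
```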

### Tool Support

`MistralChatGenerator` supports function calling through the `tools` parameter, which accepts flexible tool configurations:

- **A list of Tool objects**: Pass individual tools as a list
- **A single Toolset**: Pass an entire Toolset directly
- **Mixed Tools and Toolsets**: Combine multiple Toolsets with standalone tools in a single list

This allows you to organize related tools into logical groups while also including standalone tools as needed.

```python
from haystack.tools import Tool, Toolset
from haystack_integrations.components.generators.mistral import MistralChatGenerator

# Create individual tools (the required `parameters` and `function`
# arguments are elided here for brevity)
weather_tool = Tool(name="weather", description="Get weather info", ...)
news_tool = Tool(name="news", description="Get latest news", ...)

# Group related tools into a Toolset (add_tool, subtract_tool, and
# multiply_tool are assumed to be defined the same way)
math_toolset = Toolset([add_tool, subtract_tool, multiply_tool])

# Pass a mix of Toolsets and standalone Tools to the generator
generator = MistralChatGenerator(
    tools=[math_toolset, weather_tool, news_tool]  # Mix of Toolset and Tool objects
)
```

For more details on working with tools, see the [Tool](../../tools/tool.mdx) and [Toolset](../../tools/toolset.mdx) documentation.

### Streaming

This Generator supports [streaming](guides-to-generators/choosing-the-right-generator.mdx#streaming-support) the tokens from the LLM directly in the output. To do so, pass a function to the `streaming_callback` init parameter.
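
Haystack ships a ready-made `print_streaming_chunk` utility, used in the examples below. If you need custom handling, here's a minimal sketch of a callback that echoes tokens as they arrive:

```python
from haystack.dataclasses import StreamingChunk
from haystack_integrations.components.generators.mistral import MistralChatGenerator

def on_chunk(chunk: StreamingChunk) -> None:
    # Print each token as soon as it arrives, without a trailing newline
    print(chunk.content, end="", flush=True)

generator = MistralChatGenerator(streaming_callback=on_chunk)
```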

## Usage

Install the `mistral-haystack` package to use the `MistralChatGenerator`:

```shell
pip install mistral-haystack
```

#### On its own

```python
from haystack_integrations.components.generators.mistral import MistralChatGenerator
from haystack.components.generators.utils import print_streaming_chunk
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret

generator = MistralChatGenerator(
    api_key=Secret.from_env_var("MISTRAL_API_KEY"),
    streaming_callback=print_streaming_chunk,
)
message = ChatMessage.from_user("What's Natural Language Processing? Be brief.")
print(generator.run([message]))
```

With multimodal inputs:

```python
from haystack.dataclasses import ChatMessage, ImageContent
from haystack_integrations.components.generators.mistral import MistralChatGenerator

llm = MistralChatGenerator(model="pixtral-12b-2409")

image = ImageContent.from_file_path("apple.jpg")
user_message = ChatMessage.from_user(content_parts=[
    "What does the image show? Max 5 words.",
    image,
])

response = llm.run([user_message])["replies"][0].text
print(response)

# Red apple on straw.
```

#### In a Pipeline

Below is an example RAG pipeline where we answer questions based on the contents of a URL. We add those contents to our `messages` in the `ChatPromptBuilder` and generate an answer with the `MistralChatGenerator`.

```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.utils import print_streaming_chunk
from haystack.components.fetchers import LinkContentFetcher
from haystack.components.converters import HTMLToDocument
from haystack.dataclasses import ChatMessage

from haystack_integrations.components.generators.mistral import MistralChatGenerator

fetcher = LinkContentFetcher()
converter = HTMLToDocument()
prompt_builder = ChatPromptBuilder(variables=["documents"])
llm = MistralChatGenerator(streaming_callback=print_streaming_chunk, model="mistral-small")

message_template = """Answer the following question based on the contents of the article: {{query}}\n
Article: {{documents[0].content}} \n
"""
messages = [ChatMessage.from_user(message_template)]

rag_pipeline = Pipeline()
rag_pipeline.add_component(name="fetcher", instance=fetcher)
rag_pipeline.add_component(name="converter", instance=converter)
rag_pipeline.add_component("prompt_builder", prompt_builder)
rag_pipeline.add_component("llm", llm)

rag_pipeline.connect("fetcher.streams", "converter.sources")
rag_pipeline.connect("converter.documents", "prompt_builder.documents")
rag_pipeline.connect("prompt_builder.prompt", "llm.messages")

question = "What are the capabilities of Mixtral?"

result = rag_pipeline.run(
    {
        "fetcher": {"urls": ["https://mistral.ai/news/mixtral-of-experts"]},
        "prompt_builder": {"template_variables": {"query": question}, "template": messages},
        "llm": {"generation_kwargs": {"max_tokens": 165}},
    },
)
```

## Additional References

🧑‍🍳 Cookbook: [Web QA with Mixtral-8x7B-Instruct-v0.1](https://haystack.deepset.ai/cookbook/mixtral-8x7b-for-web-qa)