---
title: "ToolInvoker"
id: toolinvoker
slug: "/toolinvoker"
description: "This component is designed to execute tool calls prepared by language models. It acts as a bridge between the language model's output and the actual execution of functions or tools that perform specific tasks."
---

# ToolInvoker

This component is designed to execute tool calls prepared by language models. It acts as a bridge between the language model's output and the actual execution of functions or tools that perform specific tasks.

| | |
| --- | --- |
| **Most common position in a pipeline** | After a Chat Generator |
| **Mandatory init variables** | “tools”: A list of [`Tools`](../../tools/tool.mdx) that can be invoked |
| **Mandatory run variables** | “messages”: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects from a Chat Generator containing tool calls |
| **Output variables** | “tool_messages”: A list of `ChatMessage` objects with the `tool` role. Each `ChatMessage` object wraps the result of a tool invocation. |
| **API reference** | [Tools](/reference/tools-api) |
| **GitHub link** | https://github.com/deepset-ai/haystack/blob/main/haystack/components/tools/tool_invoker.py |

## Overview

A `ToolInvoker` is a component that processes `ChatMessage` objects containing tool calls. It invokes the corresponding tools and returns the results as a list of `ChatMessage` objects. Each tool is defined with a name, description, parameters, and a function that performs the task. The `ToolInvoker` manages these tools and handles the invocation process.

You can pass multiple tools to the `ToolInvoker` component, and it will automatically choose the right tool to call based on the tool calls produced by a Language Model.

The `ToolInvoker` has two additional helpful parameters (see the configuration sketch after this list):

- `convert_result_to_json_string`: Use `json.dumps` (when `True`) or `str` (when `False`) to convert the tool result into a string.
- `raise_on_failure`: If `True`, the component raises an exception when a tool call fails. If `False`, it returns a `ChatMessage` object with `error=True` and a description of the error in `result`. Use this, for example, when you want to keep the Language Model running in a loop so it can fix its own errors.
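
Below is a minimal configuration sketch combining both parameters. `my_tool` is a hypothetical placeholder for any `Tool` you have already defined; it is not an object shipped with Haystack:

```python
from haystack.components.tools import ToolInvoker

# `my_tool` is a hypothetical Tool instance defined elsewhere in your code.
# raise_on_failure=False: a failing tool call comes back as a ChatMessage
# with error=True instead of raising an exception.
# convert_result_to_json_string=True: tool results are converted with json.dumps.
invoker = ToolInvoker(
    tools=[my_tool],
    raise_on_failure=False,
    convert_result_to_json_string=True,
)
```
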
:::info
ChatMessage and Tool Data Classes

Follow the links to learn more about [ChatMessage](../../concepts/data-classes/chatmessage.mdx) and [Tool](../../tools/tool.mdx) data classes.
:::

## Usage

### On its own

```python
from haystack.dataclasses import ChatMessage, ToolCall
from haystack.components.tools import ToolInvoker
from haystack.tools import Tool

# Tool definition
def dummy_weather_function(city: str):
    return f"The weather in {city} is 20 degrees."

parameters = {"type": "object",
              "properties": {"city": {"type": "string"}},
              "required": ["city"]}

tool = Tool(name="weather_tool",
            description="A tool to get the weather",
            function=dummy_weather_function,
            parameters=parameters)

# Usually, the ChatMessage with tool_calls is generated by a Language Model
# Here, we create it manually for demonstration purposes
tool_call = ToolCall(
    tool_name="weather_tool",
    arguments={"city": "Berlin"}
)
message = ChatMessage.from_assistant(tool_calls=[tool_call])

# ToolInvoker initialization and run
invoker = ToolInvoker(tools=[tool])
result = invoker.run(messages=[message])

print(result)
```

```
>> {
>>     'tool_messages': [
>>         ChatMessage(
>>             _role=<ChatRole.TOOL: 'tool'>,
>>             _content=[
>>                 ToolCallResult(
>>                     result='"The weather in Berlin is 20 degrees."',
>>                     origin=ToolCall(
>>                         tool_name='weather_tool',
>>                         arguments={'city': 'Berlin'},
>>                         id=None
>>                     )
>>                 )
>>             ],
>>             _meta={}
>>         )
>>     ]
>> }
```

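To read the tool's return value programmatically instead of printing the whole message, you can index into the `tool_messages` list. This is a minimal sketch; it assumes the `tool_call_result` convenience accessor on `ChatMessage`, so adjust it to whatever accessor your Haystack version provides:

```python
# Continues the snippet above. Assumes ChatMessage exposes a `tool_call_result`
# accessor that returns the ToolCallResult stored in the message (or None).
tool_message = result["tool_messages"][0]
tool_result = tool_message.tool_call_result
if tool_result is not None:
    print(tool_result.result)            # e.g. "The weather in Berlin is 20 degrees."
    print(tool_result.origin.tool_name)  # "weather_tool"
```
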
### In a pipeline

The following code snippet shows how to process a user query about the weather. First, we define a `Tool` for fetching weather data, initialize a `ToolInvoker` to execute it, and use an `OpenAIChatGenerator` to generate responses. A `ConditionalRouter` routes messages based on whether they contain tool calls. The pipeline connects these components, processes a user message asking for the weather in Berlin, and outputs the result.

```python
from typing import List
import random

from haystack import Pipeline
from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.routers import ConditionalRouter
from haystack.components.tools import ToolInvoker
from haystack.tools import Tool

# Define a dummy weather tool
def dummy_weather(location: str):
    return {"temp": f"{random.randint(-10, 40)} °C",
            "humidity": f"{random.randint(0, 100)}%"}

weather_tool = Tool(
    name="weather",
    description="A tool to get the weather",
    function=dummy_weather,
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
)

# Initialize the ToolInvoker with the weather tool
tool_invoker = ToolInvoker(tools=[weather_tool])

# Initialize the Chat Generator
chat_generator = OpenAIChatGenerator(model="gpt-4o-mini", tools=[weather_tool])

# Define routing conditions: replies with tool calls go to the ToolInvoker,
# everything else goes to the final output
routes = [
    {
        "condition": "{{replies[0].tool_calls | length > 0}}",
        "output": "{{replies}}",
        "output_name": "there_are_tool_calls",
        "output_type": List[ChatMessage],
    },
    {
        "condition": "{{replies[0].tool_calls | length == 0}}",
        "output": "{{replies}}",
        "output_name": "final_replies",
        "output_type": List[ChatMessage],
    },
]

# Initialize the ConditionalRouter
router = ConditionalRouter(routes, unsafe=True)

# Create the pipeline
pipeline = Pipeline()
pipeline.add_component("generator", chat_generator)
pipeline.add_component("router", router)
pipeline.add_component("tool_invoker", tool_invoker)

# Connect components
pipeline.connect("generator.replies", "router")
pipeline.connect("router.there_are_tool_calls", "tool_invoker.messages")

# Example user message
user_message = ChatMessage.from_user("What is the weather in Berlin?")

# Run the pipeline
result = pipeline.run({"messages": [user_message]})

# Print the result
print(result)
```

```
{
    'tool_invoker': {
        'tool_messages': [
            ChatMessage(
                _role=<ChatRole.TOOL: 'tool'>,
                _content=[
                    ToolCallResult(
                        result="{'temp': '33 °C', 'humidity': '79%'}",
                        origin=ToolCall(
                            tool_name='weather',
                            arguments={'location': 'Berlin'},
                            id='call_pUVl8Cycssk1dtgMWNT1T9eT'
                        ),
                        error=False
                    )
                ],
                _name=None,
                _meta={}
            )
        ]
    }
}
```

## Additional References

🧑‍🍳 Cookbooks:

- [Define & Run Tools](https://haystack.deepset.ai/cookbook/tools_support)
- [Newsletter Sending Agent with Haystack Tools](https://haystack.deepset.ai/cookbook/newsletter-agent)
- [Create a Swarm of Agents](https://haystack.deepset.ai/cookbook/swarm)