## Why are these changes needed?
- Add `return_value_as_string` for formatting results from MCP tools (see the sketch below).
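A minimal sketch of what such a formatter does, assuming results follow the MCP Python SDK's `CallToolResult` shape (the function name and exact behavior here are illustrative, not the code in this PR):

```python
from mcp.types import CallToolResult, TextContent


def result_as_string(result: CallToolResult) -> str:
    """Illustrative only: join the text parts of an MCP tool result into one string."""
    parts: list[str] = []
    for item in result.content:
        if isinstance(item, TextContent):
            parts.append(item.text)
        else:
            # Non-text content (images, embedded resources) gets a type placeholder.
            parts.append(f"[{type(item).__name__}]")
    return "\n".join(parts)
```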
## Related issue number
- Opened issue: #6368
## Checks
- [x] I've included any doc changes needed for
<https://microsoft.github.io/autogen/>. See
<https://github.com/microsoft/autogen/blob/main/CONTRIBUTING.md> to
build and test documentation locally.
- [x] I've added tests (if relevant) corresponding to the changes
introduced in this PR.
- [x] I've made sure all auto checks have passed.
---------
Co-authored-by: Eric Zhu <ekzhu@users.noreply.github.com>
This PR introduces `Workbench`.
A workbench provides a group of tools that share the same resource and state. For example, `McpWorkbench` exposes the tools provided by an MCP server. A workbench allows tools to be managed together and abstracts away the lifecycle of the individual tools behind a single entity. This makes it possible to create agents with stateful tools from serializable configuration (component configs), and it also supports dynamic tools, whose set may change after each execution.
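As a rough sketch of the component-config angle, assuming `McpWorkbench` follows the usual `dump_component`/`load_component` component pattern (not code from this PR):

```python
from autogen_ext.tools.mcp import McpWorkbench, StdioServerParams

# Describe the MCP server; the workbench owns the server session and its tools.
params = StdioServerParams(command="npx", args=["@playwright/mcp@latest"])
workbench = McpWorkbench(server_params=params)

# Serialize the workbench to a component config and recreate it later.
config = workbench.dump_component()
restored = McpWorkbench.load_component(config)
```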
Here is how a workbench may be used with AssistantAgent (not included in
this PR):
```python
# Wrap the MCP server's tools in a workbench and hand it to the agent as a unit.
workbench = McpWorkbench(server_params)
agent = AssistantAgent("assistant", tools=workbench)
result = await agent.run(task="do task...")
```
TODOs:
1. In a subsequent PR, update `AssistantAgent` to accept a workbench as an
alternative to the `tools` parameter, and use `StaticWorkbench` to manage
individual tools (see the sketch after this list).
2. In another PR, add documentation on workbench.
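A minimal sketch of the planned `StaticWorkbench` usage mentioned above, assuming it simply wraps a fixed list of existing tools (illustrative of the follow-up, not code in this PR):

```python
from autogen_core.tools import FunctionTool, StaticWorkbench


def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


# Wrap plain function tools in a workbench so they can be managed as one unit.
workbench = StaticWorkbench([FunctionTool(add, description="Add two integers.")])
```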
---------
Co-authored-by: EeS <chiyoung.song@motov.co.kr>
Co-authored-by: Minh Đăng <74671798+perfogic@users.noreply.github.com>
Resolves #6232 and #6198.
This PR introduces an optional parameter `session` to `mcp_server_tools`
to support reuse of the same session.
```python
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import StdioServerParams, create_mcp_server_session, mcp_server_tools


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o", parallel_tool_calls=False)  # type: ignore
    params = StdioServerParams(
        command="npx",
        args=["@playwright/mcp@latest"],
        read_timeout_seconds=60,
    )
    async with create_mcp_server_session(params) as session:
        await session.initialize()
        tools = await mcp_server_tools(server_params=params, session=session)
        print(f"Tools: {[tool.name for tool in tools]}")

        agent = AssistantAgent(
            name="Assistant",
            model_client=model_client,
            tools=tools,  # type: ignore
        )
        termination = TextMentionTermination("TERMINATE")
        team = RoundRobinGroupChat([agent], termination_condition=termination)
        await Console(
            team.run_stream(
                task="Go to https://ekzhu.com/, visit the first link in the page, then tell me about the linked page."
            )
        )


asyncio.run(main())
```
Based on the discussion in #6284, serialization and deserialization of MCP
server tools used in this manner will be addressed in a separate issue.
This PR also replaces the `json_schema_to_pydantic` dependency with
built-in utils.
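For context, a minimal sketch of what such a conversion involves for flat schemas, using `pydantic.create_model` (the helper name is illustrative; the built-in utility handles nested and more complex schemas):

```python
from typing import Any, Optional

from pydantic import BaseModel, create_model

# Illustrative mapping from JSON Schema primitive types to Python types.
_TYPE_MAP: dict[str, Any] = {
    "string": str,
    "integer": int,
    "number": float,
    "boolean": bool,
    "array": list,
    "object": dict,
}


def model_from_schema(name: str, schema: dict[str, Any]) -> type[BaseModel]:
    """Build a Pydantic model from a flat JSON schema (illustration only)."""
    required = set(schema.get("required", []))
    fields: dict[str, Any] = {}
    for prop, spec in schema.get("properties", {}).items():
        py_type = _TYPE_MAP.get(spec.get("type", "string"), Any)
        # Required fields have no default; optional fields default to None.
        fields[prop] = (py_type, ...) if prop in required else (Optional[py_type], None)
    return create_model(name, **fields)
```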
Resolves #5745.
Also made sure `LLMCallEvent` is logged from all built-in model clients, and
added unit tests for coverage.
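For reference, one way to observe these events in an application, assuming they are emitted as `LogRecord` messages on the `EVENT_LOGGER_NAME` logger (a sketch, not part of this PR):

```python
import logging

from autogen_core import EVENT_LOGGER_NAME
from autogen_core.logging import LLMCallEvent


class LLMCallPrinter(logging.Handler):
    """Illustrative only: print each LLMCallEvent emitted by a model client."""

    def emit(self, record: logging.LogRecord) -> None:
        if isinstance(record.msg, LLMCallEvent):
            print(record.msg)


logger = logging.getLogger(EVENT_LOGGER_NAME)
logger.setLevel(logging.INFO)
logger.addHandler(LLMCallPrinter())
```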
---------
Co-authored-by: Ryan Sweet <rysweet@microsoft.com>
Co-authored-by: Victor Dibia <victordibia@microsoft.com>