Add Documentation for AgentChat (#3635)

* update docs on agent chat.

* add langchain support

* fix formatting issues

* Update python/packages/autogen-core/docs/src/agentchat-user-guide/index.md

Co-authored-by: gagb <gagb@users.noreply.github.com>

* Update python/packages/autogen-core/docs/src/agentchat-user-guide/index.md

Co-authored-by: gagb <gagb@users.noreply.github.com>

* add company research and literature review examples with tools

* format fixes

* format and type fixes

* add selector groupchat to agentchat index page

* rename quick start as code exec

* type fixes

* format fix

* Remove blank cell from notebooks

---------

Co-authored-by: gagb <gagb@users.noreply.github.com>
Co-authored-by: Eric Zhu <ekzhu@users.noreply.github.com>
Victor Dibia 2024-10-08 14:35:54 -07:00 committed by GitHub
parent 02ced7c6b3
commit 4290cfa258
10 changed files with 1442 additions and 117 deletions

File diff suppressed because one or more lines are too long


@ -0,0 +1,54 @@
---
myst:
html_meta:
"description lang=en": |
Examples built using AgentChat, a high-level API for AutoGen
---
# Examples
A list of examples to help you get started with AgentChat.
:::::{grid} 2 2 2 3
::::{grid-item-card} Travel Planning
:img-top: ../../images/code.svg
:img-alt: travel planning example
:link: ./travel-planning.html
^^^
Generating a travel plan using multiple agents.
::::
::::{grid-item-card} Company Research
:img-top: ../../images/code.svg
:img-alt: company research example
:link: ./company-research.html
^^^
Generating a company research report using multiple agents with tools.
::::
::::{grid-item-card} Literature Review
:img-top: ../../images/code.svg
:img-alt: literature review example
:link: ./literature-review.html
^^^
Generating a literature review using agents with tools.
::::
:::::
```{toctree}
:maxdepth: 1
:hidden:
travel-planning
company-research
literature-review
```

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@ -4,7 +4,10 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Quick Start\n",
"# Code Execution\n",
"\n",
"\n",
"AgentChat offers a `CodeExecutorAgent` agent that can execute code in messages it receives. \n",
"\n",
":::{note}\n",
"See [here](pkg-info-autogen-agentchat) for installation instructions.\n",


@ -1,118 +1,236 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Tool Use\n",
"\n",
"The AgentChat API provides a `ToolUseAssistantAgent` with presets for adding tools that the agent can call as part of its response. \n",
"\n",
":::{note}\n",
"\n",
"The example presented here is a work in progress 🚧. Also, tool use here assumes that the `model_client` used by the agent supports tool calling. \n",
"::: "
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"from autogen_agentchat.agents import ToolUseAssistantAgent\n",
"from autogen_agentchat.teams.group_chat import RoundRobinGroupChat\n",
"from autogen_core.components.models import OpenAIChatCompletionClient\n",
"from autogen_core.components.tools import FunctionTool"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In AgentChat, a Tool is a function wrapped in the `FunctionTool` class exported from `autogen_core.components.tools`. "
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"async def get_weather(city: str) -> str:\n",
" return f\"The weather in {city} is 72 degrees and Sunny.\"\n",
"\n",
"\n",
"get_weather_tool = FunctionTool(get_weather, description=\"Get the weather for a city\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, agents that use tools are defined in the following manner. \n",
"\n",
"- An agent is instantiated based on the `ToolUseAssistantAgent` class in AgentChat. The agent is aware of the tools it can use by passing a `tools_schema` attribute to the class, which is passed to the `model_client` when the agent generates a response.\n",
"- An agent Team is defined that takes a list of `tools`. Effectively, the `ToolUseAssistantAgent` can generate messages that call tools, and the team is responsible for executing those tool calls and returning the results."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:50:13.202461]:\u001b[0m\n",
"\n",
"What's the weather in New York?\n",
"From: user\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:50:14.090696], Weather_Assistant:\u001b[0m\n",
"\n",
"[FunctionCall(id='call_wqkaIBdYjWklWG0GQkYz7FZ0', arguments='{\"city\":\"New York\"}', name='get_weather')]\n",
"From: Weather_Assistant\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:50:14.092050], tool_agent_for_Weather_Assistant:\u001b[0m\n",
"\n",
"[FunctionExecutionResult(content='The weather in New York is 72 degrees and Sunny.', call_id='call_wqkaIBdYjWklWG0GQkYz7FZ0')]\n",
"From: tool_agent_for_Weather_Assistant\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:50:14.714470], Weather_Assistant:\u001b[0m\n",
"\n",
"The weather in New York is 72 degrees and sunny. \n",
"\n",
"TERMINATE\n",
"From: Weather_Assistant"
]
}
],
"source": [
"assistant = ToolUseAssistantAgent(\n",
" \"Weather_Assistant\",\n",
" model_client=OpenAIChatCompletionClient(model=\"gpt-4o-mini\"),\n",
" registered_tools=[get_weather_tool],\n",
")\n",
"team = RoundRobinGroupChat([assistant])\n",
"result = await team.run(\"What's the weather in New York?\")\n",
"# print(result)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Using Langchain Tools \n",
"\n",
"AutoGen also provides direct support for tools from LangChain via the `autogen_ext` package.\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"# pip install langchain langchain-community wikipedia autogen-ext\n",
"\n",
"from autogen_ext.tools.langchain import LangChainToolAdapter\n",
"from langchain.tools import WikipediaQueryRun\n",
"from langchain_community.utilities import WikipediaAPIWrapper\n",
"\n",
"api_wrapper = WikipediaAPIWrapper(top_k_results=1, doc_content_chars_max=100)\n",
"tool = WikipediaQueryRun(api_wrapper=api_wrapper)\n",
"\n",
"langchain_wikipedia_tool = LangChainToolAdapter(tool)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:51:36.869317]:\u001b[0m\n",
"\n",
"Who is the recipient of the 2023 Nobel Prize in Physics?\n",
"From: user"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:51:37.856066], WikiPedia_Assistant:\u001b[0m\n",
"\n",
"[FunctionCall(id='call_bdLqS1msbHCy5IMGYaata5vs', arguments='{\"query\":\"2023 Nobel Prize in Physics\"}', name='wikipedia')]\n",
"From: WikiPedia_Assistant\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:51:38.518288], tool_agent_for_WikiPedia_Assistant:\u001b[0m\n",
"\n",
"[FunctionExecutionResult(content='Page: Nobel Prize in Physics\\nSummary: The Nobel Prize in Physics (Swedish: Nobelpriset i fysik) is a', call_id='call_bdLqS1msbHCy5IMGYaata5vs')]\n",
"From: tool_agent_for_WikiPedia_Assistant\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:51:39.070911], WikiPedia_Assistant:\u001b[0m\n",
"\n",
"[FunctionCall(id='call_BFXGGeuBbOQ1LPb4f0NiNva2', arguments='{\"query\":\"2023 Nobel Prize in Physics recipients\"}', name='wikipedia')]\n",
"From: WikiPedia_Assistant\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:51:39.727147], tool_agent_for_WikiPedia_Assistant:\u001b[0m\n",
"\n",
"[FunctionExecutionResult(content='Page: Nobel Prize in Physics\\nSummary: The Nobel Prize in Physics (Swedish: Nobelpriset i fysik) is a', call_id='call_BFXGGeuBbOQ1LPb4f0NiNva2')]\n",
"From: tool_agent_for_WikiPedia_Assistant\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:51:40.746467], WikiPedia_Assistant:\u001b[0m\n",
"\n",
"[FunctionCall(id='call_iH2gkY5A2LiQTiy2eh86XpP5', arguments='{\"query\": \"2023 Nobel Prize in Physics winners\"}', name='wikipedia'), FunctionCall(id='call_rJXgJQiAKoD7yrymNJCsQA9N', arguments='{\"query\": \"Nobel Prize in Physics\"}', name='wikipedia')]\n",
"From: WikiPedia_Assistant\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:51:41.469348], tool_agent_for_WikiPedia_Assistant:\u001b[0m\n",
"\n",
"[FunctionExecutionResult(content='Page: Nobel Prize in Physics\\nSummary: The Nobel Prize in Physics (Swedish: Nobelpriset i fysik) is a', call_id='call_iH2gkY5A2LiQTiy2eh86XpP5'), FunctionExecutionResult(content='Page: Nobel Prize in Physics\\nSummary: The Nobel Prize in Physics (Swedish: Nobelpriset i fysik) is a', call_id='call_rJXgJQiAKoD7yrymNJCsQA9N')]\n",
"From: tool_agent_for_WikiPedia_Assistant\n",
"--------------------------------------------------------------------------- \n",
"\u001b[91m[2024-10-08T09:51:42.576718], WikiPedia_Assistant:\u001b[0m\n",
"\n",
"I couldn't find specific information about the recipients of the 2023 Nobel Prize in Physics. You might want to check a reliable news source or the official Nobel Prize website for the most accurate and up-to-date details. \n",
"\n",
"TERMINATE\n",
"From: WikiPedia_Assistant"
]
}
],
"source": [
"wikipedia_assistant = ToolUseAssistantAgent(\n",
" \"WikiPedia_Assistant\",\n",
" model_client=OpenAIChatCompletionClient(model=\"gpt-4o-mini\"),\n",
" registered_tools=[langchain_wikipedia_tool],\n",
")\n",
"team = RoundRobinGroupChat([wikipedia_assistant])\n",
"result = await team.run(\"Who was the first president of the United States?\")\n",
"\n",
"# print(result)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.6"
}
},
"nbformat": 4,
"nbformat_minor": 2
}


@ -2,21 +2,59 @@
myst:
html_meta:
"description lang=en": |
User Guide for AgentChat, a high-level API for AutoGen
---
# AgentChat
AgentChat is a high-level package for building multi-agent applications, built on top of the [`autogen-core`](../core-user-guide/index.md) package. For beginner users, AgentChat is the recommended starting point. For advanced users, [`autogen-core`](../core-user-guide/index.md) provides more flexibility and control over the underlying components.
AgentChat aims to provide intuitive defaults, such as **Agents** with preset behaviors and **Teams** with predefined communication protocols, to simplify building multi-agent applications.
```{tip}
If you are interested in implementing complex agent interaction behaviors, defining custom messaging protocols, or orchestration mechanisms, consider using the [`autogen-core`](../core-user-guide/index.md) package.
```
## Agents
Agents provide presets for how an agent might respond to received messages. The following Agents are currently supported:
- `CodingAssistantAgent` - Generates responses using an LLM on receipt of a message
- `CodeExecutorAgent` - Extracts and executes code snippets found in received messages and returns the output
- `ToolUseAssistantAgent` - Responds with tool call messages based on received messages and a list of tool schemas provided at initialization
## Teams
Teams define how groups of agents communicate to address tasks. The following Teams are currently supported:
- `RoundRobinGroupChat` - A team where agents take turns sending messages (in a round robin fashion) until a termination condition is met
- `SelectorGroupChat` - A team where a model is used to select the next agent to send a message based on the current conversation history.
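The round-robin turn-taking described above can be sketched in plain Python. This is an illustrative toy only — `EchoAgent` and the loop below are not part of the AgentChat API, and a real `RoundRobinGroupChat` stops on a termination condition rather than a fixed turn budget:

```python
from itertools import cycle


class EchoAgent:
    """Toy stand-in for an agent: replies by echoing its name and the last message."""

    def __init__(self, name: str):
        self.name = name

    def respond(self, message: str) -> str:
        return f"{self.name} saw: {message}"


def round_robin(agents, task: str, max_turns: int = 4):
    """Pass the conversation around the agents in fixed order for max_turns turns."""
    transcript = [task]
    for agent in cycle(agents):
        if len(transcript) > max_turns:
            break
        # Each agent responds to the most recent message in the transcript.
        transcript.append(agent.respond(transcript[-1]))
    return transcript


log = round_robin([EchoAgent("planner"), EchoAgent("executor")], "start", max_turns=4)
```

After this runs, `log` holds the task followed by four responses, alternating between the two agents in the order they were registered.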
```{toctree}
:caption: Getting Started
:maxdepth: 2
:hidden:
quickstart
```
```{toctree}
:caption: Guides
:maxdepth: 2
:hidden:
guides/code-execution
guides/tool_use
guides/selector-group-chat
```
```{toctree}
:caption: Examples
:maxdepth: 3
:hidden:
examples/index
```


@ -0,0 +1,45 @@
---
myst:
html_meta:
"description lang=en": |
Quick Start Guide for AgentChat: Migrating from AutoGen 0.2x to 0.4x.
---
# Quick Start
The AgentChat API, introduced in AutoGen 0.4x, offers a similar level of abstraction to the default agent classes in AutoGen 0.2x.
## Installation
Install the `autogen-agentchat` package using pip:
```bash
pip install autogen-agentchat==0.4.0dev0
```
:::{note}
For further installation instructions, please refer to the [package information](pkg-info-autogen-agentchat).
:::
## Creating a Simple Agent Team
The following example illustrates creating a simple agent team with two agents that interact to solve a task.
1. `CodingAssistantAgent` that generates responses using an LLM.
2. `CodeExecutorAgent` that executes code snippets and returns the output.
The task is to "Create a plot of NVIDIA and TESLA stock returns YTD from 2024-01-01 and save it to 'nvidia_tesla_2024_ytd.png'."
```{include} stocksnippet.md
```
```{tip}
AgentChat in v0.4x provides similar abstractions to the default agents in v0.2x. The `CodingAssistantAgent` and `CodeExecutorAgent` in v0.4x are equivalent to the `AssistantAgent` and `UserProxyAgent` with code execution in v0.2x.
```
If you are migrating your code from AutoGen 0.2x to 0.4x, here are some key differences to consider:
1. In v0.4x, agent interactions are managed by `Teams` (e.g., `RoundRobinGroupChat`), replacing direct chat initiation.
2. v0.4x uses async/await syntax for improved performance and scalability.
3. Configuration in v0.4x is more modular, with separate components for code execution and LLM clients.
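Regarding the async/await point: because v0.4x team methods are coroutines, a call such as `team.run(...)` must be awaited inside an event loop (in a script, via `asyncio.run`). A minimal sketch of the pattern — the `fake_run` coroutine below is a stand-in for a team's `run`, not an AutoGen API:

```python
import asyncio


async def fake_run(task: str) -> str:
    # Stand-in for `await team.run(task)` in v0.4x.
    await asyncio.sleep(0)  # yield control, as real model calls would
    return f"completed: {task}"


# In a plain script, an event loop is needed to drive the coroutine;
# in a notebook, `await fake_run(...)` can be used directly.
result = asyncio.run(fake_run("plot stock returns"))
print(result)
```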


@ -0,0 +1,40 @@
``````{tab-set}
`````{tab-item} AgentChat (v0.4x)
```python
from autogen_agentchat.agents import CodeExecutorAgent, CodingAssistantAgent
from autogen_agentchat.teams.group_chat import RoundRobinGroupChat
from autogen_core.components.code_executor import DockerCommandLineCodeExecutor
from autogen_core.components.models import OpenAIChatCompletionClient
async with DockerCommandLineCodeExecutor(work_dir="coding") as code_executor:
code_executor_agent = CodeExecutorAgent("code_executor", code_executor=code_executor)
coding_assistant_agent = CodingAssistantAgent(
"coding_assistant", model_client=OpenAIChatCompletionClient(model="gpt-4")
)
group_chat = RoundRobinGroupChat([coding_assistant_agent, code_executor_agent])
result = await group_chat.run(
task="Create a plot of NVIDIA and TESLA stock returns YTD from 2024-01-01 and save it to 'nvidia_tesla_2024_ytd.png'."
)
print(result)
```
`````
`````{tab-item} v0.2x
```python
from autogen import AssistantAgent, UserProxyAgent, config_list_from_json
config_list = config_list_from_json(env_or_file="OAI_CONFIG_LIST")
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
code_executor_agent = UserProxyAgent(
"code_executor_agent",
code_execution_config={"work_dir": "coding", "use_docker": True}
)
code_executor_agent.initiate_chat(
assistant,
message="Create a plot of NVIDIA and TESLA stock returns YTD from 2024-01-01 and save it to 'nvidia_tesla_2024_ytd.png'."
)
```
`````
``````


@ -0,0 +1,21 @@
<svg width="586" height="390" viewBox="0 0 586 390" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M578 1H8C4.13401 1 1 4.13401 1 8.00001V382C1 385.866 4.13402 389 8.00001 389H578C581.866 389 585 385.866 585 382V8C585 4.13401 581.866 1 578 1Z" fill="white"/>
<path d="M85.5 50.5H30.5C26.3643 50.5 23 47.1357 23 43C23 38.8643 26.3643 35.5 30.5 35.5H85.5C89.6357 35.5 93 38.8643 93 43C93 47.1357 89.6357 50.5 85.5 50.5Z" fill="#D8D8D8"/>
<path d="M125.5 114.5H30.5C26.3643 114.5 23 111.136 23 107C23 102.864 26.3643 99.5 30.5 99.5H125.5C129.636 99.5 133 102.864 133 107C133 111.136 129.636 114.5 125.5 114.5Z" fill="#D8D8D8"/>
<path d="M177 142.5H29C25.6914 142.5 23 139.809 23 136.5C23 133.191 25.6914 130.5 29 130.5H177C180.309 130.5 183 133.191 183 136.5C183 139.809 180.309 142.5 177 142.5Z" fill="#D8D8D8"/>
<path d="M177 160.5H29C25.6914 160.5 23 157.809 23 154.5C23 151.191 25.6914 148.5 29 148.5H177C180.309 148.5 183 151.191 183 154.5C183 157.809 180.309 160.5 177 160.5Z" fill="#D8D8D8"/>
<path d="M178.5 340.5H27.5C25.0185 340.5 23 338.481 23 336C23 333.519 25.0185 331.5 27.5 331.5H178.5C180.981 331.5 183 333.519 183 336C183 338.481 180.981 340.5 178.5 340.5Z" fill="#D8D8D8"/>
<path d="M178.5 358.5H27.5C25.0185 358.5 23 356.481 23 354C23 351.519 25.0185 349.5 27.5 349.5H178.5C180.981 349.5 183 351.519 183 354C183 356.481 180.981 358.5 178.5 358.5Z" fill="#D8D8D8"/>
<path d="M464.5 46.5H439.5C435.364 46.5 432 43.1357 432 39C432 34.8643 435.364 31.5 439.5 31.5H464.5C468.636 31.5 472 34.8643 472 39C472 43.1357 468.636 46.5 464.5 46.5Z" fill="#D8D8D8"/>
<path d="M55.5 189.5H30.5C26.3643 189.5 23 186.136 23 182C23 177.864 26.3643 174.5 30.5 174.5H55.5C59.6357 174.5 63 177.864 63 182C63 186.136 59.6357 189.5 55.5 189.5Z" fill="#D8D8D8"/>
<path d="M509.5 46.5H484.5C480.364 46.5 477 43.1357 477 39C477 34.8643 480.364 31.5 484.5 31.5H509.5C513.636 31.5 517 34.8643 517 39C517 43.1357 513.636 46.5 509.5 46.5Z" fill="#D8D8D8"/>
<path d="M554.5 46.5H529.5C525.364 46.5 522 43.1357 522 39C522 34.8643 525.364 31.5 529.5 31.5H554.5C558.636 31.5 562 34.8643 562 39C562 43.1357 558.636 46.5 554.5 46.5Z" fill="#D8D8D8"/>
<path d="M553.217 261.5H260.783C255.94 261.5 252 257.56 252 252.717V86.2832C252 81.4404 255.94 77.5 260.783 77.5H553.217C558.06 77.5 562 81.4404 562 86.2832V252.717C562 257.56 558.06 261.5 553.217 261.5Z" fill="#007BFF"/>
<path d="M103 325.5C95.0049 325.5 88.5 318.995 88.5 311C88.5 303.005 95.0049 296.5 103 296.5C110.995 296.5 117.5 303.005 117.5 311C117.5 318.995 110.995 325.5 103 325.5Z" fill="#D8D8D8"/>
<path d="M368.5 340.5H217.5C215.019 340.5 213 338.481 213 336C213 333.519 215.019 331.5 217.5 331.5H368.5C370.981 331.5 373 333.519 373 336C373 338.481 370.981 340.5 368.5 340.5Z" fill="#D8D8D8"/>
<path d="M368.5 358.5H217.5C215.019 358.5 213 356.481 213 354C213 351.519 215.019 349.5 217.5 349.5H368.5C370.981 349.5 373 351.519 373 354C373 356.481 370.981 358.5 368.5 358.5Z" fill="#D8D8D8"/>
<path d="M293 325.5C285.005 325.5 278.5 318.995 278.5 311C278.5 303.005 285.005 296.5 293 296.5C300.995 296.5 307.5 303.005 307.5 311C307.5 318.995 300.995 325.5 293 325.5Z" fill="#D8D8D8"/>
<path d="M558.5 340.5H407.5C405.019 340.5 403 338.481 403 336C403 333.519 405.019 331.5 407.5 331.5H558.5C560.981 331.5 563 333.519 563 336C563 338.481 560.981 340.5 558.5 340.5Z" fill="#D8D8D8"/>
<path d="M558.5 358.5H407.5C405.019 358.5 403 356.481 403 354C403 351.519 405.019 349.5 407.5 349.5H558.5C560.981 349.5 563 351.519 563 354C563 356.481 560.981 358.5 558.5 358.5Z" fill="#D8D8D8"/>
<path d="M483 325.5C475.005 325.5 468.5 318.995 468.5 311C468.5 303.005 475.005 296.5 483 296.5C490.995 296.5 497.5 303.005 497.5 311C497.5 318.995 490.995 325.5 483 325.5Z" fill="#D8D8D8"/>
</svg>
