autogen/notebook/agentchat_function_call_async.ipynb
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "ae1f50ec",
"metadata": {},
"source": [
"<a href=\"https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/agentchat_function_call.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "9a71fa36",
"metadata": {},
"source": [
"# Auto Generated Agent Chat: Task Solving with Provided Tools as Functions (Asynchronous Function Calls)\n",
"\n",
"AutoGen offers conversable agents powered by LLM, tool, or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation. Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
"\n",
"In this notebook, we demonstrate how to use `AssistantAgent` and `UserProxyAgent` to make function calls with the new feature of OpenAI models (in model version 0613). A specified prompt and function configs must be passed to `AssistantAgent` to initialize the agent. The corresponding functions must be passed to `UserProxyAgent`, which will execute any function calls made by `AssistantAgent`. Besides this requirement of matching descriptions with functions, we recommend checking the system message in the `AssistantAgent` to ensure the instructions align with the function call descriptions.\n",
"\n",
"## Requirements\n",
"\n",
"AutoGen requires `Python>=3.8`. To run this notebook example, please install `pyautogen`:\n",
"```bash\n",
"pip install pyautogen\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "2b803c17",
"metadata": {},
"outputs": [],
"source": [
"# %pip install \"pyautogen>=0.2.6\""
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "5ebd2397",
"metadata": {},
"source": [
"## Set your API Endpoint\n",
"\n",
"The [`config_list_from_models`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_models) function tries to create a list of configurations using Azure OpenAI endpoints and OpenAI endpoints for the provided list of models. It assumes the api keys and api bases are stored in the corresponding environment variables or local txt files:\n",
"\n",
"- OpenAI API key: os.environ[\"OPENAI_API_KEY\"] or `openai_api_key_file=\"key_openai.txt\"`.\n",
"- Azure OpenAI API key: os.environ[\"AZURE_OPENAI_API_KEY\"] or `aoai_api_key_file=\"key_aoai.txt\"`. Multiple keys can be stored, one per line.\n",
"- Azure OpenAI API base: os.environ[\"AZURE_OPENAI_API_BASE\"] or `aoai_api_base_file=\"base_aoai.txt\"`. Multiple bases can be stored, one per line.\n",
"\n",
"It's OK to have only the OpenAI API key, or only the Azure OpenAI API key + base.\n",
"If you open this notebook in google colab, you can upload your files by clicking the file icon on the left panel and then choosing \"upload file\" icon.\n",
"\n",
"The following code excludes Azure OpenAI endpoints from the config list because some endpoints don't support functions yet. Remove the `exclude` argument if they do."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "dca301a4",
"metadata": {},
"outputs": [],
"source": [
"import time\n",
"\n",
"from typing_extensions import Annotated\n",
"\n",
"import autogen\n",
"\n",
"config_list = autogen.config_list_from_json(\n",
" \"OAI_CONFIG_LIST\",\n",
" file_location=\".\",\n",
" filter_dict={\n",
" \"model\": [\"gpt-4\"],\n",
" },\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "92fde41f",
"metadata": {},
"source": [
"The config list looks like the following:\n",
"```python\n",
"config_list = [\n",
" {\n",
" 'model': 'gpt-4',\n",
" 'api_key': '<your OpenAI API key here>',\n",
" }, # OpenAI API endpoint for gpt-4\n",
" {\n",
" 'model': 'gpt-3.5-turbo',\n",
" 'api_key': '<your OpenAI API key here>',\n",
" }, # OpenAI API endpoint for gpt-3.5-turbo\n",
" {\n",
" 'model': 'gpt-3.5-turbo-16k',\n",
" 'api_key': '<your OpenAI API key here>',\n",
" }, # OpenAI API endpoint for gpt-3.5-turbo-16k\n",
"]\n",
"```\n"
]
},
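{
"attachments": {},
"cell_type": "markdown",
"id": "7d3b5f2a",
"metadata": {},
"source": [
"As a minimal sketch (not part of the original run), the same configuration can typically also be supplied through an `OAI_CONFIG_LIST` environment variable holding a JSON string, which `config_list_from_json` checks before falling back to the file:\n",
"\n",
"```python\n",
"import json\n",
"import os\n",
"\n",
"# Placeholder key for illustration only.\n",
"os.environ[\"OAI_CONFIG_LIST\"] = json.dumps(\n",
" [{\"model\": \"gpt-4\", \"api_key\": \"<your OpenAI API key here>\"}]\n",
")\n",
"```"
]
},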
{
"attachments": {},
"cell_type": "markdown",
"id": "2b9526e7",
"metadata": {},
"source": [
"## Making Async and Sync Function Calls\n",
"\n",
"In this example, we demonstrate function call execution with `AssistantAgent` and `UserProxyAgent`. With the default system prompt of `AssistantAgent`, we allow the LLM assistant to perform tasks with code, and the `UserProxyAgent` would extract code blocks from the LLM response and execute them. With the new \"function_call\" feature, we define functions and specify the description of the function in the OpenAI config for the `AssistantAgent`. Then we register the functions in `UserProxyAgent`."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "9fb85afb",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33muser_proxy\u001b[0m (to chatbot):\n",
"\n",
"Create a timer for 5 seconds and then a stopwatch for 5 seconds.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mchatbot\u001b[0m (to user_proxy):\n",
"\n",
"\u001b[32m***** Suggested tool Call (call_fGgH8U261nOnx3JGNJWslhh6): timer *****\u001b[0m\n",
"Arguments: \n",
"{\"num_seconds\":\"5\"}\n",
"\u001b[32m**********************************************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[35m\n",
">>>>>>>> EXECUTING ASYNC FUNCTION timer...\u001b[0m\n",
"\u001b[33muser_proxy\u001b[0m (to chatbot):\n",
"\n",
"\u001b[33muser_proxy\u001b[0m (to chatbot):\n",
"\n",
"\u001b[32m***** Response from calling tool \"timer\" *****\u001b[0m\n",
"Timer is done!\n",
"\u001b[32m**********************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mchatbot\u001b[0m (to user_proxy):\n",
"\n",
"\u001b[32m***** Suggested tool Call (call_BZs6ynF8gtcZKhONiIRZkECB): stopwatch *****\u001b[0m\n",
"Arguments: \n",
"{\"num_seconds\":\"5\"}\n",
"\u001b[32m**************************************************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[35m\n",
">>>>>>>> EXECUTING ASYNC FUNCTION stopwatch...\u001b[0m\n",
"\u001b[33muser_proxy\u001b[0m (to chatbot):\n",
"\n",
"\u001b[33muser_proxy\u001b[0m (to chatbot):\n",
"\n",
"\u001b[32m***** Response from calling tool \"stopwatch\" *****\u001b[0m\n",
"Stopwatch is done!\n",
"\u001b[32m**************************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mchatbot\u001b[0m (to user_proxy):\n",
"\n",
"TERMINATE\n",
"\n",
"--------------------------------------------------------------------------------\n"
]
}
],
"source": [
"llm_config = {\n",
" \"config_list\": config_list,\n",
"}\n",
"\n",
"coder = autogen.AssistantAgent(\n",
" name=\"chatbot\",\n",
" system_message=\"For coding tasks, only use the functions you have been provided with. You have a stopwatch and a timer, these tools can and should be used in parallel. Reply TERMINATE when the task is done.\",\n",
" llm_config=llm_config,\n",
")\n",
"\n",
"# create a UserProxyAgent instance named \"user_proxy\"\n",
"user_proxy = autogen.UserProxyAgent(\n",
" name=\"user_proxy\",\n",
" system_message=\"A proxy for the user for executing code.\",\n",
" is_termination_msg=lambda x: x.get(\"content\", \"\") and x.get(\"content\", \"\").rstrip().endswith(\"TERMINATE\"),\n",
" human_input_mode=\"NEVER\",\n",
" max_consecutive_auto_reply=10,\n",
" code_execution_config={\"work_dir\": \"coding\"},\n",
")\n",
"\n",
"# define functions according to the function description\n",
"\n",
"# An example async function\n",
"\n",
"\n",
"@user_proxy.register_for_execution()\n",
"@coder.register_for_llm(description=\"create a timer for N seconds\")\n",
"async def timer(num_seconds: Annotated[str, \"Number of seconds in the timer.\"]) -> str:\n",
" for i in range(int(num_seconds)):\n",
" time.sleep(1)\n",
" # should print to stdout\n",
" return \"Timer is done!\"\n",
"\n",
"\n",
"# An example sync function\n",
"@user_proxy.register_for_execution()\n",
"@coder.register_for_llm(description=\"create a stopwatch for N seconds\")\n",
"def stopwatch(num_seconds: Annotated[str, \"Number of seconds in the stopwatch.\"]) -> str:\n",
" for i in range(int(num_seconds)):\n",
" time.sleep(1)\n",
" return \"Stopwatch is done!\"\n",
"\n",
"\n",
"# start the conversation\n",
"# 'await' is used to pause and resume code execution for async IO operations.\n",
"# Without 'await', an async function returns a coroutine object but doesn't execute the function.\n",
"# With 'await', the async function is executed and the current function is paused until the awaited function returns a result.\n",
"await user_proxy.a_initiate_chat( # noqa: F704\n",
" coder,\n",
" message=\"Create a timer for 5 seconds and then a stopwatch for 5 seconds.\",\n",
")"
]
},
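{
"attachments": {},
"cell_type": "markdown",
"id": "4c8e91b7",
"metadata": {},
"source": [
"As a quick aside (a minimal sketch using only the standard library, not an AutoGen API): calling an async function without `await` only creates a coroutine object, and `await` is what actually runs it. Jupyter supports top-level `await`, which is why the cell above can await `a_initiate_chat` directly.\n",
"\n",
"```python\n",
"import asyncio\n",
"\n",
"\n",
"async def example() -> str:\n",
" await asyncio.sleep(0)\n",
" return \"ran\"\n",
"\n",
"\n",
"coro = example() # a coroutine object; nothing has executed yet\n",
"print(type(coro)) # <class 'coroutine'>\n",
"print(await coro) # awaiting the coroutine runs it and yields \"ran\"\n",
"```"
]
},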
{
"cell_type": "markdown",
"id": "950f3de7",
"metadata": {},
"source": [
"# Async Function Call with Group Chat\n",
"Sync and async can be used in topologies beyond two agents. Below, we show this feature for a group chat."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "2472f95c",
"metadata": {},
"outputs": [],
"source": [
"markdownagent = autogen.AssistantAgent(\n",
" name=\"Markdown_agent\",\n",
" system_message=\"Respond in markdown only\",\n",
" llm_config=llm_config,\n",
")\n",
"\n",
"# Add a function for robust group chat termination\n",
"\n",
"\n",
"@user_proxy.register_for_execution()\n",
"@markdownagent.register_for_llm()\n",
"@coder.register_for_llm(description=\"terminate the group chat\")\n",
"def terminate_group_chat(message: Annotated[str, \"Message to be sent to the group chat.\"]) -> str:\n",
" return f\"[GROUPCHAT_TERMINATE] {message}\"\n",
"\n",
"\n",
"groupchat = autogen.GroupChat(agents=[user_proxy, coder, markdownagent], messages=[], max_round=12)\n",
"\n",
"llm_config_manager = llm_config.copy()\n",
"llm_config_manager.pop(\"functions\", None)\n",
"\n",
"manager = autogen.GroupChatManager(\n",
" groupchat=groupchat,\n",
" llm_config=llm_config_manager,\n",
" is_termination_msg=lambda x: \"GROUPCHAT_TERMINATE\" in x.get(\"content\", \"\"),\n",
")"
]
},
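{
"attachments": {},
"cell_type": "markdown",
"id": "9e5f0c3d",
"metadata": {},
"source": [
"The termination handshake above works because `terminate_group_chat` returns a message tagged with `[GROUPCHAT_TERMINATE]`, and the manager's `is_termination_msg` predicate looks for that tag. A minimal sketch of the predicate applied to hypothetical message dicts (illustration only):\n",
"\n",
"```python\n",
"is_term = lambda x: \"GROUPCHAT_TERMINATE\" in x.get(\"content\", \"\")\n",
"\n",
"print(is_term({\"content\": \"[GROUPCHAT_TERMINATE] all done\"})) # True\n",
"print(is_term({\"content\": \"still working on it\"})) # False\n",
"```"
]
},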
{
"cell_type": "code",
"execution_count": 4,
"id": "e2c9267a",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33muser_proxy\u001b[0m (to chat_manager):\n",
"\n",
"\n",
"1) Create a timer and a stopwatch for 5 seconds each in parallel.\n",
"2) Pretty print the result as md.\n",
"3) when 1 and 2 are done, terminate the group chat\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mchatbot\u001b[0m (to chat_manager):\n",
"\n",
"\u001b[32m***** Suggested tool Call (call_zlHKR9LBzCqs1iLId5kvNvJ5): timer *****\u001b[0m\n",
"Arguments: \n",
"{\"num_seconds\": \"5\"}\n",
"\u001b[32m**********************************************************************\u001b[0m\n",
"\u001b[32m***** Suggested tool Call (call_rH1dgbS9itiJO1Gwnxxhcm35): stopwatch *****\u001b[0m\n",
"Arguments: \n",
"{\"num_seconds\": \"5\"}\n",
"\u001b[32m**************************************************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[35m\n",
">>>>>>>> EXECUTING ASYNC FUNCTION timer...\u001b[0m\n",
"\u001b[35m\n",
">>>>>>>> EXECUTING ASYNC FUNCTION stopwatch...\u001b[0m\n",
"\u001b[33muser_proxy\u001b[0m (to chat_manager):\n",
"\n",
"\u001b[33muser_proxy\u001b[0m (to chat_manager):\n",
"\n",
"\u001b[32m***** Response from calling tool \"timer\" *****\u001b[0m\n",
"Timer is done!\n",
"\u001b[32m**********************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33muser_proxy\u001b[0m (to chat_manager):\n",
"\n",
"\u001b[32m***** Response from calling tool \"stopwatch\" *****\u001b[0m\n",
"Stopwatch is done!\n",
"\u001b[32m**************************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mMarkdown_agent\u001b[0m (to chat_manager):\n",
"\n",
"The results of the timer and stopwatch are as follows:\n",
"\n",
"- Timer: Timer is done!\n",
"- Stopwatch: Stopwatch is done!\n",
"\n",
"Now, I will proceed to terminate the group chat.\n",
"\u001b[32m***** Suggested tool Call (call_3Js7oU80vPatnA8IiaKXB5Xu): terminate_group_chat *****\u001b[0m\n",
"Arguments: \n",
"{\"message\":\"The session has concluded, and the group chat will now be terminated.\"}\n",
"\u001b[32m*************************************************************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[35m\n",
">>>>>>>> EXECUTING ASYNC FUNCTION terminate_group_chat...\u001b[0m\n",
"\u001b[33muser_proxy\u001b[0m (to chat_manager):\n",
"\n",
"\u001b[33muser_proxy\u001b[0m (to chat_manager):\n",
"\n",
"\u001b[32m***** Response from calling tool \"terminate_group_chat\" *****\u001b[0m\n",
"[GROUPCHAT_TERMINATE] The session has concluded, and the group chat will now be terminated.\n",
"\u001b[32m*************************************************************\u001b[0m\n",
"\n",
"--------------------------------------------------------------------------------\n"
]
}
],
"source": [
"# todo: remove comment after fixing https://github.com/microsoft/autogen/issues/1205\n",
"await user_proxy.a_initiate_chat( # noqa: F704\n",
" manager,\n",
" message=\"\"\"\n",
"1) Create a timer and a stopwatch for 5 seconds each in parallel.\n",
"2) Pretty print the result as md.\n",
"3) when 1 and 2 are done, terminate the group chat\"\"\",\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6d074e51",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "flaml_dev",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.5"
}
},
"nbformat": 4,
"nbformat_minor": 5
}