"<a href=\"https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/agentchat_function_call.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "9a71fa36",
"metadata": {},
"source": [
"# Auto Generated Agent Chat: Task Solving with Provided Tools as Functions\n",
"\n",
"AutoGen offers conversable agents powered by LLM, tool, or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation. Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
"\n",
"In this notebook, we demonstrate how to use `AssistantAgent` and `UserProxyAgent` to make function calls with the new feature of OpenAI models (in model version 0613). A specified prompt and function configs must be passed to `AssistantAgent` to initialize the agent. The corresponding functions must be passed to `UserProxyAgent`, which will execute any function calls made by `AssistantAgent`. Besides this requirement of matching descriptions with functions, we recommend checking the system message in the `AssistantAgent` to ensure the instructions align with the function call descriptions.\n",
"\n",
"## Requirements\n",
"\n",
"AutoGen requires `Python>=3.8`. To run this notebook example, please install the [mathchat] option since we will import functions from `MathUserProxyAgent`:\n",
"```bash\n",
"pip install \"pyautogen[mathchat]\"\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "2b803c17",
"metadata": {},
"outputs": [],
"source": [
"# %pip install \"pyautogen[mathchat]~=0.1.0\""
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "5ebd2397",
"metadata": {},
"source": [
"## Set your API Endpoint\n",
"\n",
"The [`config_list_from_models`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_models) function tries to create a list of configurations using Azure OpenAI endpoints and OpenAI endpoints for the provided list of models. It assumes the api keys and api bases are stored in the corresponding environment variables or local txt files:\n",
"\n",
"- OpenAI API key: os.environ[\"OPENAI_API_KEY\"] or `openai_api_key_file=\"key_openai.txt\"`.\n",
"- Azure OpenAI API key: os.environ[\"AZURE_OPENAI_API_KEY\"] or `aoai_api_key_file=\"key_aoai.txt\"`. Multiple keys can be stored, one per line.\n",
"- Azure OpenAI API base: os.environ[\"AZURE_OPENAI_API_BASE\"] or `aoai_api_base_file=\"base_aoai.txt\"`. Multiple bases can be stored, one per line.\n",
"\n",
"It's OK to have only the OpenAI API key, or only the Azure OpenAI API key + base.\n",
"If you open this notebook in google colab, you can upload your files by clicking the file icon on the left panel and then choosing \"upload file\" icon.\n",
"\n",
"The following code excludes Azure OpenAI endpoints from the config list because some endpoints don't support functions yet. Remove the `exclude` argument if they do."
" }, # OpenAI API endpoint for gpt-3.5-turbo-16k\n",
"]\n",
"```\n"
]
},
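{
"cell_type": "code",
"execution_count": null,
"id": "dca301a4",
"metadata": {},
"outputs": [],
"source": [
"# A sketch of the call described above; adjust model_list to the models you can access.\n",
"# exclude=\"aoai\" drops Azure OpenAI endpoints from the resulting config list.\n",
"import autogen\n",
"\n",
"config_list = autogen.config_list_from_models(\n",
"    model_list=[\"gpt-4\", \"gpt-3.5-turbo\", \"gpt-3.5-turbo-16k\"], exclude=\"aoai\"\n",
")"
]
},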
{
"attachments": {},
"cell_type": "markdown",
"id": "2b9526e7",
"metadata": {},
"source": [
"## Making Async and Sync Function Calls\n",
"\n",
"In this example, we demonstrate function call execution with `AssistantAgent` and `UserProxyAgent`. With the default system prompt of `AssistantAgent`, we allow the LLM assistant to perform tasks with code, and the `UserProxyAgent` would extract code blocks from the LLM response and execute them. With the new \"function_call\" feature, we define functions and specify the description of the function in the OpenAI config for the `AssistantAgent`. Then we register the functions in `UserProxyAgent`."