autogen/notebook/agentchat_oai_code_interpreter.ipynb


{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Auto Generated Agent Chat: GPTAssistant with Code Interpreter\n",
"OpenAI's recently released Assistants API lets users build AI assistants within their own applications. The Assistants API currently supports three types of tools: Code Interpreter, Retrieval, and Function calling. In this notebook, we demonstrate how to enable `GPTAssistantAgent` to use the Code Interpreter tool.\n",
"\n",
"## Requirements\n",
"\n",
"AutoGen requires `Python>=3.8`. To run this notebook example, please install:\n",
"```bash\n",
"pip install pyautogen\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Set your API Endpoint\n",
"\n",
"The [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"import autogen\n",
"\n",
"config_list = autogen.config_list_from_json(\n",
" \"OAI_CONFIG_LIST\",\n",
" file_location=\".\",\n",
" filter_dict={\n",
" \"model\": [\"gpt-3.5-turbo\", \"gpt-35-turbo\", \"gpt-4\", \"gpt4\", \"gpt-4-32k\", \"gpt-4-turbo\"],\n",
" },\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"It first looks for the environment variable \"OAI_CONFIG_LIST\", which needs to be a valid JSON string. If that variable is not found, it then looks for a JSON file named \"OAI_CONFIG_LIST\". In either case, it filters the configs by model (you can filter by other keys as well).\n",
"\n",
"The config list looks like the following:\n",
"```python\n",
"config_list = [\n",
" {\n",
" \"model\": \"gpt-4\",\n",
" \"api_key\": \"<your OpenAI API key>\",\n",
" }, # OpenAI API endpoint for gpt-4\n",
"]\n",
"```\n",
"\n",
"Azure OpenAI does not currently support the Assistants API. You can set the value of `config_list` in any way you prefer. Please refer to this [notebook](https://github.com/microsoft/autogen/blob/main/notebook/oai_openai_utils.ipynb) for full code examples of the different methods."
]
},
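{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you prefer not to use a config file, the list can also be built inline. A minimal sketch, assuming your key is exported in the environment variable `OPENAI_API_KEY`:\n",
"```python\n",
"import os\n",
"\n",
"config_list = [\n",
"    {\n",
"        \"model\": \"gpt-4\",\n",
"        \"api_key\": os.environ[\"OPENAI_API_KEY\"],  # assumes this variable is set\n",
"    },\n",
"]\n",
"```"
]
},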
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Perform Tasks Using Code Interpreter\n",
"\n",
"We demonstrate task solving with a `GPTAssistantAgent` that has the Code Interpreter tool enabled. Pass `code_interpreter` in the `tools` parameter to enable it. The agent writes code and automatically executes it in a sandbox; it then receives the results from the sandbox environment and acts accordingly.\n",
"\n",
"### Example 1: Math Problem Solving\n",
"In this example, we demonstrate how to use code interpreter to solve math problems.\n"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33muser_proxy\u001b[0m (to Coder Assistant):\n",
"\n",
"If $725x + 727y = 1500$ and $729x+ 731y = 1508$, what is the value of $x - y$ ?\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mCoder Assistant\u001b[0m (to user_proxy):\n",
"\n",
"The value of \\( x - y \\) is \\(-48\\). \n",
"\n",
"TERMINATE\n",
"\n",
"\n",
"--------------------------------------------------------------------------------\n"
]
}
],
"source": [
"from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent\n",
"from autogen.agentchat import AssistantAgent, UserProxyAgent\n",
"\n",
"# Create an agent equipped with the code interpreter\n",
"gpt_assistant = GPTAssistantAgent(\n",
" name=\"Coder Assistant\",\n",
" llm_config={\n",
" \"tools\": [\n",
" {\n",
" \"type\": \"code_interpreter\"\n",
" }\n",
" ],\n",
" \"config_list\": config_list,\n",
" },\n",
" instructions=\"You are an expert at solving math questions. Write code and run it to solve math problems. Reply TERMINATE when the task is solved and there is no problem.\",\n",
")\n",
"\n",
"user_proxy = UserProxyAgent(\n",
" name=\"user_proxy\",\n",
" is_termination_msg=lambda msg: \"TERMINATE\" in msg[\"content\"],\n",
" code_execution_config={\n",
" \"work_dir\": \"coding\",\n",
" \"use_docker\": False, # set to True or image name like \"python:3\" to use docker\n",
" },\n",
" human_input_mode=\"NEVER\"\n",
")\n",
"\n",
"# When all is set, initiate the chat.\n",
"user_proxy.initiate_chat(gpt_assistant, message=\"If $725x + 727y = 1500$ and $729x+ 731y = 1508$, what is the value of $x - y$ ?\")\n"
]
},
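{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick hand check of the assistant's answer: subtracting the first equation from the second gives $4x + 4y = 8$, so $x + y = 2$. Substituting into the first equation, $725x + 727y = 725(x + y) + 2y = 1450 + 2y = 1500$, hence $y = 25$, $x = -23$, and $x - y = -48$, matching the reply above."
]
},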
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Example 2: Plotting with Code Interpreter\n",
"\n",
"Code Interpreter can also produce output files, such as generated images. In this example, we demonstrate how to draw a figure and download it."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33muser_proxy\u001b[0m (to Coder Assistant):\n",
"\n",
"Draw a line chart to show the population trend in US. Show how you solved it with code.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mCoder Assistant\u001b[0m (to user_proxy):\n",
"\n",
"To create a line chart to show the population trend in the United States, we would need population data over a series of years. This data would generally come from sources like the United States Census Bureau or other governmental databases.\n",
"\n",
"Since we don't have internet access to fetch the latest data, I'll use a hypothetical set of data to demonstrate how you would create such a chart using Python and matplotlib, a common plotting library.\n",
"\n",
"First, I'll create some dummy data representing the US population from 1960 to 2020 at 10-year intervals. Then, I'll use this data to plot the line chart. Let's proceed with writing the code to create the line chart.\n",
"\n",
"\n",
"Recieved file id=file-TZbG4mSETWOeDqpidQsJCYVs\n",
"\n",
"Here's the line chart showing the hypothetical US population trend from 1960 to 2020 at 10-year intervals. The chart features markers at each data point and includes a grid for better readability.\n",
"\n",
"If you need the actual data or further detailed information in the chart, they'd typically be sourced from official databases or reputable statistical publications with access to the internet to fetch such data. \n",
"\n",
"If everything looks good and there are no further requests, please let me know.\n",
"\n",
"TERMINATE\n",
"\n",
"\n",
"--------------------------------------------------------------------------------\n"
]
}
],
"source": [
"gpt_assistant = GPTAssistantAgent(\n",
" name=\"Coder Assistant\",\n",
" llm_config={\n",
" \"tools\": [\n",
" {\n",
" \"type\": \"code_interpreter\"\n",
" }\n",
" ],\n",
" \"config_list\": config_list,\n",
" },\n",
" instructions=\"You are an expert at writing python code to solve problems. Reply TERMINATE when the task is solved and there is no problem.\",\n",
")\n",
"\n",
"user_proxy.initiate_chat(gpt_assistant, message=\"Draw a line chart to show the population trend in US. Show how you solved it with code.\", clear_history=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now that we have the file id, we can download and display the image."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=1707x1101>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from PIL import Image\n",
"import io\n",
"from IPython.display import display\n",
"\n",
"api_response = gpt_assistant._openai_client.files.with_raw_response.retrieve_content(\"file-TZbG4mSETWOeDqpidQsJCYVs\")\n",
"\n",
"if api_response.status_code == 200:\n",
" content = api_response.content\n",
" image_data_bytes = io.BytesIO(content)\n",
" image = Image.open(image_data_bytes)\n",
" display(image)"
]
},
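{
"cell_type": "markdown",
"metadata": {},
"source": [
"To keep the figure, you can also write the retrieved bytes straight to disk instead of only displaying them. A minimal sketch, reusing `api_response` from the previous cell and an assumed output path `us_population_trend.png`:\n",
"```python\n",
"if api_response.status_code == 200:\n",
"    with open(\"us_population_trend.png\", \"wb\") as f:\n",
"        f.write(api_response.content)\n",
"```"
]
},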
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
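,
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Each `GPTAssistantAgent` creates (or reuses) an assistant on the OpenAI side. When you are done experimenting, you can remove it; a sketch, assuming the `delete_assistant` helper exposed by `GPTAssistantAgent`:\n",
"```python\n",
"gpt_assistant.delete_assistant()\n",
"```"
]
}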
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
}