autogen/notebook/agentchat_oai_code_interpreter.ipynb


{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Auto Generated Agent Chat: GPTAssistant with Code Interpreter\n",
"The recently released Assistants API from OpenAI allows users to build AI assistants within their own applications. The Assistants API currently supports three types of tools: Code Interpreter, Retrieval, and Function calling. In this notebook, we demonstrate how to enable `GPTAssistantAgent` to use the Code Interpreter.\n",
"\n",
"## Requirements\n",
"\n",
"AutoGen requires `Python>=3.8`. To run this notebook example, please install:\n",
"```bash\n",
"pip install pyautogen\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Set your API Endpoint\n",
"\n",
"The [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import io\n",
"\n",
"from IPython.display import display\n",
"from PIL import Image\n",
"\n",
"import autogen\n",
"from autogen.agentchat import AssistantAgent, UserProxyAgent\n",
"from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent\n",
"\n",
"config_list = autogen.config_list_from_json(\n",
" \"OAI_CONFIG_LIST\",\n",
" file_location=\".\",\n",
" filter_dict={\n",
" \"model\": [\"gpt-3.5-turbo\", \"gpt-35-turbo\", \"gpt-4\", \"gpt4\", \"gpt-4-32k\", \"gpt-4-turbo\"],\n",
" },\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"It first looks for the environment variable \"OAI_CONFIG_LIST\", which needs to be a valid JSON string. If that variable is not found, it then looks for a JSON file named \"OAI_CONFIG_LIST\". It filters the configs by model (you can filter by other keys as well).\n",
"\n",
"The config list looks like the following:\n",
"```python\n",
"config_list = [\n",
" {\n",
" \"model\": \"gpt-4\",\n",
" \"api_key\": \"<your OpenAI API key>\",\n",
" }, # OpenAI API endpoint for gpt-4\n",
"]\n",
"```\n",
"\n",
"Azure OpenAI does not currently support the Assistants API. You can set the value of config_list in any way you prefer. Please refer to this [notebook](https://github.com/microsoft/autogen/blob/main/website/docs/llm_endpoint_configuration.ipynb) for full code examples of the different methods."
]
},
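{
"cell_type": "markdown",
"metadata": {},
"source": [
"For example, the same configuration can be supplied through the `OAI_CONFIG_LIST` environment variable instead of a file. A minimal sketch (the model entry and placeholder key are illustrative):\n",
"```python\n",
"import os\n",
"\n",
"# Must be a valid JSON string; config_list_from_json checks this variable first\n",
"os.environ[\"OAI_CONFIG_LIST\"] = '[{\"model\": \"gpt-4\", \"api_key\": \"<your OpenAI API key>\"}]'\n",
"```"
]
},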
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Perform Tasks Using Code Interpreter\n",
"\n",
"We demonstrate task solving using `GPTAssistantAgent` with the Code Interpreter. Pass `code_interpreter` in the `tools` parameter to enable `GPTAssistantAgent` to use it. The agent will write code, automatically execute it in a sandbox, receive the results from the sandbox environment, and act accordingly.\n",
"\n",
"### Example 1: Math Problem Solving\n",
"In this example, we demonstrate how to use the Code Interpreter to solve math problems.\n"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"OpenAI client config of GPTAssistantAgent(Coder Assistant) - model: gpt-4-turbo\n",
"Matching assistant found, using the first matching assistant: {'id': 'asst_xBMxObFj0TzDex04NAKbBCmP', 'created_at': 1710321320, 'description': None, 'file_ids': [], 'instructions': 'You are an expert at solving math questions. Write code and run it to solve math problems. Reply TERMINATE when the task is solved and there is no problem.', 'metadata': {}, 'model': 'gpt-4-turbo', 'name': 'Coder Assistant', 'object': 'assistant', 'tools': [ToolCodeInterpreter(type='code_interpreter')]}\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33muser_proxy\u001b[0m (to Coder Assistant):\n",
"\n",
"If $725x + 727y = 1500$ and $729x+ 731y = 1508$, what is the value of $x - y$ ?\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mCoder Assistant\u001b[0m (to user_proxy):\n",
"\n",
"The value of \\( x - y \\) is \\(-48\\).\n",
"\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33muser_proxy\u001b[0m (to Coder Assistant):\n",
"\n",
"\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mCoder Assistant\u001b[0m (to user_proxy):\n",
"\n",
"It seems you have no further inquiries. If you have more questions in the future, feel free to ask. Goodbye!\n",
"\n",
"TERMINATE\n",
"\n",
"\n",
"--------------------------------------------------------------------------------\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Permanently deleting assistant...\n"
]
}
],
"source": [
"# Create an agent equipped with the Code Interpreter\n",
"gpt_assistant = GPTAssistantAgent(\n",
" name=\"Coder Assistant\",\n",
" llm_config={\n",
" \"config_list\": config_list,\n",
" },\n",
" assistant_config={\n",
" \"tools\": [{\"type\": \"code_interpreter\"}],\n",
" },\n",
" instructions=\"You are an expert at solving math questions. Write code and run it to solve math problems. Reply TERMINATE when the task is solved and there is no problem.\",\n",
")\n",
"\n",
"user_proxy = UserProxyAgent(\n",
" name=\"user_proxy\",\n",
" is_termination_msg=lambda msg: \"TERMINATE\" in msg[\"content\"],\n",
" code_execution_config={\n",
" \"work_dir\": \"coding\",\n",
" \"use_docker\": False, # Please set use_docker=True if docker is available to run the generated code. Using docker is safer than running the generated code directly.\n",
" },\n",
" human_input_mode=\"NEVER\",\n",
")\n",
"\n",
"# When all is set, initiate the chat.\n",
"user_proxy.initiate_chat(\n",
" gpt_assistant, message=\"If $725x + 727y = 1500$ and $729x+ 731y = 1508$, what is the value of $x - y$ ?\"\n",
")\n",
"gpt_assistant.delete_assistant()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Example 2: Plotting with Code Interpreter\n",
"\n",
"The Code Interpreter can output files, such as generated images. In this example, we demonstrate how to draw a figure and download it."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"OpenAI client config of GPTAssistantAgent(Coder Assistant) - model: gpt-4-turbo\n",
"No matching assistant found, creating a new assistant\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33muser_proxy\u001b[0m (to Coder Assistant):\n",
"\n",
"Draw a line chart to show the population trend in US. Show how you solved it with code.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mCoder Assistant\u001b[0m (to user_proxy):\n",
"\n",
"To draw a line chart showing the population trend in the US, we first need to obtain the data that contains the population figures over a range of years. As I don't have access to the internet in this environment, I cannot download the data directly. However, if you can provide the data, I can proceed to create a line chart for you.\n",
"\n",
"For the purpose of this demonstration, let's assume we have some hypothetical US population data for a few years. I'll generate some sample data and create a line chart using the `matplotlib` library in Python.\n",
"\n",
"Here's how we can do it:\n",
"\n",
"\n",
"Received file id=assistant-tvLtfOn6uAJ9kxmnxgK2OXID\n",
"\n",
"Here is a line chart that illustrates the hypothetical US population trend from 2010 to 2020. The data used here is for demonstration purposes only. If you have actual population data, you can provide it, and I will update the chart accordingly.\n",
"\n",
"TERMINATE\n",
"\n",
"\n",
"--------------------------------------------------------------------------------\n"
]
},
{
"data": {
"text/plain": [
"ChatResult(chat_id=None, chat_history=[{'content': 'Draw a line chart to show the population trend in US. Show how you solved it with code.', 'role': 'assistant'}, {'content': \"To draw a line chart showing the population trend in the US, we first need to obtain the data that contains the population figures over a range of years. As I don't have access to the internet in this environment, I cannot download the data directly. However, if you can provide the data, I can proceed to create a line chart for you.\\n\\nFor the purpose of this demonstration, let's assume we have some hypothetical US population data for a few years. I'll generate some sample data and create a line chart using the `matplotlib` library in Python.\\n\\nHere's how we can do it:\\n\\n\\nReceived file id=assistant-tvLtfOn6uAJ9kxmnxgK2OXID\\n\\nHere is a line chart that illustrates the hypothetical US population trend from 2010 to 2020. The data used here is for demonstration purposes only. If you have actual population data, you can provide it, and I will update the chart accordingly.\\n\\nTERMINATE\\n\", 'role': 'user'}], summary=\"To draw a line chart showing the population trend in the US, we first need to obtain the data that contains the population figures over a range of years. As I don't have access to the internet in this environment, I cannot download the data directly. However, if you can provide the data, I can proceed to create a line chart for you.\\n\\nFor the purpose of this demonstration, let's assume we have some hypothetical US population data for a few years. I'll generate some sample data and create a line chart using the `matplotlib` library in Python.\\n\\nHere's how we can do it:\\n\\n\\nReceived file id=assistant-tvLtfOn6uAJ9kxmnxgK2OXID\\n\\nHere is a line chart that illustrates the hypothetical US population trend from 2010 to 2020. The data used here is for demonstration purposes only. If you have actual population data, you can provide it, and I will update the chart accordingly.\\n\\n\\n\", cost=({'total_cost': 0}, {'total_cost': 0}), human_input=[])"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"gpt_assistant = GPTAssistantAgent(\n",
" name=\"Coder Assistant\",\n",
" llm_config={\n",
" \"config_list\": config_list,\n",
" },\n",
" assistant_config={\n",
" \"tools\": [{\"type\": \"code_interpreter\"}],\n",
" },\n",
" instructions=\"You are an expert at writing python code to solve problems. Reply TERMINATE when the task is solved and there is no problem.\",\n",
")\n",
"\n",
"user_proxy.initiate_chat(\n",
" gpt_assistant,\n",
" message=\"Draw a line chart to show the population trend in US. Show how you solved it with code.\",\n",
" is_termination_msg=lambda msg: \"TERMINATE\" in msg[\"content\"],\n",
" human_input_mode=\"NEVER\",\n",
" clear_history=True,\n",
" max_consecutive_auto_reply=1,\n",
")"
]
},
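{
"cell_type": "markdown",
"metadata": {},
"source": [
"The code the assistant generates runs inside OpenAI's sandbox and is not echoed back in the transcript above. As a rough sketch (with hypothetical, approximate population figures), the chart it describes could be produced locally with `matplotlib` like this:\n",
"```python\n",
"import matplotlib.pyplot as plt\n",
"\n",
"# Hypothetical US population figures (in millions), for illustration only\n",
"years = list(range(2010, 2021))\n",
"population = [309.3, 311.6, 313.9, 316.1, 318.4, 320.7, 323.1, 325.1, 326.8, 328.2, 331.5]\n",
"\n",
"plt.plot(years, population, marker=\"o\")\n",
"plt.title(\"US Population Trend (hypothetical data)\")\n",
"plt.xlabel(\"Year\")\n",
"plt.ylabel(\"Population (millions)\")\n",
"plt.savefig(\"us_population_trend.png\")\n",
"```"
]
},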
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now that we have the file id, we can download and display the image."
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=1979x980>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"api_response = gpt_assistant.openai_client.files.with_raw_response.retrieve_content(\n",
" \"assistant-tvLtfOn6uAJ9kxmnxgK2OXID\"\n",
")\n",
"\n",
"if api_response.status_code == 200:\n",
" content = api_response.content\n",
" image_data_bytes = io.BytesIO(content)\n",
" image = Image.open(image_data_bytes)\n",
" display(image)"
]
},
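{
"cell_type": "markdown",
"metadata": {},
"source": [
"Instead of displaying the image inline, you can also write the retrieved bytes to disk (a minimal sketch; the output file name is arbitrary):\n",
"```python\n",
"if api_response.status_code == 200:\n",
"    with open(\"us_population_trend.png\", \"wb\") as f:\n",
"        f.write(api_response.content)\n",
"```"
]
},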
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Permanently deleting assistant...\n"
]
}
],
"source": [
"gpt_assistant.delete_assistant()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
}
},
"nbformat": 4,
"nbformat_minor": 2
}