Add RAG gptassistant example notebook (#694)
* Add RAG gptassistant example notebook

* add notebook links

* Update notebook md

---------

Co-authored-by: Qingyun Wu <qingyun.wu@psu.edu>
This commit is contained in:
parent
4e3723dd54
commit
fe0092516b
149  notebook/agentchat_oai_assistant_retrieval.ipynb  Normal file
@@ -0,0 +1,149 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## RAG OpenAI Assistants in AutoGen\n",
    "\n",
    "This notebook shows an example of the [`GPTAssistantAgent`](https://github.com/microsoft/autogen/blob/main/autogen/agentchat/contrib/gpt_assistant_agent.py#L16C43-L16C43) with retrieval-augmented generation. `GPTAssistantAgent` is an experimental AutoGen agent class that leverages the [OpenAI Assistant API](https://platform.openai.com/docs/assistants/overview) for conversational capabilities, working with\n",
    "`UserProxyAgent` in AutoGen."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "assistant_id was None, creating a new assistant\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u001b[33muser_proxy\u001b[0m (to assistant):\n",
      "\n",
      "What is the name of the class of agents I gave you?\n",
      "\n",
      "--------------------------------------------------------------------------------\n",
      "\u001b[33massistant\u001b[0m (to user_proxy):\n",
      "\n",
      "The class of agents provided in the file is called `ConversableAgent`.\n",
      "\n",
      "\n",
      "--------------------------------------------------------------------------------\n",
      "\u001b[33muser_proxy\u001b[0m (to assistant):\n",
      "\n",
      "What does it do?\n",
      "\n",
      "--------------------------------------------------------------------------------\n",
      "\u001b[33massistant\u001b[0m (to user_proxy):\n",
      "\n",
      "The `ConversableAgent` class is designed as a generic conversable agent that can be configured to act either as an assistant or user proxy. When it receives a message, it automatically generates and sends a reply unless the message is a termination message. It features a method to initiate a chat with another agent and can have its auto-reply behavior adjusted by overriding the `generate_reply` method.\n",
      "\n",
      "Here are some specific functionalities and mechanisms included in the `ConversableAgent` class:\n",
      "\n",
      "- It can process received messages and decide whether or not a reply is necessary or requested.\n",
      "- It can reset consecutive auto-reply counters and clear chat history when starting a new conversation.\n",
      "- It allows initiating chats either synchronously or asynchronously with the ability to pass context information and control chattiness.\n",
      "- It has the ability to register reply functions with specific triggers and order them based on preference.\n",
      "- The class supports setting a maximum number of consecutive auto-replies, after which it can prompt for human input based on configured criteria (e.g., always, on termination, or never).\n",
      "- The auto-reply trigger mechanism can be finely controlled by associating reply functions with triggers such as instances of particular classes, specific strings, or custom callable conditions.\n",
      "\n",
      "This class provides an extensible framework for creating bots or agents that can interact in a chat-like context, with custom behavior that developers can tailor to specific applications.\n",
      "\n",
      "\n",
      "--------------------------------------------------------------------------------\n",
      "\u001b[31m\n",
      ">>>>>>>> NO HUMAN INPUT RECEIVED.\u001b[0m\n",
      "\u001b[31m\n",
      ">>>>>>>> USING AUTO REPLY...\u001b[0m\n",
      "\u001b[33muser_proxy\u001b[0m (to assistant):\n",
      "\n",
      "\n",
      "\n",
      "--------------------------------------------------------------------------------\n",
      "\u001b[33massistant\u001b[0m (to user_proxy):\n",
      "\n",
      "It seems that your request was incomplete. Could you please provide more information or clarify your request?\n",
      "\n",
      "\n",
      "--------------------------------------------------------------------------------\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "Permanently deleting assistant...\n"
     ]
    }
   ],
   "source": [
    "import logging\n",
    "import os\n",
    "\n",
    "from autogen import config_list_from_json\n",
    "from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent\n",
    "from autogen import UserProxyAgent\n",
    "\n",
    "logger = logging.getLogger(__name__)\n",
    "logger.setLevel(logging.WARNING)\n",
    "\n",
    "assistant_id = os.environ.get(\"ASSISTANT_ID\", None)\n",
    "\n",
    "config_list = config_list_from_json(\"OAI_CONFIG_LIST\")\n",
    "llm_config = {\n",
    "    \"config_list\": config_list,\n",
    "    \"assistant_id\": assistant_id,\n",
    "    \"tools\": [\n",
    "        {\n",
    "            \"type\": \"retrieval\"\n",
    "        }\n",
    "    ],\n",
    "    \"file_ids\": [\"file-CmlT0YKLB3ZCdHmslF9FOv69\"]\n",
    "    # add the id of an existing file from your OpenAI account\n",
    "    # in this case I added the implementation of conversable_agent.py\n",
    "}\n",
    "\n",
    "gpt_assistant = GPTAssistantAgent(name=\"assistant\",\n",
    "                                  instructions=\"You are adept at question answering\",\n",
    "                                  llm_config=llm_config)\n",
    "\n",
    "user_proxy = UserProxyAgent(name=\"user_proxy\",\n",
    "                            code_execution_config=False,\n",
    "                            is_termination_msg=lambda msg: \"TERMINATE\" in msg[\"content\"],\n",
    "                            human_input_mode=\"ALWAYS\")\n",
    "user_proxy.initiate_chat(gpt_assistant, message=\"What is the name of the class of agents I gave you?\")\n",
    "\n",
    "gpt_assistant.delete_assistant()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
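The hard-coded `file_ids` entry in the notebook's code cell points at a file that was uploaded to the author's OpenAI account ahead of time (the implementation of `conversable_agent.py`). A minimal sketch of how such an id could be obtained and wired into the retrieval-enabled `llm_config`, assuming the v1 `openai` Python SDK (the local path and variable names are illustrative, not part of this commit):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload the document the assistant should retrieve from; the Assistants API
# retrieval tool expects purpose="assistants".
uploaded = client.files.create(
    file=open("conversable_agent.py", "rb"),
    purpose="assistants",
)

llm_config = {
    "config_list": config_list,        # built with config_list_from_json as in the notebook
    "tools": [{"type": "retrieval"}],
    "file_ids": [uploaded.id],         # e.g. "file-..."
}
```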
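The assistant's answer in the notebook output mentions that `ConversableAgent` lets you register reply functions with specific triggers and order them by preference. A minimal, hedged sketch of that mechanism using AutoGen's `register_reply` (the agent name and reply function below are illustrative, not part of this notebook):

```python
from autogen import ConversableAgent

# An agent with no LLM backend; its replies come only from registered functions.
echo_agent = ConversableAgent(
    name="echo_agent",
    llm_config=False,
    human_input_mode="NEVER",
)

# A reply function returns (is_final, reply); this one simply echoes the last message.
def echo_reply(recipient, messages=None, sender=None, config=None):
    last = messages[-1]["content"] if messages else ""
    return True, f"Echo: {last}"

# Trigger on any ConversableAgent sender; position=0 places it ahead of the default replies.
echo_agent.register_reply(ConversableAgent, echo_reply, position=0)
```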
@@ -44,4 +44,5 @@ Links to notebook examples:
- Hello-World Chat with OpenAI Assistant in AutoGen - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_assistant_twoagents_basic.ipynb)
- Chat with OpenAI Assistant using Function Call - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_assistant_function_call.ipynb)
- Chat with OpenAI Assistant with Code Interpreter - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_code_interpreter.ipynb)
- Chat with OpenAI Assistant with Retrieval Augmentation - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_assistant_retrieval.ipynb)
- OpenAI Assistant in a Group Chat - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_assistant_groupchat.ipynb)
@@ -118,6 +118,7 @@ The figure below shows six examples of applications built using AutoGen.
- Hello-World Chat with OpenAI Assistant in AutoGen - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_assistant_twoagents_basic.ipynb)
- Chat with OpenAI Assistant using Function Call - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_assistant_function_call.ipynb)
- Chat with OpenAI Assistant with Code Interpreter - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_code_interpreter.ipynb)
- Chat with OpenAI Assistant with Retrieval Augmentation - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_assistant_retrieval.ipynb)
- OpenAI Assistant in a Group Chat - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_oai_assistant_groupchat.ipynb)

## For Further Reading