{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"<a href=\"https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/agentchat_graph_modelling_language_using_select_speaker.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Auto Generated Agent Chat: Graph Modeling Language with using select_speaker\n",
"\n",
"AutoGen offers conversable agents powered by LLM, tool, or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation.\n",
"Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
"\n",
"This notebook is about using graphs to define the transition paths amongst speakers.\n",
"\n",
"Benefits\n",
"- This contribution fills the gap between the current modes of GroupChat Class (auto, manual, round_robin) and an expressive directed graph. See Motivation for more detailed discussion.\n",
"\n",
"\n",
"## Requirements\n",
"\n",
"AutoGen requires `Python>=3.8`. To run this notebook example, please install:\n",
"```bash\n",
"pip install pyautogen\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%capture --no-stderr\n",
"# %pip install \"pyautogen>=0.2.3\"\n",
"%pip install networkX~=3.2.1\n",
"%pip install matplotlib~=3.8.1"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"import random # noqa E402\n",
"\n",
"import matplotlib.pyplot as plt # noqa E402\n",
"import networkx as nx # noqa E402\n",
"\n",
"import autogen # noqa E402\n",
"from autogen.agentchat.assistant_agent import AssistantAgent # noqa E402\n",
"from autogen.agentchat.groupchat import GroupChat # noqa E402"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.2.3\n"
]
}
],
"source": [
"print(autogen.__version__)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Motivation\n",
"\n",
"\n",
"The current GroupChat class allows transition to any agent (without or without the decision of LLM), some use case might demand for more control over transition. A graph is a possible way to control the transition paths, where each node represents an agent and each directed edge represent possible transition path. Let's illustrate the current transition paths for a GroupChat with five agents."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAApQAAAHzCAYAAACe1o1DAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/SrBM8AAAACXBIWXMAAA9hAAAPYQGoP6dpAACxHklEQVR4nOzddVhU6dsH8O8MA1JSAgIKiKCCiR1rrLXmmgi2q2t3roGCjS1rd7OKiL3W2q3YgoqASIg0Qw0DDGfeP3zhZwCizPBM3J/r4nI4+UWYOfd5zjnPw5NKpVIQQgghhBDyk/isAxBCCCGEEOVGBSUhhBBCCCkVKigJIYQQQkipUEFJCCGEEEJKhQpKQgghhBBSKlRQEkIIIYSQUqGCkhBCCCGElAoVlIQQQgghpFSooCSEEEIIIaVCBSUhhBBCCCkVKigJIYQQQkipUEFJCCGEEEJKhQpKQgghhBBSKlRQEkIIIYSQUqGCkhBCCCGElAoVlIQQQgghpFSooCSEEEIIIaVCBSUhhBBCCCkVKigJIYQQQkipUEFJCCGEEEJKhQpKQgghhBBSKlRQEkIIIYSQUqGCkhBCCCGElAoVlIQQQgghpFSooCSEEEIIIaVCBSUhhBBCCCkVKigJIYQQQkipUEFJCCGEEEJKhQpKQgghhBBSKlRQEkIIIYSQUqGCkhBCCCGElAoVlIQQQgghpFSooCSEEEIIIaVCBSUhhBBCCCkVKigJIYQQQkipUEFJCCGEEEJKhQpKQgghhBBSKgLWAQghhBBStMxsCd4nZSJHwkFLwEeVCnrQK0eHb6JY6C+SEEIIUTAhcenweRCJa8HxiEwWQfrZPB4AGxNdtK1hjkFNbVCtYnlWMQkpwJNKpdLvL0aIfNEZOCGEAFHJIsw78RK3QhOhwechjyv6EJ0/v5WDKZb3rgNrE90yTErIl6igJMzQGTghhPzPkYBIeJ4OgoSTFltIfk2Dz4OAz8OiHrXQv7GNHBMSUjQqKEmZozNwQgj50qZrIVhz6W2ptzPzt+qY2LaaDBIR8mOooCRlis7ACSHkS0cCIjHn+EuZbW9lnzpwo89JUsaooCRlhs7ACSHkS1HJInRYfwPZEu6L6Tnx75H24BiyY0ORl5ECaa4Y/HJ60DKvAv26v0Gv1q9FbrOcgI/L09rQFR1SpqgfSlImjgREyqSYBIA1l97CNyBSJtsihBCW5p14CUkhV2ty4sORGXQdkqRoSLMzAS4PXFYaxBEvkHhmDVLvHS1ymxJOinknZNfiSUhJ0GO0RO6ikkXwPB1U5HypJBdpD08gM+gacoWx4Gtqo5x1LRj+0h/lLBwKXcfjdBBa2JvSGTghRGmFxKXjVmhiofM0dPShX68TylnXhoa+MThxBtIDTiL7wxsAQPqjMzBs7lrounmcFLdCExEanw4Hc3qgkZQNaqEkclfUGTgASLk8xPsthPDmAeQmRQF5ueDE6cgKuY/Yg7OQ9f5ZoevRGTghRNEdOXIEixYtQlJSUqHzfR5EQoPPK3Sejn1jVOgyCfq120KnijP0HFvC5LfxBfO5nKxi963B5+HQfbqSQ8oOFZRErvLPwIt6ACf9yb8QRzwHAGia2cKs9zwYtnD7NDMvF0n/ekMqyf1mvc/PwAkhRBHt2rULCxcuhLW1NWbPno2EhIQv5l8Lji/Rw4lSKQdJehLSn50vmKZtU6fYdfI4Ka69jf+54IT8BCooiVwVdwYOABlP//cBWaHzJOjWaAGj1kOgbdcAAJCXnghR6MNC1y3qDDw5ORnu7u7o2bMnOI4rZE1CCJE/Pv/TITYrKwtr166FjY0NZsyYgdjYWGRkSxCZLPruNj4emIHIlT3wYfOw//+85H1qvew65bvrRiaJkJktKe2PQUiJ0D2URK6KOwPPy0r/dJkbAPgCaFn+78ntcpWcIA5/AgDIjg6CnuMv367//2fgC1ELACAUCrF+/XqsWbMGItGnD+qcnBxoa2vL8kciakIqlYLjOOTl5X3xJZFIvnj99TKfT/v8NcdxBet+vc7n33++/OfTilrm6+W/nlbU1/fmcxwHqVRasFz+/8fX84t7XdR8AN8sV9T8/OlfT/v8K3/+16+L+ips/ufT8l//yL+Fvf78+/zf3bp167Bu3TroVa4B08Frf/wPk8cD+BpACTpokQJ4n5SJWlaGP74fQn4QFZREbr53Bi5JjSt4raFTHjy+xv++1/vfB6BEGIeiRCaJEPzuPXZt3YwtW7ZALBZ/0SoZGBgITU3Nb4qCwgqF4g7mhb3OP9iW5GBe2Pefr//1wfrzaYXN//xg/PX84g7mRU0r6vXXB/DCDujfO9h/70Be2PTPv89/Xdy0fN87yBf2Pfk5PN63Vx4+n5b/+mf+/fp1Yd9/Pj2/JZDH433xOv+Lz+cXum5h8/K//3z616+/Xubzfz9//fTpUyQmfvnQDZ/Ph6OjI7oPmwjflO//P1foPBGcOAOStERkPD2H7A+vkRVyH/HpSbD8Y/1318+R0FUaUjaooCRyE5GUieIO3dJc8f++0fjyT5HHFxS+3NfbAFCneTvkxocXOr9x48Yliar0vj64F3Zg//z1jx7UP59WkoN8SQ7kxS1T3MG8pAf4773W0NAodNrXy37+lT+/sOmfr//1tPzpX8/P/zd/WYFAUDBfIBAUrJ//WiAQfLOtr6flb+Pz+Z9Py3/9+b9fTyOy8fvvv+Ps2bMFv+Px48djzpw5qFixIoJiUuG78fZ3t6FlblfwWrdGc0T/PRBSSQ5yYkOQm/wBmiaVil9fQL9PUjaooCRy870zY57m/y5FS/O+fPBGykkKXa4wffu54dqxvYiLiwOPx/uiBWrWrFnQ0dEp9mD++bSvp3990P/89ecH7M+Lga8P6l8XA19/fX4w/3oaHdwJUV6GhoYQCAQYN24c5s6dC0tLy4J5VSrogQcUedLN5WaDr1mukDn/O0HkxBnF7p/3//shpCxQQUnk5ntnxgLDigWvuax0SLm8gsveeRn/uxYkMKr4zbqfmz9vDv7ZuBxXr16Fh4cH7t69W1BYzp8/HwYGBqX4KQgh5Ods2rQJ69evh5mZ2Tfz9MoJYGOii4gibguK3T8NWlY1oF25JjQMzMCJUpH+5F9IJdkAAJ6gHDQrWBe7f5sKutArR4d5UjboL43IzffOwDV0ykOzgvWnB3O4POR8fItylZwAANkxbwqWK1e5VpH7yD8D5/F4aN++Pdq3b48bN27A09MTz549Q7lyhZ3hE0KI/BkZGRU7v20Ncxx8EFHog4tcjhiZL/5D5ov/Cl3XuN0I8MsVPbCDBp+HttXNfygvIaVBBSWRm++dgQOAfv0uSLm8AwCQdH4jjFoNRnZcGMThTwEAGuVNoevQpMj1CzsDb9OmDa5fvw6pVFrogwOEEMJCTk4OUlNTkZqaio8fP+L95X+Rp9eq0GUNmvZGVuhD5CZGIU+UCkAKDf0KKFfJEeXrd4G2de1i95XHSTG4mY0cfgpCCkcFJZGr4s7AAaB8g27ICnkAccRz5CZGIuHE8v/N1NBEhW5Tw
RNoFrru987AqZgkhLB29uxZjBgxAmlpacjOzv5mfoMZNSAsZ468r245N2j4Owwa/v5T+9Tg89CiagUadpGUKbrjn8jVoKY2xY4EweNrwLzfQhi1HgpBhcqAhib42uWh49AUFkNWQ6eKc5Hr5nFSZDw/j9u3b+PZs2cICwtDXFwcMjMzqWsYQohCKF++PBISEgotJt3c3HBynhsEMn74TsDnYXnv4kfSIUTWeFI68hI5G7L7Ae6+SyrREGMlxQeQFfEMsYfnFzq/adOmuH//vsz2RwghP6tTp064dOlSwfc8Hg81a9bE06dPoampiSMBkZhz/KXM9reyTx24NabL3aRsUQslkbvlvetAUMzwiz9DU8CH78zeEAgKv2ujdu3i7y8ihBB5y8nJweDBg78oJoFPnZsfPHgQmpqfbufp39gGM3+rLpN9zvqtBhWThAkqKIncWZvoYlGPop/U/hmLe9RC6wY1sW/fvm/maWhoYPHixTLdHyGE/Ahvb28YGhrCx8cH9vb2GD58eEGn+XPmzEH
"text/plain": [
"<Figure size 640x480 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Create an empty directed graph\n",
"graph = nx.DiGraph()\n",
"\n",
"# Add 5 nodes to the graph using a for loop\n",
"for node_id in range(5):\n",
" graph.add_node(node_id, label=str(node_id))\n",
"\n",
"# Add edges between all nodes using a nested for loop\n",
"for source_node in range(5):\n",
" for target_node in range(5):\n",
" if source_node != target_node: # To avoid self-loops\n",
" graph.add_edge(source_node, target_node)\n",
"\n",
"nx.draw(graph, with_labels=True, font_weight=\"bold\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Possibly interesting transition paths"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAApQAAAHzCAYAAACe1o1DAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/SrBM8AAAACXBIWXMAAA9hAAAPYQGoP6dpAABbyUlEQVR4nO3deViU9f4+8JthU3BJUBRSEARJAXcC1FxYNAFBBMMdCZc0c+vY4jmZtlmd+FnSam65lYaZpphZoYkKKsrqhgsg4gDDgOzLMPz+qPzmccUZ+Mxyv67rXJcwM8/cdBRu3p/n+TwGjY2NjSAiIiIiekwS0QGIiIiISLuxUBIRERGRSlgoiYiIiEglLJREREREpBIWSiIiIiJSCQslEREREamEhZKIiIiIVMJCSUREREQqYaEkIiIiIpWwUBIRERGRSlgoiYiIiEglLJREREREpBIWSiIiIiJSCQslEREREamEhZKIiIiIVMJCSUREREQqYaEkIiIiIpWwUBIRERGRSlgoiYiIiEglLJREREREpBIWSiIiIiJSCQslEREREamEhZKIiIiIVMJCSUREREQqYaEkIiIiIpWwUBIRERGRSlgoiYiIiEglLJREREREpBIWSiIiIiJSCQslEREREamEhZKIiIiIVMJCSUREREQqYaEkIiIiIpWwUBIRERGRSlgoiYiIiEglLJREREREpBIWSiIiIiJSiZHoAERERER/q6xVILu4EnUKJUyMJOhuaQ5zU9YVTcf/h4iIiEiorIJybEvKRfzFQuTKq9D4j8cMANhamGGksxWmeNjCqXNbUTHpAQwaGxsbH/40IiIiIvW6Lq/Cst3pOHpZBkOJARqU968kfz/+jGNHvBfihm4WZi2YlB6GhZKIiIha3HencvHm3kwolI0PLJL/y1BiACOJAVYGuWCiu20zJqSmYKEkIiKiFvVpfBY++uWSysf516iemD/SSQ2JSFW8ypuIiIhazHenctVSJgHgo18uYcepXLUci1TDCSURERG1iOvyKviuPoJahfKuxxqqy1GWtAu1Ny6g7mYWGhW1AABzVx90DFx832OaGknw6+LhPKdSME4oiYiIqEUs250OxX3Ol2woK0JZYixqr2fcLpOPQqFsxLLd6eqKSI+JhZKIiIiaXVZBOY5elt3/AhxDI5h2c0U7zzCY9/F75OM2KBtx9LIMlwvL1ZSUHgcLJRERETW7bUm5MJQY3Pdxk4626DLlfXQYMQOm1k270MZQYoCtiTyXUiQWSiIiImp28RcLm7Q9UFM0KBsRf6mwWY5Nj4aFkoiIiJpVRa0CufKqZn2P3OIqVNYqmvU96P5YKImIiKhZ5RRXorm3lGkEkF1c2czvQvfDQklERETNqu4e2wRp8/vQ3VgoiYiIqFmZGLVM3Wip96G78b88ERERNavulua4//Xd6mHw1/uQGEaiAxAREZFuMzc1gq2FGXIecGGOsr4G1VdOAwDqCq7e/ryirBCVFxIAAKbWPWHU3uqer7e1NIO5KWuNKPwvT0RERM1upLMVtiTl3HfrIGXlLch+fP+uz9fmpqM298874Vj6L0KbPr53PcdQYoCRPe9dNKllcMmbiIiImt0UD9tm3YdyqqdtsxybHo1BY2Njc1/JT0RERIRp65Nw/GqxWoulocQAgx0ssSXKQ23HpKbjhJKIiIhahHe7QijqatV6TCOJAd4LcVPrManpWCiJiIioWTQ0NCApKQlvv/02OnXqhOefC0LXghNqfY+3glzQzcJMrcekpuOSNxEREalNY2MjNm/ejJ9++gm//PILysvLbz9mbGyMyspKfJWQjY9+uaTyey0d5YwXRzqqfBxSHa/yJiIiIrXJycnBjBkz7vnYm2++CWNjY8wf6YSObUzx5t5MKJSNTTqn0lBiACOJAd4KckG4Oy/E0RScUBIREZFa/fe//8Urr7xyx+cMDQ2Rl5eHLl263P7cdXkVlu1Ox9HLMhhKDB5YLP9+/BnHjngvxI3L3BqGhZKIiIjU6uLFi3Bzc0N9fT2AP8tkYGAgfvzxx3s+P6ugHNuSchF/qRC5xVX4ZzExwJ+blo/saYWpnrZwtGrb7Pmp6VgoiYiISG0SEhLg4+OD+vp6DBo0CKdPn0ZjYyMOHDiAZ5999qGvr6xVILu4EnUKJUyMJOhuac474GgBFkoiIiJSix07dmDy5MmQSCT4+eefMXToUIwaNQrXr19HVlYWDA0NRUekZsJCSURERCr76KOPsHTpUpiZmSEpKQmurq4AAKVSicrKSrRty6VqXcZCSURERCp56aWX8Omnn8LS0hIZGRl3XHhD+oEnJRAREdFjCwoKwk8//QQHBwekp6fDzIxXX+sjFkoiIiJqMoVCAU9PTyQnJ8PLywsJCQmQSHgDPn2l94WSV5MRERE1TXl5Odzc3JCTk4OwsDB8//33oiORYHrZnG7vd3WxELnye+x3ZWGGkc5WmOJhC6fOPImYiIjob3l5eejTpw9KSkqwZMkSREdHi45EGkCvLsrhjvxERESPLyUlBV5eXqipqcEnn3yCBQsWiI5EGkJvCuV3p3JVumfoyiAXTOQ9Q4mISE8dOHAAY8eORWNjI77//nuMHz9edCTSIHpRKD+Nz8JHv1xS+Tj/GtUT80c6qSERERGR9vj6668xZ84cmJiY4MiRI/Dw8BAdiTSMzl+O9d2pXLWUSQD46JdL2HEqVy3HIiIi0gZvvPEGZs+ejbZt2+LcuXMsk3RPOj2hvC6vgu/qI6hVKB/4vMLvV6D6yunbH9vM+gLGlt3u+VxTIwl+XTyc51QSEZHOmz59OrZs2QIbGxtkZmbiiSeeEB2JNJROTyiX7U6H4iHnS1Zkxt9RJh9GoWzEst3pqkYjIiLSWEqlEiNGjMCWLVvg6uqKa9eusUzSA+lsocwqKMfRy7IHXoDTUHULJb9+DcAAMHy0HZQalI04elmGy4XlakpKRESkOWpqauDi4oIjR45g1KhRSE1NhYmJiehYpOF0tlBuS8qFocTggc8p+e1rKKvL0KbfaBiaWzzysQ0lBtiayHMpiYhIt8hkMnTv3h0XLlxAVFQUDh48yLvf0CPR2b8l8RcLHzidrL6ajMrMwzBsY4EOIyKbdOwGZSPiLxWqGpGIiEhjXLp0Cfb29igoKMBbb72FdevWiY5EWkQn75RTUatArrzqvo8r66pR/PNnAACLUfMgaWXe5PfILa5CZa2Ct2kkIiKtl5CQAB8fH9TX12PTpk2IiIgQHYm0jE5OKHOKK/GgS3FKj2xGQ1khzJ4aCrOeno/1Ho0AsosrH+u1REREmmLHjh0YPnw4lEolDh06xDJJj0UnC2XdA7YJqi++jvIz+yFp1QYWfnOa7X2IiIg0XXR0NCZOnIhWrVrh7Nmz8PHxER2JtJROrteaGN2/JzdUlACNSihrKpAXM+2ez8n/ei6Mrexh83zMY78PERGRJluwYAFiYmJgaWmJ9PR0WFtbi45EWkwnC2V3S3MYAA9c9laVwV/vQ0REpG2Cg4Oxd+9eODg4ID09HWZmvFkHqUYnC
6W5qRFsLcyQc48Lc4w62KCDz6y7Pn/r2LdQ1lQAANp5TYBxR9sHvoetpRkvyCEiIq2iUCjg6emJ5ORkeHp64tixY9wWiNRCZxvRSGcrbEnKuWvrIKN2HdHOPfiu55ed2gP8VSjbuHrf99aLwJ/7UI7saaXewERERM2ovLwcbm5uyMnJQVhYGL7//nvRkUiH6OyvJVM8bB+4D6UqGpSNmOr54AkmERGRpsjLy4OdnR1ycnKwZMkSlklSO4PGxsbmPNVQqGnrk3D8arFai2VjgwL1NzIxuPYMnJycYG1tfft/Li4u6NChg9rei4iISFUpKSnw8vJCTU0NPvnkEyxYsEB0JNJBOl0or8ur4Lv6CGrVuL2PQUM98ta+AMWtAhgY/Hlrx7//E7q7u+PkyZNqey8iIiJVHDhwAGPHjkVjYyO+//57jB8/XnQk0lE6u+QNAN0szLAyyEWtx3w3tB+6Wfx5NVxjYyP+2cenTp2q1vciIiJ6XF9//TUCAgJgZGSE48ePs0xSs9LpCeXfPo3Pwke
"text/plain": [
"<Figure size 640x480 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Hub and Spoke\n",
"# Create an empty directed graph\n",
"graph = nx.DiGraph()\n",
"\n",
"# Add 5 nodes to the graph using a for loop\n",
"for node_id in range(5):\n",
" graph.add_node(node_id, label=str(node_id))\n",
"\n",
"# Add edges between all nodes using a nested for loop\n",
"for source_node in range(5):\n",
" target_node = 0\n",
" if source_node != target_node: # To avoid self-loops\n",
" graph.add_edge(source_node, target_node)\n",
" graph.add_edge(target_node, source_node)\n",
"\n",
"\n",
"nx.draw(graph, with_labels=True, font_weight=\"bold\")"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAApQAAAHzCAYAAACe1o1DAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/SrBM8AAAACXBIWXMAAA9hAAAPYQGoP6dpAACcd0lEQVR4nOzdd3RU1drH8e+U9BBCAgECCaELoXelSRe9CmJBRUQF1GtHUbGh6CuKKIoiFoqKqCheEFBEikjvnYQSWkJPI71Oef+IDMQkkMqE8Pus5XJyzj57PyfeC8/sarDb7XZERERERIrJ6OwAREREROTqpoRSREREREpECaWIiIiIlIgSShEREREpESWUIiIiIlIiSihFREREpESUUIqIiIhIiSihFBEREZESUUIpIiIiIiWihFJERERESkQJpYiIiIiUiBJKERERESkRJZQiIiIiUiJKKEVERESkRJRQioiIiEiJKKEUERERkRJRQikiIiIiJaKEUkRERERKRAmliIiIiJSIEkoRERERKREllCIiIiJSIkooRURERKRElFCKiIiISIkooRQRERGRElFCKSIiIiIlooRSREREREpECaWIiIiIlIgSShEREREpESWUIiIiIlIiSihFREREpESUUIqIiIhIiSihFBEREZESUUIpIiIiIiWihFJERERESkQJpYiIiIiUiBJKERERESkRJZQiIiIiUiJmZwcg+UvNtHAsLpUsiw1Xs5EQfy+83PSfS0RERMofZSjlSMTZZL7fFMXKA9FExadhv+ieAQj286RH4wCGdAymYfVKzgpTREREJBeD3W63X76YlKXj8Wm8Mn8Paw7FYjIasNoK/k9y/n7XBlUZf3tzgvw8r2CkIiIiInkpoXSyOVuieGNhGBab/ZKJ5L+ZjAbMRgPjbgvlnvbBZRihiIiIyKUpoXSiKSsj+GDpwRLXM7pvI57s0bAUIhIREREpOq3ydpI5W6JKJZkE+GDpQX7aElUqdYmIiIgUlXooneB4fBq9P1pFpsWGLSuDlJ1LSDu4gezYKGzZGZi8/XCtGoxnk254NemC3ZLNub+/IevUASxJMdgyUzGYXXHxq4Vnoxuo1H4AHh4eLB/VXXMqRURE5IpTQukEQ2dsYv2RONKjI4n55S0sCWcKLFvzoU8wevhwcuqDBZZxr9uawHv/jxvq+fPd8I5lELGIiIhIwbRt0BUWcTaZNYdisaYnE/3zG1iTYgAwefvh0/EOXKrVwZ6VTkbUXlL2LAfAYDLh2egG3Ou2xlw5AOx2UvetIXXvCgAyju4gI+Y4a2x2DkUn0yBAWwqJiIjIlaOE8gr7flMUJqOBc5vnOZJJg5sXNYZNwlypqqOcZ6PrqXz9XWA0YfKoRLVBr+Sqx6N+O9IjNmLLTAXAlpWOyWhg9sYo3rwt9Mq9kIiIiFzztCjnClt5IBqrzU7avjWOaz7tB+RKJs8zefli8sjb22jLSCF515+OZNLo6YtL1WCsNjsrD0aXXfAiIiIi+VAP5RWUkmkhKj4NW1Z6rnmTbrUL16N47u9vSNr4S65rLtVC8O//FEYXNwCi4tJIzbTomEYRERG5YtRDeQVFxqViB0fP4nnmSn7FrtNgcsFuszl+tgPH4lILfkBERESklCmhvIKyLDmJn9HNK9d1S3J8oZ6v1Ppmqg95j2qDXsUrtEdOnWciiP7pdawp5/K0IyIiInIlKKG8glzNOb9uo6sHZt8ajuuZJ8ML9by5cgDuQc3wbHQ9VW99HregZgDYszNIO7QpTzsiIiIiV4IyjysoxN8Lwz+fPZt0dVxP3vwrluS4POWtqQlY05OxZWdetm5bRgoAhn/aEREREblStHLjCvJyMxPs50lkfBo+HQaRGvY31n9Ovjkz63l8OtyOS7WQf/ah3EPKnuXUuO9dkrYtwpoSj2eDDph9a2C3Wkg7uIHM43sddbvWaABAsL+nFuSIiIjIFaXM4wrr0TiA7zZFgkclAu4e5zgpx5ocy7kV0/J/yGYj48g2Mo5sy/e2Z5OueIS0wmQ00KNRQBlGLyIiIpKXjl68wiLOJtPn49WOny+c5b2e7Njj2LLTMXlVwcU/CK+m3fFq2o2MyN2k7F5O1plDWNMSsFuyMHpUwjWgHl6hN+IVeiMGQ87sheWjuumkHBEREbmilFA6wd2fr2ZLZCJ2Q+lNYTUZDTrLW0RERJxCi3KugOzsbHbv3s3XX3/NjTfeyILX7sFmtZRa/Xa7HVt2Fg80dUXfD0RERORKUw9lGYmLi2Ps2LFs2LCBvXv3kp2d7bhnMBiYsng77686WXrtLZ5Myu5l+Pn50bt3b7p160aXLl1o1qwZJpOp1NoRERER+TcllGVk165dtG7dOt8ew/Hjx/Pyyy8zZWUEHyw9WOK2XujbmOnP3cXOnTsBMBqN2O127HY7vr6+bNq0iUaNGpW4HREREZH8aMi7jLRs2ZJXX301z/XKlSvz1FNPAfBkj4a8N6g5bmYjJqMhT9lLMRkNuJmNTBjUnCd6NOCTTz5x3LPZbNjtdgwGA56enlSrVq1kLyMiIiJyCUooy1BcXO7Nyo1GI6NHj8bb29tx7Z72wSwf1Z0b6vkDYLpMXnk+8byhnj/LR3VncPtgALp06UKLFi0wGC5UYDQa+f3336lSpUppvI6IiIhIvjTkXQaSkpLo0qULe/bsITg4mIyMDKKjo/Hy8uLEiRP4+vrm+1zE2WRGTZ3HjjNZuPgF5r5pt5OdcJphvdvxcLcG+W4NNGvWLIYNG5br2nXXXce2bdvw9PQsrdcTERERyUUJZSnbuHEjvXv3JjU1lXvvvZfZs2ezZcsWunTpwpgxY3j77bcLfDY6OpqgoCCysrL4/c/l1GnWjiyLDVezkfsH9GPn1k107tyZ1atXYzTm7VzOzMykVq1axMXF8cEHH7B3716++eYbfH192bp1K/Xr1y/LVxcREZFrlE7KKSaLxYLRaMyV2L377ru8+uqrGI1GvvnmG0dvYceOHYmMjKRGjRqXrO/OO+8kKysLgC0b1nJz316O+/HRpwFYt24dr732GuPHj89Th5ubG5999hkHDhzgueeew2Aw0KpVK0aNGkWTJk1YsGAB/fv3L5X3FxERETlPPZTF1L9/f5KSkli5ciVGo5G+ffuycuVKqlatyvr162nYsGGR6hs1ahSTJ092rArv3Lkza9euBSA2NjbPwprZs2czZMiQQtX9119/0b9/f7Kysnj33XcZM2ZMkWITERERuZRrPqFMzbRwLC7VMbQc4u+Fl9ulO27Xr19P586dARg6dChLliwhJiaGHj16sHTpUszmonX8/vDDD3mSQ7PZTEJCAl5eXixYsICBAwfmuu/i4sKaNWvo2LFwJ+NERUXRpk0b4uLiGDx4MHPmzClSjCIiIiIFuSYTyoizyXy/KYqVB6KJik/j4l+AAQj286RH4wCGdAymYfW8i19uuukmVqxYgcVy4bSb//u//8t3m6DLSUhIoGbNmmRkZOS5t2TJEvr168fo0aOZPHlyrvYgZ2ui83tPFkZGRgbXX389O
3fupHnz5mzevBl3d/cixywiIiJysWtq26Dj8WkMnbGJPh+v5rtNkUT+K5kEsAOR8Wl8tymSPh+vZuiMTRyPT3Pc37FjB3/++Weu5M7V1ZVBgwYVKyZvb2/eeOMNevfunWehzd9//w3A2rVrc7VXs2ZNXnjhBb766qsiteXu7s6OHTu477772LNnD7Vq1SIyMrJYcYuIiIicd830UM7ZEsUbC8Ow2OxYbYV/ZZPRgNloYNxtodzTPphbbrmFxYsX5yl33XXXsW/fvhLF6O/vj9Fo5OOPP2br1q0MHDiQ7t27M3v2bI4fP84NN9xAz5496dChAxs2bChRWx988AEvvvgiLi4uLF68mF69el3+IREREZF8XBMJZWkdcdi3RibTnr0DyDmP+/yvrlatWvTv358vv/wy3+18CsNiseDi4kLv3r1ZtmxZgeWCgoJISkoiMTGxWO1c7M8//+TWW2/FYrHw4YcfMmrUqBLXKSIiIteeCr9t0JwtUaWSTAIsPeOGd8u+tPROZcCAAbRp04bWrVv
"text/plain": [
"<Figure size 640x480 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Sequential Team Operations\n",
"# Create an empty directed graph\n",
"graph = nx.DiGraph()\n",
"\n",
"# Outer loop for prefixes 'A', 'B', 'C'\n",
"for prefix in [\"A\", \"B\", \"C\"]:\n",
" # Add 5 nodes with each prefix to the graph using a for loop\n",
" for i in range(5):\n",
" node_id = f\"{prefix}{i}\"\n",
" graph.add_node(node_id, label=node_id)\n",
"\n",
" # Add edges between nodes with the same prefix using a nested for loop\n",
" for source_node in range(5):\n",
" source_id = f\"{prefix}{source_node}\"\n",
" for target_node in range(5):\n",
" target_id = f\"{prefix}{target_node}\"\n",
" if source_node != target_node: # To avoid self-loops\n",
" graph.add_edge(source_id, target_id)\n",
"\n",
"graph.add_edge(\"A0\", \"B0\")\n",
"graph.add_edge(\"B0\", \"C0\")\n",
"\n",
"# Draw the graph\n",
"nx.draw(graph, with_labels=True, font_weight=\"bold\")"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAApQAAAHzCAYAAACe1o1DAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/SrBM8AAAACXBIWXMAAA9hAAAPYQGoP6dpAAA0Z0lEQVR4nO3dZ3SUdcKG8XtKioFMCkmACEGUIgIWkEWxiw3WFRERcZKQGECQogivWFZZLCAogoA0IyUFQUGQFZSmSBHpmBCQDqFHCErJQkhm3g8qNoLCk+SZcv3O2XMCk0xu3C9XJvP8H4vb7XYLAAAAuEhWswcAAADAuxGUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAABDCEoAAAAYQlACAADAEIISAAAAhhCUAAAAMISgBAAAgCEEJQAAAAwhKAEAAGAIQQkAAFCCQ4cO6bnnntOmTZvMnuLRCEoAAIASLFq0SIMGDVL9+vXVvn17wrIEBCUAAMBfcLvdmjZtGmFZAoISAADgbygqKpLb7daHH36oq666Svfdd5/uvvtuHT9+3OxpprObPQAAAMBbWK1WuVwuRUVFae7cuZKk3Nxc1a9f3+Rl5uIVSgAAgL9gtf6UTFdddZVmzJihsWPHnn2sqKjIrFkeg6AEAAAoQUhIiKRfQzIrK0sPPvigrrzyyrOfQ1DyK28AAIAStWjRQmvWrNF1110ni8Vy9u8jIyPPfkxQEpQAAAAlstvtatSo0Z/+PiIi4uzHZ86cKc9JHolfeQMAAFygoKCgsx/zCiVBCQAAYAhBSVACAABclPDwcEkEpURQAgAAXJSoqChJBKVEUAIAAFyUChUqSOKiHImgBAAAuCi/BCWvUBKUAAAAF+WXQ88JSoISAADgovAK5a8ISgAAgIvwyyuUvIeSoAQAALgodvtPNxy02WwmLzEfQQkAAHCBTp4u0snACAVWraO8M0E6edq/f+1tcbvdbrNHAAAAeLqth44rc0Wuvtycp9z8Av02oCyS4iJDdEfdGDmbxql25VCzZpqCoAQAADiPPfkFemFGtpZsOyyb1aJiV8np9Mvjt9SK0oDWDVU9MqQcl5qHoAQAACjBlFW56jcrR0Uu93lD8o9sVovsVov6P1BfjzaJK8OFnoGgBAAAOIeRX27VW/O2GH6ePvfUUfc7apfCIs/FRTkAAAB/MGVVbqnEpCS9NW+Lpq7KLZXn8lS8QgkAAPAbe/ILdNfQr3S6yHXOx91FZ3Rs5QydzPlSZ344KGtAsIKq11fYTY8qqEqtc35NkN2qBb1u89n3VBKUAAAAv5Hw/gp9vePIOd8z6XYVK2/qyzq1+9s/f6EtQDFt++mSy67980NWi5pdXknpKU3LYLH5+JU3AADAz7YeOq4l2w6XeAHO8bWzz8ZkQHQNRbd+QWHN2v30YPEZHZk9TO6iP985p9jl1pJth7Ut73iZbTcTQQkAAPCzzBW5slktJT5+Yt1nZz+udF8PhdRtpvBbExRcs5Ekqfj4YRVsW3nOr7VZLcr4xjffS0lQAgAA/OzLzXklvjpZ/L/jOnNkz09/sNoVWPXXK7eDLq139uPTe3PO/fUut77ckld6Yz0IQQkAACDpxOki5eYXlPh40Y+Hzn5suyRUFuuv9/C2VQj79fN+OKSS5B4p8MnbNBKUAAAAknYfOanzXansPnPq1z/Y7L97zGK1n/vz/vgcknYdOXmRCz0XQQkAACCpsIRjgn5hCQg++7G7+PcX3rhdRef8vIv5Pt6IoAQAAJAUaD9/FtnDKp/92PW/43K7is/+ufjE0V8/L7yyzuevvo838r1/EQAAwEW4rFIFlXx990/vmwyoVP2nP7iKVXjg1zvpnN7/3dmPg6rVL/E5LD9/H19DUAIAAEiqEGRX3F/cyabidS3OfnzksxEq2Py1ji5O16md6yRJttAohdT6R4lfH1cpRBWC7CU+7q0ISgAAgJ/dUitS1vNcmhPa6J8KrnGNJOnM4Vx9P2OAjn099acHbQGq9M+nZbEHnPNrbVaL7qgTU+qbPQG3XgQAAH5l48aNevnll/W///1PhYWFKiws1KlTp7Rt2zadsIWqasq75/36X+7lfSLnCxX9cOine3lXu0phN7cv8V7ev1jQ61bVigktzX+OR/C911wBAADOY+/evZo+ffo5H7Pbj+m6qpco69CpEg84t9gDFNbsEYU1e+Rvf89f7uXtizEp8StvAADgZ+6++241btxYFsvvL8Gx2Wxau3athsffIPt5br94MexWiwa0bliqz+lJCEoAAOBX3nnnHW3cuFF/fNffsGHD1LBhQ1WPDFH/B0q+UvtivPJAfVX/iwt+vBlBCQAA/EJ6erqioqLUq1cvWSwWXX755bLZbLLZbGrZsqW6det29nMfbRKnPvfUKZXv+3/31FW7JnGl8lyeiqAEAAA+7b///a8uvfRSJSYm6vjx4+rTp4+OHz+uDz74QMXFxYqIiNDEiRP/9Cvw7nfU1hsPNVSQ3SrbBf4K3Ga1KMhu1aCHGqrbHee/UMcXcJU3AADwSUuXLlVSUpK2b98um82m5ORkvfvuuwoMDDz7OaNHj9b111+vJk2alPg8e/IL9MKMbC3Zdlg2q6XEi3UknX38llpRGtC6oU//mvu3CEoAAOBTsrKyFB8fr+zsbFmtVrVp00apqalyOByGnnfroePKXJGrL7fkKfdIwe9Oq7Top0PL76gTo/gb4nz2au6SEJQAAMAn7Ny5U06nU8uXL5fFYtE999yjtLQ0xcSU/mHiJ08XadeRkyoscinQbtVllSr45B1w/i6CEgAAeLVDhw4pMTFR8+bNkyQ1a9ZMGRkZqlmzpsnL/AcX5QAAAK907NgxP
fzww4qNjdW8efN0zTXXKCsrS8uWLSMmyxlBCQAAvMqpU6eUkpKiyMhITZ8+XZdffrmWLFmi9evXq2FD3z083JMRlAAAwCsUFxerd+/eCgsL0/jx41WlShV9+umn2rp1q26++Waz5/k1ghIAAHg0l8ul119/XQ6HQ2+//bZCQ0OVlpamvXv36p///KfZ8yCCEgAAeLAxY8YoMjJS//73v2Wz2TRs2DAdPnxYCQkJZk/Db3CVNwAA8DhTp05Vjx499P333ys4OFh9+/bVyy+/LKuV18I8EUEJAAA8xrx589SpUyfl5uYqICBAXbt21ZAhQ2S3++8Zj96AoAQAAKZbsWKFOnTooM2bN8tms8npdGr06NEKCfGPWxd6O3IfAACYZtOmTYqPj9fatWtlsVjUqlUrTZw4UeHh4WZPwwUgKAEAQLnLzc1VfHy8lixZIkm64447lJ6erksvvdTkZbgYBCUAACg3+fn5SkxM1Jw5c+R2u9WkSRNlZmaqdu3aZk+DAVwqBQAAylxBQYGcTqeio6M
"text/plain": [
"<Figure size 640x480 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Think aloud and debate\n",
"graph = nx.DiGraph()\n",
"\n",
"for source_node in range(2):\n",
" graph.add_node(source_node, label=source_node)\n",
"\n",
"# Add edges between nodes with the same prefix using a nested for loop\n",
"for source_node in range(2):\n",
" for target_node in range(2):\n",
" graph.add_edge(source_node, target_node)\n",
"\n",
"nx.draw(graph, with_labels=True, font_weight=\"bold\")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Set your API Endpoint\n",
"\n",
"The [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"# The default config list in notebook.\n",
"config_list_gpt4 = autogen.config_list_from_json(\n",
" \"OAI_CONFIG_LIST\",\n",
" filter_dict={\n",
" \"model\": [\"gpt-4\", \"gpt-4-0314\", \"gpt4\", \"gpt-4-32k\", \"gpt-4-32k-0314\", \"gpt-4-32k-v0314\"],\n",
" },\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"It first looks for environment variable \"OAI_CONFIG_LIST\" which needs to be a valid json string. If that variable is not found, it then looks for a json file named \"OAI_CONFIG_LIST\". It filters the configs by models (you can filter by other keys as well). Only the gpt-4 models are kept in the list based on the filter condition.\n",
"\n",
"The config list looks like the following:\n",
"```python\n",
"config_list = [\n",
" {\n",
" 'model': 'gpt-4',\n",
" 'api_key': '<your OpenAI API key here>',\n",
" },\n",
" {\n",
" 'model': 'gpt-4',\n",
" 'api_key': '<your Azure OpenAI API key here>',\n",
" 'base_url': '<your Azure OpenAI API base here>',\n",
" 'api_type': 'azure',\n",
" 'api_version': '2023-06-01-preview',\n",
" },\n",
" {\n",
" 'model': 'gpt-4-32k',\n",
" 'api_key': '<your Azure OpenAI API key here>',\n",
" 'base_url': '<your Azure OpenAI API base here>',\n",
" 'api_type': 'azure',\n",
" 'api_version': '2023-06-01-preview',\n",
" },\n",
"]\n",
"```\n",
"\n",
"If you open this notebook in colab, you can upload your files by clicking the file icon on the left panel and then choosing \"upload file\" icon.\n",
"\n",
"You can set the value of config_list in other ways you prefer, e.g., loading from a YAML file."
]
},
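{
"cell_type": "markdown",
"metadata": {},
"source": [
"The notebook itself only uses `config_list_from_json`, but as a rough illustration, a minimal sketch of the YAML route could look like the following. The file name `OAI_CONFIG_LIST.yaml` and the PyYAML dependency are assumptions for this example, not part of AutoGen:\n",
"```python\n",
"import yaml  # assumption: PyYAML is installed\n",
"\n",
"# Hypothetical file containing a list of {model, api_key, ...} entries,\n",
"# mirroring the JSON example above.\n",
"with open(\"OAI_CONFIG_LIST.yaml\") as f:\n",
"    config_list_gpt4 = yaml.safe_load(f)\n",
"\n",
"# Keep only the gpt-4 variants, mirroring filter_dict above.\n",
"allowed = {\"gpt-4\", \"gpt-4-0314\", \"gpt4\", \"gpt-4-32k\", \"gpt-4-32k-0314\", \"gpt-4-32k-v0314\"}\n",
"config_list_gpt4 = [c for c in config_list_gpt4 if c.get(\"model\") in allowed]\n",
"```"
]
},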
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We are printing out debug messages so that the reader can understand the conversation flow and select_speaker method better.\n",
"\n",
"Overrides the `select_speaker` method with custom logic including:\n",
" - Handling of `NEXT:` and `TERMINATE` tags in the last message.\n",
" - Selection of the first-round speaker based on the `first_round_speaker` attribute in the graph nodes.\n",
" - Selection of subsequent speakers based on the successors in the graph of the previous speaker.\n",
" - Random selection of the next speaker from the eligible candidates."
]
},
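{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick reminder of the NetworkX call the override relies on, `graph.successors(node)` returns the nodes reachable from `node` via an outgoing edge. A toy sketch (this small graph is illustrative only, not the agent graph built later):\n",
"```python\n",
"import networkx as nx\n",
"\n",
"toy = nx.DiGraph()\n",
"toy.add_edges_from([(\"A0\", \"A1\"), (\"A0\", \"B0\"), (\"A1\", \"A0\")])\n",
"\n",
"# Outgoing neighbours of \"A0\": after \"A0\" speaks, only these nodes\n",
"# would be eligible next speakers under the graph constraint.\n",
"print(list(toy.successors(\"A0\")))  # ['A1', 'B0']\n",
"```"
]
},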
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"class CustomGroupChat(GroupChat):\n",
" def __init__(self, agents, messages, max_round=10, graph=None):\n",
" super().__init__(agents, messages, max_round)\n",
" self.previous_speaker = None # Keep track of the previous speaker\n",
" self.graph = graph # The graph depicting who are the next speakers available\n",
"\n",
" def select_speaker(self, last_speaker, selector):\n",
" self.previous_speaker = last_speaker\n",
"\n",
" # Check if last message suggests a next speaker or termination\n",
" last_message = self.messages[-1] if self.messages else None\n",
" suggested_next = None\n",
"\n",
" if last_message:\n",
" if \"NEXT:\" in last_message[\"content\"]:\n",
" suggested_next = last_message[\"content\"].split(\"NEXT: \")[-1].strip()\n",
" # Strip full stop and comma\n",
" suggested_next = suggested_next.replace(\".\", \"\").replace(\",\", \"\")\n",
" print(f\"Suggested next speaker from the last message: {suggested_next}\")\n",
"\n",
" elif \"TERMINATE\" in last_message[\"content\"]:\n",
" try:\n",
" return self.agent_by_name(\"User_proxy\")\n",
" except ValueError:\n",
" print(f\"agent_by_name failed suggested_next: {suggested_next}\")\n",
"\n",
" # Debugging print for the current previous speaker\n",
" if self.previous_speaker is not None:\n",
" print(\"Current previous speaker:\", self.previous_speaker.name)\n",
"\n",
" # Selecting first round speaker\n",
" if self.previous_speaker is None and self.graph is not None:\n",
" eligible_speakers = [\n",
" agent for agent in agents if self.graph.nodes[agent.name].get(\"first_round_speaker\", False)\n",
" ]\n",
" print(\"First round eligible speakers:\", [speaker.name for speaker in eligible_speakers])\n",
"\n",
" # Selecting successors of the previous speaker\n",
" elif self.previous_speaker is not None and self.graph is not None:\n",
" eligible_speaker_names = [target for target in self.graph.successors(self.previous_speaker.name)]\n",
" eligible_speakers = [agent for agent in agents if agent.name in eligible_speaker_names]\n",
" print(\"Eligible speakers based on previous speaker:\", eligible_speaker_names)\n",
"\n",
" else:\n",
" eligible_speakers = agents\n",
"\n",
" # Debugging print for the next potential speakers\n",
" print(\n",
" f\"Eligible speakers based on graph and previous speaker {self.previous_speaker.name if self.previous_speaker else 'None'}: {[speaker.name for speaker in eligible_speakers]}\"\n",
" )\n",
"\n",
" # Three attempts at getting the next_speaker\n",
" # 1. Using suggested_next if suggested_next is in the eligible_speakers.name\n",
" # 2. Using LLM to pick from eligible_speakers, given that there is some context in self.message\n",
" # 3. Random (catch-all)\n",
" next_speaker = None\n",
"\n",
" if eligible_speakers:\n",
" print(\"Selecting from eligible speakers:\", [speaker.name for speaker in eligible_speakers])\n",
" # 1. Using suggested_next if suggested_next is in the eligible_speakers.name\n",
" if suggested_next in [speaker.name for speaker in eligible_speakers]:\n",
" print(\"suggested_next is in eligible_speakers\")\n",
" next_speaker = self.agent_by_name(suggested_next)\n",
"\n",
" else:\n",
" msgs_len = len(self.messages)\n",
" print(f\"msgs_len is now {msgs_len}\")\n",
" if len(self.messages) > 1:\n",
" # 2. Using LLM to pick from eligible_speakers, given that there is some context in self.message\n",
" print(\n",
" f\"Using LLM to pick from eligible_speakers: {[speaker.name for speaker in eligible_speakers]}\"\n",
" )\n",
" selector.update_system_message(self.select_speaker_msg(eligible_speakers))\n",
" _, name = selector.generate_oai_reply(\n",
" self.messages\n",
" + [\n",
" {\n",
" \"role\": \"system\",\n",
" \"content\": f\"Read the above conversation. Then select the next role from {[agent.name for agent in eligible_speakers]} to play. Only return the role.\",\n",
" }\n",
" ]\n",
" )\n",
"\n",
" # If exactly one agent is mentioned, use it. Otherwise, leave the OAI response unmodified\n",
" mentions = self._mentioned_agents(name, eligible_speakers)\n",
" if len(mentions) == 1:\n",
" name = next(iter(mentions))\n",
" next_speaker = self.agent_by_name(name)\n",
"\n",
" if next_speaker is None:\n",
" # 3. Random (catch-all)\n",
" next_speaker = random.choice(eligible_speakers)\n",
"\n",
" print(f\"Selected next speaker: {next_speaker.name}\")\n",
"\n",
" return next_speaker\n",
" else:\n",
" # Cannot return next_speaker with no eligible speakers\n",
" raise ValueError(\"No eligible speakers found based on the graph constraints.\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Demonstration"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Team Operations\n"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAABMQAAAP7CAYAAAC0u1IMAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/SrBM8AAAACXBIWXMAAA9hAAAPYQGoP6dpAAEAAElEQVR4nOzdfVzP9+L/8eenkiIJISRRSAgxDobm2hgaMzZsjquzzdfsbOM4Znbsd2bYxs6ur87BsZmZq+ZqDLFmF65SlEqSiyTVKkmlPp/fH9HBzFxU7099HvfbrZvP+937/Xo/P50z6tnr/XqbLBaLRQAAAAAAAICNsDM6AAAAAAAAAFCWKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAAAAAYFMoxAAAAAAAAGBTKMQAAAAAAABgUyjEAAAAgHLi/Pnzeuqpp+Tl5aXKlSvLw8ND/fr10w8//GB0NHl7e2vx4sV3fF5ubq6efPJJtW7dWg4ODho6dGiJZwMA4EYORgcAAAAAcHuGDRum/Px8LV26VE2aNNG5c+e0fft2paWlldo18/Pz5ejoWGrjFxYWytnZWVOnTtXq1atL7ToAAFyLGWIAAABAOZCRkaHvv/9e8+fP1wMPPKBGjRqpY8eOmjlzpgYPHnzdcRMmTFDt2rXl6uqqnj176tChQ9eN9c033+i+++6Tk5OT3N3dFRwcXPw5b29vvfrqqxo7dqxcXV01adIkSVJYWJi6desmZ2dnNWzYUFOnTtXFixclSUFBQUpMTNRzzz0nk8kkk8l02++ratWq+uCDDzRx4kR5eHjcy5cIAIDbRiEGAAAAlAMuLi5ycXHRunXrlJeX97vHPfLII0pJSdHmzZu1f/9+BQYGqlevXkpPT5ckbdy4UcHBwXrwwQd18OBBbd++XR07drxujDfeeENt2rTRwYMHNXv2bMXHx6t///4aNmyYIiIitHLlSoWFhWnKlCmSpDVr1sjT01Nz587V2bNndfbs2eKxTCaTlixZUvJfEAAA7oHJYrFYjA4BAAAA4I+tXr1aEydO1KVLlxQYGKgePXpo5MiRCggIkFQ0i2vgwIFKSUlR5cqVi8/z9fXV9OnTNWnSJHXp0kVNmjTR8uXLb3oNb29vtWvXTmvXri3eN2HCBNnb2+ujjz4q3hcWFqYePXro4sWLcnJykre3t6ZNm6Zp06ZdN56fn5/mzZt33Sy03/Pkk08qIyND69atu4OvCgAAd44ZYgAAAEA5MWzYMCUlJSkkJET9+/dXaGioAgMDi2dgHTp0SNnZ2apVq1bxjDIXFxclJCQoPj5ekhQeHq5evXrd8jodOnS4bvvQoUNasmTJdWP269dPZrNZCQkJtxzr6NGjt1WGAQBQllhUHwAAAChHnJyc1KdPH/Xp00ezZ8/WhAkTNGfOHD355JPKzs5WvXr1FBoa+pvz3NzcJEnOzs5/eI2qVatet52dna3Jkydr6tSpvznWy8vrrt4HAABGohADAAAAyjF/f//iWwwDAwOVnJwsBwcHeXt73/T4gIAAbd++XePGjbvtawQGBioqKkq+vr6/e4yjo6MKCwvvJDoAAIbhlkkAAACgHEhLS1PPnj21fPlyRUREKCEhQatWrdKCBQs0ZMgQSVLv3r3VuXNnDR06VFu3btWJEye0Z88ezZo1S/v27ZMkzZkzRytWrNCcOXMUHR2tyMhIzZ8//5bXnjFjhvbs2aMpU6YoPDxccXFxWr9+ffGi+lLR2mO7d+/WmTNnlJqaWrzfz8/vuvXIbiYqKkrh4eFKT09XZmamwsPDFR4efpdfKQAA/hgzxAAAAIBywMXFRZ06ddKiRYsUHx+vy5cvq2HDhpo4caL+/ve/Syp6ouOmTZs0a9YsjRs3TufPn5eHh4e6d++uunXrSpKCgoK0atUqvfrqq3r99dfl6uqq7t273/LaAQEB2rVrl2bNmqVu3brJYrHIx8dHjz76aPExc+fO1eTJk+Xj46O8vDxdfXZXTEyMMjMzbzn+gw8+qMTExOLtdu3aSZJ4/hcAoLTwlEkAAAAAAADYFG6ZBAAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATaEQAwAAAAAAgE2hEAMAAAAAAIBNoRADAAAAAACATXEwOgAAAAAA62GxWBR6IlSb4jbpl6RfdCTliHILclXJvpJ8avioU4NOeqDxAxrcfLAc7R2NjgsAwF0xWSwWi9EhAAAAABjLYrHoP+H/0byweTqWfkwOdg4qMBf85rhKdpV02XxZtZxraWqnqZredbqcHJwMSAwAwN2jEAMAAABsXGJGop5c/6RCT4TKJJMsur0fEexMdmpSo4mWBy9XJ89OpZwSAICSQyEGAAAA2LBDyYfUc2lPZeVn3XRG2B+xN9nLZDJp5fCVerjFw6WQEACAkkchBgAAANiouLQ4dfq0k7LyslRoKbzrcUwyyc5kpw2PbVB/3/4lmBAAgNJBIQYAAADYoAJzgf706Z8Unhx+T2XYVSaZ5ObkppgpMapdtXYJJAQAoPTwlEkAAADABi36cZEOnD1QtF7YN5L2X/PJXpK63XDCr5J+lnRKUrKkqx1aD0kPSBZZlJWXpSmbpmjlIytLOz4AAPfEzugAAAAAAMpWbkGuXgt7ragMK5QUdcMBh29yUrKknySd0f/KsBsUWgr1VdRXik2LLcm4AACUOAoxAAAAwMZ8HfW1MnIzijbiJV264YBzks7fsM9RUhMVzQhr/vtjO5gc9MHeD0omKAAApYRCD
AAAALAxq6NWy8505UeBa2eDtbrm9Y2zxHwkjZX0gCT33x+7wFKgL498WRIxAQAoNRRiAAAAgI356cxPMlvM0mVJR6/srCKpv/73E8LNbpu8TcnZyTp/8cYpZgAAWA8KMQAAAMCGZORmKDk7uWgjVlL+lU/4SXKR5H1lO03S2bu/TsS5iLs/GQCAUkYhBgAAANiQC3kX/rdx7Sww/xv+vPHzdygrL+vuTwYAoJRRiAEAAAA2xN7OvuhFnqS4KzudJTW+8rqFJNOV14clWe7uOg52DneZEACA0se/UgAAAIANqV2lthzsHFRwtEAquLLzkqRXb3JwpqRTkrzu/DoNXBvcdUYAAEobhRgAAABgQyrZV1Kr2q0UHhl+eycc1h0XYvYme/m7+ysvL0+xsbGKjo4u/khISNCnn36q1q1b32l0AABKDIUYAAAAYGM61uyo8OPhRRuOknrdcEChpK1XXkep6OmTlySduLIv7Zpjz0s6cuW1t2SqalLhyUJVqVxFFsv/7rd0cHBQYWGhLBaLLl6
"text/plain": [
"<Figure size 1200x1000 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# llm config\n",
"llm_config = {\"config_list\": config_list_gpt4, \"cache_seed\": 100}\n",
"\n",
"# Create an empty directed graph\n",
"graph = nx.DiGraph()\n",
"\n",
"agents = []\n",
"\n",
"# Outer loop for prefixes 'A', 'B', 'C'\n",
"for prefix in [\"A\", \"B\", \"C\"]:\n",
" # Add 3 nodes with each prefix to the graph using a for loop\n",
" for i in range(3):\n",
" node_id = f\"{prefix}{i}\"\n",
" secret_value = random.randint(1, 5) # Generate a random secret value\n",
" graph.add_node(node_id, label=node_id, secret_value=secret_value)\n",
"\n",
" # Create an AssistantAgent for each node (assuming AssistantAgent is a defined class)\n",
" agents.append(\n",
" AssistantAgent(\n",
" name=node_id,\n",
" system_message=f\"\"\"Your name is {node_id}.\n",
" Do not respond as the speaker named in the NEXT tag if your name is not in the NEXT tag. Instead, suggest a relevant team leader to handle the mis-tag, with the NEXT: tag.\n",
"\n",
" You have {secret_value} chocolates.\n",
"\n",
" The list of players are [A0, A1, A2, B0, B1, B2, C0, C1, C2].\n",
"\n",
" Your first character of your name is your team, and your second character denotes that you are a team leader if it is 0.\n",
" CONSTRAINTS: Team members can only talk within the team, whilst team leader can talk to team leaders of other teams but not team members of other teams.\n",
"\n",
" You can use NEXT: to suggest the next speaker. You have to respect the CONSTRAINTS, and can only suggest one player from the list of players, i.e., do not suggest A3 because A3 is not from the list of players.\n",
" Team leaders must make sure that they know the sum of the individual chocolate count of all three players in their own team, i.e., A0 is responsible for team A only.\n",
"\n",
" Keep track of the player's tally using a JSON format so that others can check the total tally. Use\n",
" A0:?, A1:?, A2:?,\n",
" B0:?, B1:?, B2:?,\n",
" C0:?, C1:?, C2:?\n",
"\n",
" If you are the team leader, you should aggregate your team's total chocolate count to cooperate.\n",
" Once the team leader know their team's tally, they can suggest another team leader for them to find their team tally, because we need all three team tallys to succeed.\n",
" Use NEXT: to suggest the next speaker, e.g., NEXT: A0.\n",
"\n",
" Once we have the total tally from all nine players, sum up all three teams' tally, then terminate the discussion using TERMINATE.\n",
"\n",
" \"\"\",\n",
" llm_config=llm_config,\n",
" )\n",
" )\n",
"\n",
" # Add edges between nodes with the same prefix using a nested for loop\n",
" for source_node in range(3):\n",
" source_id = f\"{prefix}{source_node}\"\n",
" for target_node in range(3):\n",
" target_id = f\"{prefix}{target_node}\"\n",
" if source_node != target_node: # To avoid self-loops\n",
" graph.add_edge(source_id, target_id)\n",
"\n",
"# Adding edges between teams\n",
"graph.add_edge(\"A0\", \"B0\")\n",
"graph.add_edge(\"A0\", \"C0\")\n",
"graph.add_edge(\"B0\", \"A0\")\n",
"graph.add_edge(\"B0\", \"C0\")\n",
"graph.add_edge(\"C0\", \"A0\")\n",
"graph.add_edge(\"C0\", \"B0\")\n",
"\n",
"\n",
"# Updating node A0\n",
"graph.nodes[\"A0\"][\"first_round_speaker\"] = True\n",
"\n",
"\n",
"def get_node_color(node):\n",
" if graph.nodes[node].get(\"first_round_speaker\", False):\n",
" return \"red\"\n",
" else:\n",
" return \"green\"\n",
"\n",
"\n",
"# Draw the graph with secret values annotated\n",
"plt.figure(figsize=(12, 10))\n",
"pos = nx.spring_layout(graph) # positions for all nodes\n",
"\n",
"# Draw nodes with their colors\n",
"nx.draw(graph, pos, with_labels=True, font_weight=\"bold\", node_color=[get_node_color(node) for node in graph])\n",
"\n",
"# Annotate secret values\n",
"for node, (x, y) in pos.items():\n",
" secret_value = graph.nodes[node][\"secret_value\"]\n",
" plt.text(x, y + 0.1, s=f\"Secret: {secret_value}\", horizontalalignment=\"center\")\n",
"\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"# Termination message detection\n",
"\n",
"\n",
"def is_termination_msg(content) -> bool:\n",
" have_content = content.get(\"content\", None) is not None\n",
" if have_content and \"TERMINATE\" in content[\"content\"]:\n",
" return True\n",
" return False\n",
"\n",
"\n",
"# Terminates the conversation when TERMINATE is detected.\n",
"user_proxy = autogen.UserProxyAgent(\n",
" name=\"User_proxy\",\n",
" system_message=\"Terminator admin.\",\n",
" code_execution_config=False,\n",
" is_termination_msg=is_termination_msg,\n",
" human_input_mode=\"NEVER\",\n",
")\n",
"\n",
"agents.append(user_proxy)"
]
},
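{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick, illustrative sanity check of the helper defined above (not executed as part of the original notebook):\n",
"```python\n",
"print(is_termination_msg({\"content\": \"Final tally computed. TERMINATE\"}))  # True\n",
"print(is_termination_msg({\"content\": \"NEXT: A1\"}))  # False\n",
"print(is_termination_msg({\"content\": None}))  # False\n",
"```"
]
},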
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33mA0\u001b[0m (to chat_manager):\n",
"\n",
"\n",
" There are 9 players in this game, split equally into Teams A, B, C. Therefore each team has 3 players, including the team leader. \n",
" The task is to find out the sum of chocolate count from all nine players. I will now start with my team. \n",
" NEXT: A1\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: A1\n",
"Current previous speaker: A0\n",
"Eligible speakers based on previous speaker: ['A1', 'A2', 'B0', 'C0']\n",
"Eligible speakers based on graph and previous speaker A0: ['A1', 'A2', 'B0', 'C0']\n",
"Selecting from eligible speakers: ['A1', 'A2', 'B0', 'C0']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: A1\n",
"\u001b[33mA1\u001b[0m (to chat_manager):\n",
"\n",
"As A1 I have 1 chocolate right now. Our team leader A0, please note my count.\n",
"\n",
"NEXT: A2\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: A2\n",
"Current previous speaker: A1\n",
"Eligible speakers based on previous speaker: ['A0', 'A2']\n",
"Eligible speakers based on graph and previous speaker A1: ['A0', 'A2']\n",
"Selecting from eligible speakers: ['A0', 'A2']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: A2\n",
"\u001b[33mA2\u001b[0m (to chat_manager):\n",
"\n",
"As part of Team A, I have 2 chocolates at the moment.\n",
"\n",
"Now that each member of Team A has reported their tally, our team leader A0 should be able to calculate and report our team's total sum to the other team leaders. \n",
"\n",
"NEXT: A0.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: A0\n",
"Current previous speaker: A2\n",
"Eligible speakers based on previous speaker: ['A0', 'A1']\n",
"Eligible speakers based on graph and previous speaker A2: ['A0', 'A1']\n",
"Selecting from eligible speakers: ['A0', 'A1']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: A0\n",
"\u001b[33mA0\u001b[0m (to chat_manager):\n",
"\n",
"I, A0, have 4 chocolates, A1 reported having 1 chocolate, and A2 reported having 2 chocolates. So the total chocolate count for Team A is 4 + 1 + 2 = 7 chocolates. \n",
"\n",
"I'm saving this in our JSON format as: \n",
"A0:4, A1:1, A2:2,\n",
"B0:?, B1:?, B2:?,\n",
"C0:?, C1:?, C2:?\n",
"\n",
"Let's move on to Team B for their counts.\n",
"NEXT: B0.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: B0\n",
"Current previous speaker: A0\n",
"Eligible speakers based on previous speaker: ['A1', 'A2', 'B0', 'C0']\n",
"Eligible speakers based on graph and previous speaker A0: ['A1', 'A2', 'B0', 'C0']\n",
"Selecting from eligible speakers: ['A1', 'A2', 'B0', 'C0']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: B0\n",
"\u001b[33mB0\u001b[0m (to chat_manager):\n",
"\n",
"As B0, the team leader of Team B, I already have my count which is 5 chocolates. Now, I will ask the other members of my team to report their counts. \n",
"\n",
"NEXT: B1\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: B1\n",
"Current previous speaker: B0\n",
"Eligible speakers based on previous speaker: ['B1', 'B2', 'A0', 'C0']\n",
"Eligible speakers based on graph and previous speaker B0: ['A0', 'B1', 'B2', 'C0']\n",
"Selecting from eligible speakers: ['A0', 'B1', 'B2', 'C0']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: B1\n",
"\u001b[33mB1\u001b[0m (to chat_manager):\n",
"\n",
"As B1, I have 4 chocolates currently. It's now time for our team member B2 to report their count. \n",
"\n",
"NEXT: B2.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: B2\n",
"Current previous speaker: B1\n",
"Eligible speakers based on previous speaker: ['B0', 'B2']\n",
"Eligible speakers based on graph and previous speaker B1: ['B0', 'B2']\n",
"Selecting from eligible speakers: ['B0', 'B2']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: B2\n",
"\u001b[33mB2\u001b[0m (to chat_manager):\n",
"\n",
"As B2, I have 1 chocolate right now. Our team leader B0, please note my count.\n",
"\n",
"NEXT: B0.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: B0\n",
"Current previous speaker: B2\n",
"Eligible speakers based on previous speaker: ['B0', 'B1']\n",
"Eligible speakers based on graph and previous speaker B2: ['B0', 'B1']\n",
"Selecting from eligible speakers: ['B0', 'B1']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: B0\n",
"\u001b[33mB0\u001b[0m (to chat_manager):\n",
"\n",
"As B0, I acknowledge receipt of both B1 and B2's counts. I have 5 chocolates, B1 has 4 and B2 has 1. Adding these counts together, Team B has a total of 5 + 4 + 1 = 10 chocolates.\n",
"\n",
"Updating the JSON tally:\n",
"A0:4, A1:1, A2:2,\n",
"B0:5, B1:4, B2:1,\n",
"C0:?, C1:?, C2:?\n",
"\n",
"Now it's time for Team C to share their counts. \n",
"\n",
"NEXT: C0.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: C0\n",
"Current previous speaker: B0\n",
"Eligible speakers based on previous speaker: ['B1', 'B2', 'A0', 'C0']\n",
"Eligible speakers based on graph and previous speaker B0: ['A0', 'B1', 'B2', 'C0']\n",
"Selecting from eligible speakers: ['A0', 'B1', 'B2', 'C0']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: C0\n",
"\u001b[33mC0\u001b[0m (to chat_manager):\n",
"\n",
"As C0, I currently have 2 chocolates. I will need the counts from C1 and C2 to complete our team's tally. \n",
"\n",
"NEXT: C1.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: C1\n",
"Current previous speaker: C0\n",
"Eligible speakers based on previous speaker: ['C1', 'C2', 'A0', 'B0']\n",
"Eligible speakers based on graph and previous speaker C0: ['A0', 'B0', 'C1', 'C2']\n",
"Selecting from eligible speakers: ['A0', 'B0', 'C1', 'C2']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: C1\n",
"\u001b[33mC1\u001b[0m (to chat_manager):\n",
"\n",
"As C1, I have 2 chocolates. C2, please share your count so that our team leader, C0, can calculate our team's total.\n",
"\n",
"NEXT: C2.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: C2\n",
"Current previous speaker: C1\n",
"Eligible speakers based on previous speaker: ['C0', 'C2']\n",
"Eligible speakers based on graph and previous speaker C1: ['C0', 'C2']\n",
"Selecting from eligible speakers: ['C0', 'C2']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: C2\n",
"\u001b[33mC2\u001b[0m (to chat_manager):\n",
"\n",
"As C2, I have 5 chocolates. Now our team leader, C0, can calculate our team's total sum.\n",
"\n",
"NEXT: C0.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"Suggested next speaker from the last message: C0\n",
"Current previous speaker: C2\n",
"Eligible speakers based on previous speaker: ['C0', 'C1']\n",
"Eligible speakers based on graph and previous speaker C2: ['C0', 'C1']\n",
"Selecting from eligible speakers: ['C0', 'C1']\n",
"suggested_next is in eligible_speakers\n",
"Selected next speaker: C0\n",
"\u001b[33mC0\u001b[0m (to chat_manager):\n",
"\n",
"As C0, I have 2 chocolates, C1 reported having 2 chocolates, and C2 reported having 5 chocolates. So, the total chocolate count for Team C is 2 + 2 + 5 = 9 chocolates.\n",
"\n",
"Updating the JSON tally:\n",
"A0:4, A1:1, A2:2,\n",
"B0:5, B1:4, B2:1,\n",
"C0:2, C1:2, C2:5\n",
"\n",
"Let's sum up all the team totals. \n",
"\n",
"TERMINATE.\n",
"\n",
"--------------------------------------------------------------------------------\n"
]
}
],
"source": [
"group_chat = CustomGroupChat(agents=agents, messages=[], max_round=20, graph=graph) # Include all agents\n",
"\n",
"\n",
"# Create the manager\n",
"manager = autogen.GroupChatManager(groupchat=group_chat, llm_config=llm_config)\n",
"\n",
"\n",
"# Initiates the chat with Alice\n",
"agents[0].initiate_chat(\n",
" manager,\n",
" message=\"\"\"\n",
" There are 9 players in this game, split equally into Teams A, B, C. Therefore each team has 3 players, including the team leader.\n",
" The task is to find out the sum of chocolate count from all nine players. I will now start with my team.\n",
" NEXT: A1\"\"\",\n",
")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
}
},
"nbformat": 4,
"nbformat_minor": 4
}