"FLAML offers an experimental feature of interactive LLM agents, which can be used to solve various tasks with human or automatic feedback, including tasks that require using tools via code.\n",
"In this notebook, we demonstrate how to use multiple agents to work together and accomplish a task which requires finding info from the web and coding. `AssistantAgent` is an LLM-based agent that can write Python code (in a Python coding block) for a user to execute for a given task. `UserProxyAgent` is an agent which serves as a proxy for a user to execute the code written by `AssistantAgent`. We further create a planning agent for the assistant agent to consult. The planning agent is a variation of the `AssistantAgent`.\n",
"* The [`config_list_openai_aoai`](https://microsoft.github.io/FLAML/docs/reference/autogen/oai/openai_utils#config_list_openai_aoai) function tries to create a list of configurations using Azure OpenAI endpoints and OpenAI endpoints. \n",
"\n",
"It assumes the api keys and api bases are stored in the corresponding environment variables or local txt files:\n",
"\n",
"- OpenAI API key: os.environ[\"OPENAI_API_KEY\"] or `openai_api_key_file=\"key_openai.txt\"`.\n",
"- Azure OpenAI API key: os.environ[\"AZURE_OPENAI_API_KEY\"] or `aoai_api_key_file=\"key_aoai.txt\"`. Multiple keys can be stored, one per line.\n",
"- Azure OpenAI API base: os.environ[\"AZURE_OPENAI_API_BASE\"] or `aoai_api_base_file=\"base_aoai.txt\"`. Multiple bases can be stored, one per line.\n",
"It's OK to have only the OpenAI API key, or only the Azure OpenAI API key + base. If you open this notebook in colab, you can upload your files by clicking the file icon on the left panel and then choose \"upload file\" icon.\n"
"We construct the planning agent named \"planner\" and a user proxy agent for the planner named \"planner_user\". We specify `human_input_mode` as \"TERMINATE\" in the user proxy agent, which will never ask for human feedback. We define `ask_planner` function to send a message to planner and return the suggestion from the planner."
" # the default system message of the AssistantAgent is overwritten here\n",
" system_message=\"You are a helpful AI assistant. You suggest coding and reasoning steps for another AI assistant to accomplish a task. Do not suggest concrete code. For any action beyond writing code or reasoning, convert it to a step which can be implemented by writing code. For example, the action of browsing the web can be implemented by writing code which reads and prints the content of a web page. Finally, inspect the execution result. If the plan is not good, suggest a better plan. If the execution is wrong, point out the error and suggest a fix.\"\n",
")\n",
"planner_user = UserProxyAgent(\n",
" name=\"planner_user\",\n",
" max_consecutive_auto_reply=0, # terminate without auto-reply\n",
"We construct the assistant agent and the user proxy agent. We specify `human_input_mode` as \"TERMINATE\" in the user proxy agent, which will ask for feedback when it receives a \"TERMINATE\" signal from the assistant agent. We set the `functions` in `AssistantAgent` and `function_map` in UserProxyAgent to use the created `ask_planner` function."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"# create an AssistantAgent instance named \"assistant\"\n",
"assistant = AssistantAgent(\n",
" name=\"assistant\",\n",
" request_timeout=600,\n",
" seed=42,\n",
" # Excluding azure openai endpoints from the config list.\n",
" # Change to `exclude=\"openai\"` to exclude openai endpoints, or remove the `exclude` argument to include both.\n",
" \"description\": \"question to ask planner. Make sure the question include enough context, such as the code and the execution result. The planner does not know the conversation between you and the user, unless you share the conversation with the planner.\",\n",
"We invoke the `receive()` method of the coding agent to start the conversation. When you run the cell below, you will be prompted to provide feedback after the coding agent sends a \"TERMINATE\" signal in the end of the message. If you don't provide any feedback (by pressing Enter directly), the conversation will finish. Before the \"TERMINATE\" signal, the user proxy agent will try to execute the code suggested by the coding agent on behalf of the user."
"To fix a good first issue in an open source project like FLAML (Fast, Lightweight AutoML), you need to first find an issue. I can assist in finding an issue and provide code examples for how you might address it.\n",
"Please replace 'username:token' with your actual GitHub username and a [personal access token](https://github.com/settings/tokens) you have created. If you do not have a personal access token, please follow the provided link to create one.\n",
"Before proceeding with a fix, inspect the issue, understand it and plan an approach for the fix. Once you have a rough idea for a fix, fork the FLAML repository, create a new branch, implement the fix and finally open a pull request to the main FLAML repository. Please check the project guidelines for contributing if they exist. Also keep the conversation with the repository maintainers active for effective contribution. Good luck!\n",
"***** Suggested function Call: ask_planner *****\n",
"Arguments: \n",
"{\n",
"\"message\": \"The user would like a fix for a good first issue from the FLAML project. The issue retrieved from GitHub is 'Do not print data split method when user provides X_val in AutoML: https://github.com/microsoft/FLAML/issues/1099'. Assume the flaml software is installed and ready to use. What would be the steps for a fix?\"\n",
"The user would like a fix for a good first issue from the FLAML project. The issue retrieved from GitHub is 'Do not print data split method when user provides X_val in AutoML: https://github.com/microsoft/FLAML/issues/1099'. Assume the flaml software is installed and ready to use. What would be the steps for a fix?\n",
"1. Understand the problem: Read the issue description to get a clear understanding of the problem. In this case, it is about not printing the data split method when the user provides the `X_val` parameter during the AutoML process.\n",
"2. Locate the relevant code: Identify the relevant code section in the FLAML project where the data split message is printed. This might require browsing through the source code, specifically focusing on the AutoML and data splitting components.\n",
"3. Analyze the current behavior: Check the conditions under which the data split message is printed and understand how the provided `X_val` parameter affects the code execution.\n",
"4. Implement the fix: Modify the code, adding a condition that checks whether the `X_val` parameter has been provided by the user. If it has been provided, do not print the data split message.\n",
"5. Test the fix: Create a test case or use an existing one that covers the scenario when the user provides the `X_val` parameter. Run the test to ensure that the data split message is not printed in this case.\n",
"6. Verify other scenarios: Test other scenarios that include different combinations of input parameters and data split methods to ensure that your fix doesn't introduce any new issues or side effects.\n",
"7. Document the changes: Update the documentation, including comments in the code and any relevant user guides, to accurately reflect the changes made.\n",
"8. Create a pull request: Once you have tested and verified the fix, create a pull request to merge your changes into the main branch of the FLAML project. In the pull request description, link to the original issue to provide context for the reviewers.\n",
"9. Address review comments: If the maintainers or other contributors provide feedback on your pull request, address their comments, and make necessary changes to the code until your pull request is approved.\n",
"1. Understand the problem: Read the issue description to get a clear understanding of the problem. In this case, it is about not printing the data split method when the user provides the `X_val` parameter during the AutoML process.\n",
"2. Locate the relevant code: Identify the relevant code section in the FLAML project where the data split message is printed. This might require browsing through the source code, specifically focusing on the AutoML and data splitting components.\n",
"3. Analyze the current behavior: Check the conditions under which the data split message is printed and understand how the provided `X_val` parameter affects the code execution.\n",
"4. Implement the fix: Modify the code, adding a condition that checks whether the `X_val` parameter has been provided by the user. If it has been provided, do not print the data split message.\n",
"5. Test the fix: Create a test case or use an existing one that covers the scenario when the user provides the `X_val` parameter. Run the test to ensure that the data split message is not printed in this case.\n",
"6. Verify other scenarios: Test other scenarios that include different combinations of input parameters and data split methods to ensure that your fix doesn't introduce any new issues or side effects.\n",
"7. Document the changes: Update the documentation, including comments in the code and any relevant user guides, to accurately reflect the changes made.\n",
"8. Create a pull request: Once you have tested and verified the fix, create a pull request to merge your changes into the main branch of the FLAML project. In the pull request description, link to the original issue to provide context for the reviewers.\n",
"9. Address review comments: If the maintainers or other contributors provide feedback on your pull request, address their comments, and make necessary changes to the code until your pull request is approved.\n",
"This script modifies the `fit()` method of the `AutoML` class in FLAML. Before this script can be correctly executed, it should replace the actual code in the FLAML project.\n",
"Please be sure to thoroughly test the fix and to re-run all unit tests for the `AutoML` class and integration tests involving the `AutoML` class to make sure everything still works as expected.\n",
"After you have tested and confirmed that everything is working as expected, you can commit your change to your local git repository and push it to your remote repository. Then you can create a pull request on the FLAML repository for your changes to be reviewed and possibly merged into the main branch.\n",
"This is only a hypothetical fix. In the actual FLAML project, the relevant code section may be in different files or classes/functions or need more complex modifications. Always carefully examine the specific codebase and the issue details before making modifications.\n",
"Keep in mind that this was a hypothetical fix and it may not directly apply to the real FLAML project. Always make sure to read the project guidelines for contributing, understand the issue properly and inspect the codebase thoroughly before making changes.\n",
"When the assistant needs to consult the planner, it suggests a function call to `ask_planner`. When this happens, a line like the following will be displayed:\n",
"\n",
"***** Suggested function Call: ask_planner *****\n"