# AutoGen - Tune GPT Models
`flaml.autogen` offers a cost-effective hyperparameter optimization technique, [EcoOptiGen](https://arxiv.org/abs/2303.04673), for tuning Large Language Models. The research study finds that tuning hyperparameters can significantly improve their utility.
Please find documentation about this feature [here](/docs/Use-Cases/Autogen#enhanced-inference).
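As a quick orientation, here is a minimal sketch of a tuning run with `autogen.Completion.tune`. The dataset, expected answers, and evaluation function below are hypothetical placeholders; the linked notebooks contain complete, realistic examples.

```python
from flaml import autogen

# Hypothetical evaluation function: scores the responses generated for one
# data instance and returns a dict of metric values.
def eval_success(responses, **data):
    success = any(data["expected"] in r for r in responses)
    return {"success": success}

# Hypothetical tuning data: a list of instances whose fields fill the prompt
# template and feed the evaluation function.
tune_data = [
    {"problem": "What is 2 + 2?", "expected": "4"},
    {"problem": "What is 3 * 5?", "expected": "15"},
]

config, analysis = autogen.Completion.tune(
    data=tune_data,             # instances used to evaluate each configuration
    metric="success",           # metric key returned by eval_success
    mode="max",                 # maximize the metric
    eval_func=eval_success,     # scores the responses for one instance
    inference_budget=0.05,      # max $ per instance during evaluation
    optimization_budget=1,      # max total $ for the whole tuning run
    num_samples=-1,             # -1: let the budget determine the trial count
    prompt="{problem}",         # prompt template filled from each instance
)
print(config)  # the best hyperparameter configuration found
```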
Links to notebook examples:
* [Optimize for Code Generation](https://github.com/microsoft/FLAML/blob/main/notebook/autogen_openai_completion.ipynb) | [Open in Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/main/notebook/autogen_openai_completion.ipynb)
* [Optimize for Math](https://github.com/microsoft/FLAML/blob/main/notebook/autogen_chatgpt_gpt4.ipynb) | [Open in Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/main/notebook/autogen_chatgpt_gpt4.ipynb)