diff --git a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb
index 8674b4195..cf719e10d 100644
--- a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb
+++ b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb
@@ -373,6 +373,7 @@
    "```{note}\n",
    "While some model providers may offer OpenAI-compatible APIs, they may still have minor differences.\n",
    "For example, the `finish_reason` field may be different in the response.\n",
+   "\n",
    "```"
   ]
  },
@@ -403,7 +404,29 @@
    "await model_client.close()"
   ]
  },
+ {
+  "cell_type": "markdown",
+  "metadata": {},
+  "source": [
+   "Also, as Gemini releases new models, you may need to define the model's capabilities via the `model_info` field. For example, to use `gemini-2.0-flash-lite` or a similar new model, you can use the following code:\n",
+   "\n",
+   "```python\n",
+   "from autogen_core.models import ModelInfo, UserMessage\n",
+   "from autogen_ext.models.openai import OpenAIChatCompletionClient\n",
+   "\n",
+   "model_client = OpenAIChatCompletionClient(\n",
+   "    model=\"gemini-2.0-flash-lite\",\n",
+   "    model_info=ModelInfo(vision=True, function_calling=True, json_output=True, family=\"unknown\", structured_output=True),\n",
+   "    # api_key=\"GEMINI_API_KEY\",\n",
+   ")\n",
+   "\n",
+   "response = await model_client.create([UserMessage(content=\"What is the capital of France?\", source=\"user\")])\n",
+   "print(response)\n",
+   "await model_client.close()\n",
+   "```"
+  ]
+ },
 {
  "cell_type": "markdown",
  "metadata": {},
  "source": [
@@ -516,7 +539,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.12.7"
+   "version": "3.11.9"
   }
  },
 "nbformat": 4,