
This example shows how to use @AutoGen.Ollama.OllamaAgent to connect to an Ollama server and chat with the llama3 model.

To run this example, you need to have an Ollama server running and the llama3:latest model installed. For how to set up an Ollama server, please refer to Ollama.

Note

You can find the complete sample code here

Step 1: Install AutoGen.Ollama

First, install the AutoGen.Ollama package using the following command:

dotnet add package AutoGen.Ollama

For how to install from the nightly build, please refer to Installation.

Step 2: Add using statement

[!code-csharp]
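
Since the code include above is not rendered here, a minimal sketch of the using statements is shown below. The exact namespace names (AutoGen.Core, AutoGen.Ollama, AutoGen.Ollama.Extension) are assumptions based on the package layout and may differ from the actual sample.

```csharp
// Minimal sketch of the using statements this example relies on.
// Namespace names are assumptions based on the AutoGen.Ollama package layout.
using AutoGen.Core;              // core agent abstractions (SendAsync, etc.)
using AutoGen.Ollama;            // OllamaAgent
using AutoGen.Ollama.Extension;  // message connector / print extensions (assumed namespace)
```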

Step 3: Create and chat with @AutoGen.Ollama.OllamaAgent

In this step, we create an @AutoGen.Ollama.OllamaAgent and connect it to the Ollama server.

[!code-csharp]
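
As a rough illustration of what the included sample does, the sketch below creates an OllamaAgent pointed at a local Ollama server and sends it a single message. The constructor parameters (httpClient, name, modelName) and the RegisterMessageConnector, RegisterPrintMessage, and SendAsync helpers are assumptions about the AutoGen.Ollama API and may not match the actual sample code.

```csharp
// A rough sketch, not the exact sample: constructor parameters and
// extension method names are assumptions about the AutoGen.Ollama API.
using var httpClient = new HttpClient
{
    // Default Ollama endpoint; adjust if your server listens elsewhere.
    BaseAddress = new Uri("http://localhost:11434"),
};

var ollamaAgent = new OllamaAgent(
        httpClient: httpClient,
        name: "ollama",
        modelName: "llama3:latest")
    .RegisterMessageConnector()   // convert between AutoGen and Ollama message formats (assumed helper)
    .RegisterPrintMessage();      // print replies to the console (assumed helper)

// Send a single question to the agent and await the reply.
var reply = await ollamaAgent.SendAsync(
    "Can you write a piece of C# code to calculate the 100th Fibonacci number?");
```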