This example shows how to use @AutoGen.Ollama.OllamaAgent to connect to an Ollama server and chat with the LLaMA model.
To run this example, you need to have an Ollama server running locally with the `llama3:latest` model installed. For how to set up an Ollama server, please refer to [Ollama](https://ollama.com/).
> [!NOTE]
> You can find the complete sample code [here](https://github.com/microsoft/autogen/blob/main/dotnet/samples/AutoGen.Ollama.Sample/Chat_With_LLaMA.cs)
### Step 1: Install AutoGen.Ollama
First, install the AutoGen.Ollama package using the following command:
```bash
dotnet add package AutoGen.Ollama
```
For how to install from a nightly build, please refer to [Installation](../Installation.md).
### Step 2: Add using statement
[!code-csharp[](../../../samples/AutoGen.Ollama.Sample/Chat_With_LLaMA.cs?name=Using)]
### Step 3: Create and chat with @AutoGen.Ollama.OllamaAgent
In this step, we create an @AutoGen.Ollama.OllamaAgent and connect it to the Ollama server.
[!code-csharp[](../../../samples/AutoGen.Ollama.Sample/Chat_With_LLaMA.cs?name=Create_Ollama_Agent)]
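Putting the steps together, a minimal sketch might look like the following. This assumes the Ollama server is listening on its default address `http://localhost:11434` and that `llama3:latest` is installed; the agent name, system message, and prompt are illustrative and can be changed freely:

```csharp
using AutoGen.Core;
using AutoGen.Ollama;
using AutoGen.Ollama.Extension;

// Point an HttpClient at the local Ollama server (11434 is Ollama's default port).
using var httpClient = new HttpClient
{
    BaseAddress = new Uri("http://localhost:11434"),
};

// Create the agent against the locally installed llama3:latest model.
var ollamaAgent = new OllamaAgent(
        httpClient: httpClient,
        name: "ollama",
        modelName: "llama3:latest",
        systemMessage: "You are a helpful AI assistant")
    .RegisterMessageConnector()  // translate between AutoGen and Ollama message types
    .RegisterPrintMessage();     // print each reply to the console

// Send a single message and await the reply.
var reply = await ollamaAgent.SendAsync("What is the capital of France?");
```

See the complete sample linked above for the exact code; if your server runs elsewhere, adjust `BaseAddress` and `modelName` to match your setup.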