This example shows how to use function call with local LLM models, using [Ollama](https://ollama.com/) as the local model provider and the [LiteLLM](https://docs.litellm.ai/docs/) proxy server to provide an openai-api compatible interface.
Once the LiteLLM proxy server is started, it serves an openai-api compatible endpoint at `http://localhost:4000`. You can verify that the server is running by looking for the following output in the terminal:
```
INFO: Uvicorn running on http://0.0.0.0:4000 (Press CTRL+C to quit)
```
## Install AutoGen and AutoGen.SourceGenerator
In your project, install the AutoGen and AutoGen.SourceGenerator packages using the following commands:
```bash
dotnet add package AutoGen
dotnet add package AutoGen.SourceGenerator
```
The `AutoGen.SourceGenerator` package automatically generates type-safe `FunctionContract` definitions so you don't have to write them by hand. For more information, please check out [Create type-safe function](Create-type-safe-function-call.md).
In your project file, also enable structured XML documentation support by setting the `GenerateDocumentationFile` property to `true`; the source generator reads the XML documentation comments to generate the function descriptions:
```xml
<PropertyGroup>
    <!-- This enables structured xml documentation support -->
    <GenerateDocumentationFile>true</GenerateDocumentationFile>
</PropertyGroup>
```
## Define `WeatherReport` function and create @AutoGen.Core.FunctionCallMiddleware
Create a `public partial` class to host the methods you want to use in AutoGen agents. Each method must be a `public` instance method and its return type must be `Task<string>`. After the methods are defined, mark them with the `AutoGen.Core.FunctionAttribute` attribute.
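For illustration, a minimal sketch of such a class is shown below. The class name `WeatherFunctions`, the `city` parameter, and the canned weather string are assumptions made for this example; any `public` instance method returning `Task<string>` and marked with `[Function]` will work.

```csharp
using System.Threading.Tasks;
using AutoGen.Core;

// The class must be `partial` so that AutoGen.SourceGenerator can add the
// generated function-contract members to it.
public partial class WeatherFunctions
{
    /// <summary>
    /// Get the weather report for a city.
    /// </summary>
    /// <param name="city">The city to get the weather report for.</param>
    [Function]
    public Task<string> WeatherReport(string city)
    {
        // A canned response for demonstration purposes; a real implementation
        // would call a weather API here.
        return Task.FromResult($"The weather in {city} is sunny.");
    }
}
```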
Then create a @AutoGen.Core.FunctionCallMiddleware and add the `WeatherReport` function to the middleware. The middleware will pass the `FunctionContract` to the agent when generating a response, and process the tool call response when receiving a `ToolCallMessage`.
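A sketch of this wiring is shown below. It assumes the source generator has produced a `WeatherReportFunctionContract` property and a `WeatherReportWrapper` method on the class above (the naming convention described in the type-safe function article); adjust the member names if your generated code differs.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using AutoGen.Core;

var weatherFunctions = new WeatherFunctions();

// Register the generated contract so the agent can advertise the tool,
// and map the function name to the generated wrapper that executes it.
var functionCallMiddleware = new FunctionCallMiddleware(
    functions: new[] { weatherFunctions.WeatherReportFunctionContract },
    functionMap: new Dictionary<string, Func<string, Task<string>>>
    {
        { nameof(WeatherFunctions.WeatherReport), weatherFunctions.WeatherReportWrapper },
    });
```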
## Create @AutoGen.OpenAI.OpenAIChatAgent with `WeatherReport` tool and chat with it
Because the LiteLLM proxy server is openai-api compatible, we can use @AutoGen.OpenAI.OpenAIChatAgent to connect to it as a third-party openai-api provider. The agent is also registered with the @AutoGen.Core.FunctionCallMiddleware that contains the `WeatherReport` tool, so it can call the `WeatherReport` tool when generating a response.
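The sketch below shows one way to wire this up. It assumes a recent AutoGen.OpenAI version built on the official OpenAI .NET SDK (where `OpenAIClientOptions.Endpoint` can point at the LiteLLM proxy) and uses a placeholder api key and model name, since the local proxy does not validate them; the exact constructor shape may differ in older AutoGen.OpenAI releases.

```csharp
using System;
using System.ClientModel;
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using OpenAI;

// Point the OpenAI client at the LiteLLM proxy. The api key is a placeholder
// because the local proxy does not validate it.
var openAIClient = new OpenAIClient(
    new ApiKeyCredential("api-key"),
    new OpenAIClientOptions
    {
        Endpoint = new Uri("http://localhost:4000"),
    });

var agent = new OpenAIChatAgent(
    chatClient: openAIClient.GetChatClient("llama3"), // model name is an assumption; use whatever model LiteLLM serves
    name: "assistant",
    systemMessage: "You are a helpful AI assistant")
    .RegisterMessageConnector()                 // convert between OpenAI and AutoGen message types
    .RegisterMiddleware(functionCallMiddleware) // attach the WeatherReport tool
    .RegisterPrintMessage();                    // print the conversation to the console

var reply = await agent.SendAsync("What is the weather in New York?");
```

When the model decides to call the tool, the middleware invokes `WeatherReport` and the agent returns the tool result as its reply.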