In Semantic Kernel, a kernel plugin is a collection of kernel functions that can be invoked during LLM calls. Semantic Kernel ships with a number of built-in plugins, such as the core plugins and the web search plugin, and you can also create your own plugins and use them in Semantic Kernel. Kernel plugins greatly extend the capabilities of Semantic Kernel and can be used to perform tasks like web search, image search, text summarization, and more.
AutoGen.SemanticKernel provides a middleware called @AutoGen.SemanticKernel.KernelPluginMiddleware that allows you to use Semantic Kernel plugins in other AutoGen agents such as @AutoGen.OpenAI.OpenAIChatAgent. The following example shows how to define a simple plugin with a single GetWeather function and use it in @AutoGen.OpenAI.OpenAIChatAgent.
Note: You can find the complete sample code here.
Step 1: add using statements
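A sketch of the using statements this walkthrough relies on is shown below. The exact set depends on the package versions you reference; System.ComponentModel is only needed for the [Description] attributes used on the plugin function later.

```csharp
using System.ComponentModel;     // [Description] attributes on plugin functions
using AutoGen.Core;              // IAgent, middleware registration, SendAsync
using AutoGen.OpenAI;            // OpenAIChatAgent
using AutoGen.OpenAI.Extension;  // RegisterMessageConnector
using AutoGen.SemanticKernel;    // KernelPluginMiddleware
using Microsoft.SemanticKernel;  // Kernel, KernelFunction, KernelPluginFactory
using OpenAI;                    // OpenAIClient
```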
Step 2: create plugin
In this step, we create a simple plugin with a single GetWeather function that takes a location as input and returns the weather information for that location.
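A minimal sketch of such a plugin follows. The class name WeatherPlugin and the hard-coded reply are illustrative; any public method annotated with Semantic Kernel's [KernelFunction] attribute can be exposed this way.

```csharp
public class WeatherPlugin
{
    [KernelFunction]
    [Description("Get the current weather for a location.")]
    public string GetWeather(
        [Description("The location to get the weather for.")] string location)
        => $"The weather in {location} is 72 degrees and sunny."; // placeholder result
}
```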
Step 3: create OpenAIChatAgent and use the plugin
In this step, we first create a @AutoGen.SemanticKernel.KernelPluginMiddleware and register the plugin from the previous step with it. The KernelPluginMiddleware loads the plugin and makes its functions available to other agents. We then create an @AutoGen.OpenAI.OpenAIChatAgent and register it with the KernelPluginMiddleware.
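A sketch of this wiring is shown below, assuming the using statements from Step 1 and the WeatherPlugin class from Step 2. The model name, API key handling, and the exact OpenAIChatAgent and KernelPluginMiddleware constructor overloads are assumptions; adjust them to match the OpenAI SDK and AutoGen versions you use.

```csharp
// wrap the plugin class in a KernelPlugin and create a Kernel for the middleware
var kernel = Kernel.CreateBuilder().Build();
var plugin = KernelPluginFactory.CreateFromObject(new WeatherPlugin(), "weather");

// the middleware loads the plugin and surfaces its functions as tool calls
var kernelPluginMiddleware = new KernelPluginMiddleware(kernel, plugin);

// create the OpenAIChatAgent and register it with the middleware
var openAIClient = new OpenAIClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
var agent = new OpenAIChatAgent(
        chatClient: openAIClient.GetChatClient("gpt-4o-mini"), // assumed model name
        name: "assistant",
        systemMessage: "You are a helpful AI assistant.")
    .RegisterMessageConnector()                 // translate OpenAI messages to AutoGen message types
    .RegisterMiddleware(kernelPluginMiddleware) // make the plugin functions available to the agent
    .RegisterPrintMessage();                    // print replies to the console
```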
Step 4: chat with OpenAIChatAgent
In this final step, we start a chat with the @AutoGen.OpenAI.OpenAIChatAgent by asking about the weather in Seattle. The OpenAIChatAgent will call the GetWeather function from the plugin to get the weather information for Seattle.
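A sketch of the final call, assuming the agent variable from Step 3 and AutoGen's SendAsync extension method:

```csharp
// the agent is expected to invoke GetWeather("Seattle") and answer based on its result
var reply = await agent.SendAsync("What is the weather in Seattle?");
```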