autogen/dotnet/samples/AutoGen.BasicSamples/Example10_SemanticKernel.cs

// Copyright (c) Microsoft Corporation. All rights reserved.
// Example10_SemanticKernel.cs

using System.ComponentModel;
using AutoGen.Core;
using AutoGen.SemanticKernel.Extension;
using FluentAssertions;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

namespace AutoGen.BasicSample;

public class LightPlugin
{
    public bool IsOn { get; set; }
    [KernelFunction]
    [Description("Gets the state of the light.")]
    public string GetState() => this.IsOn ? "on" : "off";

    [KernelFunction]
    [Description("Changes the state of the light.")]
    public string ChangeState(bool newState)
    {
        this.IsOn = newState;
        var state = this.GetState();

        // Print the new state to the console
        Console.ForegroundColor = ConsoleColor.DarkBlue;
        Console.WriteLine($"[Light is now {state}]");
        Console.ResetColor();

        return state;
    }
}

public class Example10_SemanticKernel
{
    public static async Task RunAsync()
    {
        var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
        var modelId = "gpt-4o-mini";
        var builder = Kernel.CreateBuilder()
            .AddOpenAIChatCompletion(modelId: modelId, apiKey: openAIKey);
        var kernel = builder.Build();
        var settings = new OpenAIPromptExecutionSettings
        {
            ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions,
        };

        kernel.Plugins.AddFromObject(new LightPlugin());
        var skAgent = kernel
            .ToSemanticKernelAgent(name: "assistant", systemMessage: "You control the light", settings);

        // Send a message to the skAgent; it supports the following message types:
        // - IMessage<ChatMessageContent>
        // - (streaming) IMessage<StreamingChatMessageContent>
        // You can create an IMessage<ChatMessageContent> using MessageEnvelope.Create
        var chatMessageContent = MessageEnvelope.Create(new ChatMessageContent(AuthorRole.User, "Toggle the light"));
        var reply = await skAgent.SendAsync(chatMessageContent);

        reply.Should().BeOfType<MessageEnvelope<ChatMessageContent>>();
        Console.WriteLine((reply as IMessage<ChatMessageContent>).Content.Items[0].As<TextContent>().Text);
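
        // A hedged sketch (not part of the original sample): the streaming path mentioned above
        // is never demonstrated here. Assuming SemanticKernelAgent implements IStreamingAgent and
        // yields IMessage<StreamingChatMessageContent> chunks, a minimal streaming loop might look like:
        await foreach (var chunk in skAgent.GenerateStreamingReplyAsync(new[] { chatMessageContent }))
        {
            if (chunk is IMessage<StreamingChatMessageContent> streamingChunk)
            {
                // Write each partial text chunk as it arrives
                Console.Write(streamingChunk.Content.Content);
            }
        }
        Console.WriteLine();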

        var skAgentWithMiddleware = skAgent
            .RegisterMessageConnector() // Register the message connector to support more AutoGen built-in message types
            .RegisterPrintMessage();

        // Now skAgentWithMiddleware supports more IMessage types, such as TextMessage, ImageMessage, and MultiModalMessage.
        // It also registers a print-message hook that prints each message to the console in a human-readable format.
        await skAgent.SendAsync(chatMessageContent);
        await skAgentWithMiddleware.SendAsync(new TextMessage(Role.User, "Toggle the light"));

        // The more message types an agent supports, the more flexibly it can be used in different scenarios.
        // For example, since TextMessage is supported, skAgentWithMiddleware can be used with a user proxy agent.
        var userProxy = new UserProxyAgent("user");
        await skAgentWithMiddleware.InitiateChatAsync(userProxy, "how can I help you today");
    }
}