AutoGen for .NET
Note
Nightly build is available at:
First, follow the installation guide to install the AutoGen packages.
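For example, assuming you consume the stable packages from nuget.org, adding AutoGen to an existing project is typically a single dotnet CLI command (dotnet add package AutoGen); the installation guide remains the authoritative reference for package names and the nightly feed.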
Then you can start with the following code snippet to create a conversable agent and chat with it.
using AutoGen;
using AutoGen.OpenAI;

var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");

var assistantAgent = new AssistantAgent(
    name: "assistant",
    systemMessage: "You are an assistant that helps the user to do some tasks.",
    llmConfig: new ConversableAgentConfig
    {
        Temperature = 0,
        ConfigList = [gpt35Config],
    })
    .RegisterPrintMessage(); // register a hook to print messages nicely to the console

// set human input mode to ALWAYS so that the user always provides input
var userProxyAgent = new UserProxyAgent(
    name: "user",
    humanInputMode: ConversableAgent.HumanInputMode.ALWAYS)
    .RegisterPrintMessage();

// start the conversation
await userProxyAgent.InitiateChatAsync(
    receiver: assistantAgent,
    message: "Hey assistant, please do me a favor.",
    maxRound: 10);
Samples
You can find more examples under the sample project.
Functionality
- ConversableAgent
  - function call
  - code execution (dotnet only, powered by dotnet-interactive)
- Agent communication
  - Two-agent chat
  - Group chat
- Enhanced LLM Inferences
- Exclusive for dotnet
  - Source generator for type-safe function definition generation (a sketch follows this list)
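To make the source-generator item concrete, here is a minimal sketch of a type-safe function definition. It assumes the AutoGen.SourceGenerator package is referenced and that functions follow the FunctionAttribute / Task&lt;string&gt; convention mentioned in the update log below; the WeatherFunctions class and GetWeather method are hypothetical names used only for illustration.

using System.Threading.Tasks;
using AutoGen.Core; // assumption: the Function attribute lives here after the 0.0.9 package split

// the containing class must be partial so the generator can emit the companion code
public partial class WeatherFunctions
{
    /// <summary>
    /// Get a weather report for a city.
    /// </summary>
    /// <param name="city">name of the city</param>
    [Function]
    public Task<string> GetWeather(string city)
    {
        return Task.FromResult($"The weather in {city} is sunny.");
    }
}

From a method like this, the generator derives the FunctionContract mentioned in the 0.0.8 notes (name, description and typed parameters taken from the signature and XML doc comment), so the function schema handed to the LLM stays in sync with the C# code.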
Update log
Update on 0.0.11 (2024-03-26)
- Add a link to the Discord channel in the nuget package's README.md
- Document improvements
Update on 0.0.10 (2024-03-12)
- Rename Workflow to Graph
- Rename AddInitializeMessage to SendIntroduction
- Rename SequentialGroupChat to RoundRobinGroupChat
Update on 0.0.9 (2024-03-02)
- Refactor over AutoGen.Message and introduce TextMessage, ImageMessage, MultiModalMessage and so on. PR #1676
- Add AutoGen.SemanticKernel to support seamless integration with Semantic Kernel
- Move the agent contract abstraction to the AutoGen.Core package. The AutoGen.Core package provides the abstractions for message types, agents and group chat, and doesn't take dependencies on Azure.AI.OpenAI or Semantic Kernel. This is useful when you want to leverage AutoGen's abstraction only and avoid introducing any other dependencies.
- Move GPTAgent, OpenAIChatAgent and all OpenAI dependencies to AutoGen.OpenAI
Update on 0.0.8 (2024-02-28)
- Fix #1804
- Streaming support for IAgent #1656
- Streaming support for middleware via MiddlewareStreamingAgent #1656
- Graph chat support with conditional transition workflow #1761
- AutoGen.SourceGenerator: Generate FunctionContract from FunctionAttribute #1736
Update on 0.0.7 (2024-02-11)
- Add AutoGen.LMStudio to support consuming OpenAI-like APIs from an LM Studio local server
Update on 0.0.6 (2024-01-23)
- Add MiddlewareAgent
- Use MiddlewareAgent to implement existing agent hooks (RegisterPreProcess, RegisterPostProcess, RegisterReply)
- Remove AutoReplyAgent, PreProcessAgent and PostProcessAgent because they are replaced by MiddlewareAgent
Update on 0.0.5
- Simplify the IAgent interface by removing the ChatLLM property
- Add GenerateReplyOptions to IAgent.GenerateReplyAsync, which allows the user to specify or override options when generating a reply
Update on 0.0.4
- Move out the dependency on Semantic Kernel
- Add type IChatLLM as a connector to LLMs
Update on 0.0.3
- In AutoGen.SourceGenerator, rename FunctionAttribution to FunctionAttribute
- In AutoGen, refactor over ConversationAgent, UserProxyAgent, and AssistantAgent
Update on 0.0.2
- update Azure.AI.OpenAI to 1.0.0-beta.12
- update Semantic Kernel to 1.0.1