mirror of
https://github.com/microsoft/autogen.git
synced 2025-08-09 01:02:39 +00:00

### AutoGen for .NET
[![Build status](https://github.com/microsoft/autogen/actions/workflows/dotnet-build.yml/badge.svg)](https://github.com/microsoft/autogen/actions/workflows/dotnet-build.yml)
[![NuGet version](https://badge.fury.io/nu/AutoGen.Core.svg)](https://badge.fury.io/nu/AutoGen.Core)

> [!NOTE]
> Nightly builds are available at:
> - GitHub package feed: https://nuget.pkg.github.com/microsoft/index.json
> - MyGet feed: https://www.myget.org/F/agentchat/api/v3/index.json
> - Azure DevOps feed: https://devdiv.pkgs.visualstudio.com/DevDiv/_packaging/AutoGen/nuget/v3/index.json

First, follow the [installation guide](./website/articles/Installation.md) to install the AutoGen packages.

Then you can start with the following code snippet to create a conversable agent and chat with it.

```csharp
using AutoGen;
using AutoGen.OpenAI;

var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");

var assistantAgent = new AssistantAgent(
    name: "assistant",
    systemMessage: "You are an assistant that help user to do some tasks.",
    llmConfig: new ConversableAgentConfig
    {
        Temperature = 0,
        ConfigList = [gpt35Config],
    })
    .RegisterPrintMessage(); // register a hook to print messages nicely to the console

// set human input mode to ALWAYS so that the user always provides input
var userProxyAgent = new UserProxyAgent(
    name: "user",
    humanInputMode: ConversableAgent.HumanInputMode.ALWAYS)
    .RegisterPrintMessage();

// start the conversation
await userProxyAgent.InitiateChatAsync(
    receiver: assistantAgent,
    message: "Hey assistant, please do me a favor.",
    maxRound: 10);
```

#### Samples

You can find more examples in the [sample project](https://github.com/microsoft/autogen/tree/dotnet/dotnet/sample/AutoGen.BasicSamples).

#### Functionality

- ConversableAgent
  - [x] function call
  - [x] code execution (dotnet only, powered by [`dotnet-interactive`](https://github.com/dotnet/interactive))

- Agent communication
  - [x] Two-agent chat
  - [x] Group chat

- [ ] Enhanced LLM Inferences

- Exclusive for dotnet
  - [x] Source generator for type-safe function definition generation

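As a sketch of what the type-safe function call workflow looks like: with the `AutoGen.SourceGenerator` package referenced, annotating a method in a partial class with `[Function]` generates a `FunctionContract` and a JSON-argument wrapper for it. The method name `GetWeather` below, and the generated member names shown in the comments, are illustrative assumptions; check the generated code and the type-safe function call article for the exact names in your version.

```csharp
using AutoGen.Core;

// With AutoGen.SourceGenerator referenced, [Function] on a method in a
// partial class generates a FunctionContract (schema for the LLM) plus a
// wrapper that parses the JSON argument payload and invokes the method.
public partial class WeatherTools
{
    /// <summary>
    /// Get the weather report for a city.
    /// </summary>
    /// <param name="city">city name</param>
    [Function]
    public async Task<string> GetWeather(string city)
    {
        return $"The weather in {city} is sunny.";
    }
}

// The generated artifacts can then be wired into a FunctionCallMiddleware
// (member names below are illustrative; inspect the generated code):
//
// var tools = new WeatherTools();
// var middleware = new FunctionCallMiddleware(
//     functions: [tools.GetWeatherFunctionContract],
//     functionMap: new Dictionary<string, Func<string, Task<string>>>
//     {
//         [nameof(tools.GetWeather)] = tools.GetWeatherWrapper,
//     });
```
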
#### Update log

##### Update on 0.0.11 (2024-03-26)

- Add a link to the Discord channel in the NuGet package readme
- Documentation improvements

##### Update on 0.0.10 (2024-03-12)

- Rename `Workflow` to `Graph`
- Rename `AddInitializeMessage` to `SendIntroduction`
- Rename `SequentialGroupChat` to `RoundRobinGroupChat`

##### Update on 0.0.9 (2024-03-02)

- Refactor over `AutoGen.Message`, introducing `TextMessage`, `ImageMessage`, `MultiModalMessage` and so on. PR [#1676](https://github.com/microsoft/autogen/pull/1676)
- Add `AutoGen.SemanticKernel` to support seamless integration with Semantic Kernel
- Move the agent contract abstraction to the `AutoGen.Core` package. `AutoGen.Core` provides the abstractions for message types, agents, and group chat, and has no dependency on `Azure.AI.OpenAI` or Semantic Kernel. This is useful when you want to use AutoGen's abstractions without introducing any other dependencies.
- Move `GPTAgent`, `OpenAIChatAgent` and all OpenAI-dependent code to `AutoGen.OpenAI`

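To illustrate what depending only on `AutoGen.Core` looks like, here is a minimal sketch of a custom agent implementing `IAgent`. The `EchoAgent` name is made up for illustration, and the `IAgent`/`TextMessage` shapes follow the 0.0.9-era API, so treat the exact signatures as assumptions.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using AutoGen.Core;

// A minimal custom agent that depends only on AutoGen.Core:
// it echoes the text of the last message it receives.
public class EchoAgent : IAgent
{
    public string Name => "echo";

    public Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        var lastText = messages.LastOrDefault() is TextMessage text
            ? text.Content
            : "(no text message received)";

        IMessage reply = new TextMessage(Role.Assistant, lastText, from: Name);
        return Task.FromResult(reply);
    }
}
```

Because `EchoAgent` only implements `IAgent`, it can participate in two-agent chats or group chats exactly like the OpenAI-backed agents, without pulling in `Azure.AI.OpenAI`.
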
##### Update on 0.0.8 (2024-02-28)

- Fix [#1804](https://github.com/microsoft/autogen/pull/1804)
- Streaming support for `IAgent` [#1656](https://github.com/microsoft/autogen/pull/1656)
- Streaming support for middleware via `MiddlewareStreamingAgent` [#1656](https://github.com/microsoft/autogen/pull/1656)
- Graph chat support with conditional transition workflow [#1761](https://github.com/microsoft/autogen/pull/1761)
- AutoGen.SourceGenerator: generate `FunctionContract` from `FunctionAttribute` [#1736](https://github.com/microsoft/autogen/pull/1736)

##### Update on 0.0.7 (2024-02-11)

- Add `AutoGen.LMStudio` to support consuming OpenAI-like APIs from the LM Studio local server

##### Update on 0.0.6 (2024-01-23)

- Add `MiddlewareAgent`
- Use `MiddlewareAgent` to implement the existing agent hooks (`RegisterPreProcess`, `RegisterPostProcess`, `RegisterReply`)
- Remove `AutoReplyAgent`, `PreProcessAgent`, and `PostProcessAgent`; they are replaced by `MiddlewareAgent`

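A minimal sketch of the middleware pattern, using the `RegisterMiddleware` extension that builds on `MiddlewareAgent`. The delegate signature shown follows the AutoGen.Net middleware documentation; if you are on a different version, treat the exact shape as an assumption.

```csharp
using System.Linq;
using AutoGen.Core;

// Wrap any IAgent with a middleware delegate that can inspect or rewrite
// the message list before delegating to the inner agent.
var agentWithMiddleware = innerAgent // innerAgent: any existing IAgent
    .RegisterMiddleware(async (messages, options, agent, ct) =>
    {
        // Example: prepend a system reminder before every call.
        var augmented = messages.Prepend(
            new TextMessage(Role.System, "Answer concisely."));

        return await agent.GenerateReplyAsync(augmented, options, ct);
    });
```

Hooks like pre-processing, post-processing, and auto-reply are all expressible as such delegates, which is why the dedicated agent types above could be removed.
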
##### Update on 0.0.5

- Simplify the `IAgent` interface by removing the `ChatLLM` property
- Add `GenerateReplyOptions` to `IAgent.GenerateReplyAsync`, which allows users to specify or override options when generating a reply

##### Update on 0.0.4

- Move the Semantic Kernel dependency out of the core package
- Add the `IChatLLM` type as a connector to LLMs

##### Update on 0.0.3

- In AutoGen.SourceGenerator, rename `FunctionAttribution` to `FunctionAttribute`
- In AutoGen, refactor over `ConversableAgent`, `UserProxyAgent`, and `AssistantAgent`

##### Update on 0.0.2

- Update `Azure.AI.OpenAI` to 1.0.0-beta.12
- Update Semantic Kernel to 1.0.1