Installation

Current version: (NuGet version badge)

AutoGen.Net provides the following packages; you can install one or more of them depending on your needs:

  • AutoGen: The all-in-one package. It depends on AutoGen.Core, AutoGen.OpenAI, AutoGen.LMStudio, AutoGen.SemanticKernel and AutoGen.SourceGenerator.
  • AutoGen.Core: The core package. It provides the abstractions for message types, agents and group chat.
  • AutoGen.OpenAI: Provides agents that integrate with OpenAI models.
  • AutoGen.Mistral: Provides agents that integrate with Mistral.AI models.
  • AutoGen.LMStudio: Provides agents that integrate with LM Studio.
  • AutoGen.SemanticKernel: Provides agents that integrate with Semantic Kernel.
  • AutoGen.SourceGenerator: Carries a source generator that adds support for type-safe function definition generation.
  • AutoGen.DotnetInteractive: Carries dotnet-interactive support for executing dotnet code snippets.

Note

Help me choose

  • If you just want to install one package and enjoy the core features of AutoGen, choose AutoGen.
  • If you want to use only AutoGen's abstractions and avoid introducing other dependencies such as Azure.AI.OpenAI or Semantic Kernel, choose AutoGen.Core. You will need to implement your own agent, but you can still use core features such as group chat, the built-in message types, workflow and middleware.
  • If you want to use AutoGen with OpenAI, choose AutoGen.OpenAI; similarly, choose AutoGen.LMStudio or AutoGen.SemanticKernel if you want to use agents backed by LM Studio or Semantic Kernel.
  • If you only want the type-safe source generation for function calls and nothing else, not even AutoGen's abstractions, choose AutoGen.SourceGenerator.

Then, install the package using the following command:

dotnet add package AUTOGEN_PACKAGES
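
For example, to install the all-in-one AutoGen package (any of the package names listed above can be substituted):

dotnet add package AutoGen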

Consume nightly build

To consume nightly builds, add one of the following feeds to your NuGet.config or to your global NuGet config:

To add a local NuGet.config, create a file named NuGet.config in the root of your project and add the following content:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <!-- dotnet-tools contains Microsoft.DotNet.Interactive.VisualStudio package, which is used by AutoGen.DotnetInteractive -->
    <add key="dotnet-tools" value="https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json" />
    <add key="AutoGen" value="$(FEED_URL)" /> <!-- replace $(FEED_URL) with the feed url -->
    <!-- other feeds -->
  </packageSources>
  <disabledPackageSources />
</configuration>

To add the feed to your global NuGet config instead, run the following commands in your terminal:

dotnet nuget add source FEED_URL --name AutoGen

# dotnet-tools contains Microsoft.DotNet.Interactive.VisualStudio package, which is used by AutoGen.DotnetInteractive
dotnet nuget add source https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json --name dotnet-tools
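
To confirm that the feeds were registered, you can list the configured sources:

dotnet nuget list source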

Once you have added the feed, you can install the nightly-build package using the following command:

dotnet add package AUTOGEN_PACKAGES --version VERSION
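
For example (the version string below is only illustrative; check the feed for an actual nightly version number):

dotnet add package AutoGen --version 0.0.1-nightly # replace with a real nightly version from the feed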