
# Core ChainLit Integration Sample
In this sample, we demonstrate how to build a simple chat interface that interacts with a Core agent or a team of agents using Chainlit, with support for streaming messages.
## Overview

The `core_chainlit` sample is designed to illustrate a simple use case of ChainLit integrated with a single-threaded agent runtime. It includes the following components:
- Single Agent: A single agent that operates within the ChainLit environment.
- Group Chat: A group chat setup featuring two agents:
  - Assistant Agent: This agent responds to user inputs.
  - Critic Agent: This agent reflects on and critiques the responses from the Assistant Agent.
- Closure Agent: Utilizes a closure agent to aggregate output messages into an output queue (see the sketch after this list).
- Token Streaming: Demonstrates how to stream tokens to the user interface.
- Session Management: Manages the runtime and output queue within the ChainLit user session.
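
The closure agent is the piece that connects the runtime to the UI: instead of defining a full agent class just to collect results, a plain async function is registered and pushes whatever it receives into an `asyncio.Queue` that the Chainlit handlers can drain. Below is a minimal, self-contained sketch of that pattern using the `autogen_core` closure-agent API; the `FinalResult` message type, the `output_collector` agent type name, and the topic wiring are illustrative assumptions rather than the sample's actual code.

```python
import asyncio
from dataclasses import dataclass

from autogen_core import (
    ClosureAgent,
    ClosureContext,
    MessageContext,
    SingleThreadedAgentRuntime,
    TypeSubscription,
)


@dataclass
class FinalResult:
    """Hypothetical message type carrying a completed response for the UI."""
    content: str


async def main() -> None:
    # The queue that the UI layer (e.g. a Chainlit handler) will read from.
    output_queue: asyncio.Queue[FinalResult] = asyncio.Queue()

    # A closure instead of an agent class: every FinalResult published to the
    # subscribed topic is simply placed on the queue.
    async def collect_result(
        _agent: ClosureContext, message: FinalResult, _ctx: MessageContext
    ) -> None:
        await output_queue.put(message)

    runtime = SingleThreadedAgentRuntime()
    await ClosureAgent.register_closure(
        runtime,
        "output_collector",  # illustrative agent type name
        collect_result,
        subscriptions=lambda: [
            TypeSubscription(topic_type="default", agent_type="output_collector")
        ],
    )
    runtime.start()
    # ... other agents would publish FinalResult messages to the "default"
    # topic here; the UI then awaits output_queue.get() to display them ...
    await runtime.stop_when_idle()


if __name__ == "__main__":
    asyncio.run(main())
```

Keeping the queue (and the runtime itself) in the ChainLit user session is the session-management idea in the list above: each chat session gets its own runtime and its own output queue.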
## Requirements

To run this sample, you will need:

- Python 3.8 or higher
- Installation of the necessary Python packages, as listed in `requirements.txt`
## Installation

To run this sample, you will need to install the following packages:

```bash
pip install -U chainlit autogen-core autogen-ext[openai] pyyaml
```
To use other model providers, you will need to install a different extra for the `autogen-ext` package. See the Models documentation for more information.
## Model Configuration

Create a configuration file named `model_config.yaml` to configure the model you want to use. Use `model_config_template.yaml` as a template.
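
The application code typically reads this file with PyYAML and turns it into a model client through the declarative component loader. The snippet below is a minimal sketch of that loading step, assuming the sample follows this common pattern; the exact keys inside `model_config.yaml` (which provider, which model, credentials) should be taken from `model_config_template.yaml`.

```python
import yaml

from autogen_core.models import ChatCompletionClient

# Read the declarative model configuration created from model_config_template.yaml.
with open("model_config.yaml", "r") as f:
    model_config = yaml.safe_load(f)

# Build a chat completion client (e.g. an OpenAI-backed one) from the config.
model_client = ChatCompletionClient.load_component(model_config)
```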
## Running the Agent Sample

The first sample demonstrates how to interact with a single AssistantAgent from the chat interface. Note: make sure to `cd` into the sample directory before running the command.

```bash
chainlit run app_agent.py
```
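
If you want a feel for how the pieces fit together before opening the sample code, the sketch below shows one plausible shape of such a Chainlit entry point: a per-session runtime is created in `on_chat_start`, stored in the Chainlit user session, and then used to send each incoming chat message to a registered Core agent. The `EchoAgent` and `ChatMessage` dataclass are stand-ins for the sample's model-backed assistant, not its actual code.

```python
from dataclasses import dataclass

import chainlit as cl
from autogen_core import (
    AgentId,
    MessageContext,
    RoutedAgent,
    SingleThreadedAgentRuntime,
    message_handler,
)


@dataclass
class ChatMessage:
    """Illustrative message type exchanged with the agent."""
    content: str


class EchoAgent(RoutedAgent):
    """Stand-in for the sample's model-backed assistant agent."""

    def __init__(self) -> None:
        super().__init__("A trivial agent that echoes the user's message.")

    @message_handler
    async def handle_chat_message(self, message: ChatMessage, ctx: MessageContext) -> ChatMessage:
        return ChatMessage(content=f"You said: {message.content}")


@cl.on_chat_start
async def start_chat() -> None:
    # One runtime per Chainlit session, kept in the user session.
    runtime = SingleThreadedAgentRuntime()
    await EchoAgent.register(runtime, "assistant", lambda: EchoAgent())
    runtime.start()
    cl.user_session.set("runtime", runtime)


@cl.on_message
async def handle_message(message: cl.Message) -> None:
    runtime: SingleThreadedAgentRuntime = cl.user_session.get("runtime")
    # Direct (request/response) messaging to the registered agent.
    reply = await runtime.send_message(
        ChatMessage(content=message.content), AgentId("assistant", "default")
    )
    await cl.Message(content=reply.content).send()
```

Running `chainlit run` on a file shaped like this serves the chat UI; the real `app_agent.py` additionally wires in the model client and the token streaming described in the overview.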
## Running the Team Sample

The second sample demonstrates how to interact with a team of agents from the chat interface.

```bash
chainlit run app_team.py -h
```

There are two agents in the team: one is instructed to be generally helpful, and the other is instructed to act as a critic and provide feedback.
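
With the Core API, a group chat like this is typically wired up with topics and subscriptions rather than a prebuilt team class: the helpful agent publishes a draft to a shared topic, the critic subscribes to drafts and publishes feedback, and the helpful agent revises in response. The sketch below illustrates that publish/subscribe loop with fixed strings instead of model calls; the message types, agent names, and one-round termination rule are illustrative assumptions, not the sample's implementation.

```python
import asyncio
from dataclasses import dataclass

from autogen_core import (
    DefaultTopicId,
    MessageContext,
    RoutedAgent,
    SingleThreadedAgentRuntime,
    default_subscription,
    message_handler,
)


@dataclass
class UserTask:
    content: str


@dataclass
class Draft:
    content: str


@dataclass
class Feedback:
    content: str


@default_subscription
class HelpfulAgent(RoutedAgent):
    """Produces a draft answer and revises it when the critic responds."""

    def __init__(self) -> None:
        super().__init__("A generally helpful agent.")

    @message_handler
    async def on_task(self, message: UserTask, ctx: MessageContext) -> None:
        await self.publish_message(
            Draft(content=f"Draft answer to: {message.content}"), DefaultTopicId()
        )

    @message_handler
    async def on_feedback(self, message: Feedback, ctx: MessageContext) -> None:
        await self.publish_message(Draft(content="Revised answer."), DefaultTopicId())


@default_subscription
class CriticAgent(RoutedAgent):
    """Reviews drafts; gives feedback once, then stays silent so the run ends."""

    def __init__(self) -> None:
        super().__init__("A critic agent.")
        self._reviews = 0

    @message_handler
    async def on_draft(self, message: Draft, ctx: MessageContext) -> None:
        self._reviews += 1
        if self._reviews == 1:
            await self.publish_message(
                Feedback(content="Please be more specific."), DefaultTopicId()
            )
        # After one round the critic publishes nothing, so the runtime goes idle.


async def main() -> None:
    runtime = SingleThreadedAgentRuntime()
    await HelpfulAgent.register(runtime, "helpful", lambda: HelpfulAgent())
    await CriticAgent.register(runtime, "critic", lambda: CriticAgent())
    runtime.start()
    await runtime.publish_message(UserTask(content="Explain closure agents."), DefaultTopicId())
    await runtime.stop_when_idle()


if __name__ == "__main__":
    asyncio.run(main())
```

In the actual `app_team.py`, the drafts and feedback come from model calls, and the results reach the Chainlit UI through the closure agent and output queue described in the overview.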