Zhenyu 7c29704525
fix: ensure streaming chunks are immediately flushed to console (#6424)
Added `flush=True` to the `aprint` call when handling a
`ModelClientStreamingChunkEvent` message, so that each chunk is
displayed immediately as it arrives.


## Why are these changes needed?
When handling a `ModelClientStreamingChunkEvent` message, streaming chunks
were not guaranteed to be displayed immediately, because Python's stdout may
buffer output unless it is explicitly flushed. This could cause visible delays
between when `chunk_event` objects are added to the message queue and when
users actually see the content rendered in the console.
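
For illustration, here is a minimal sketch of the pattern (a simplified stand-in, not the actual `Console` implementation; the `aprint` helper below is an assumed async wrapper around `print`):

```python
import asyncio


async def aprint(output: str, end: str = "\n", flush: bool = False) -> None:
    # Run the blocking print call in a worker thread so the event loop is not stalled.
    await asyncio.to_thread(print, output, end=end, flush=flush)


async def handle_streaming_chunk(chunk_text: str) -> None:
    # Chunks are printed without a trailing newline, so stdout can keep them in
    # its buffer; flush=True forces each chunk to appear as soon as it arrives.
    await aprint(chunk_text, end="", flush=True)
```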

## Related issue number
None


## Checks

- [x] I've included any doc changes needed for
<https://microsoft.github.io/autogen/>. See
<https://github.com/microsoft/autogen/blob/main/CONTRIBUTING.md> to
build and test documentation locally.
- [x] I've added tests (if relevant) corresponding to the changes
introduced in this PR.
- [x] I've made sure all auto checks have passed.
2025-05-01 13:38:47 +00:00

# AutoGen AgentChat

AgentChat is a high-level API for building multi-agent applications. It is built on top of the autogen-core package. For beginner users, AgentChat is the recommended starting point. For advanced users, autogen-core's event-driven programming model provides more flexibility and control over the underlying components.

AgentChat provides intuitive defaults, such as Agents with preset behaviors and Teams with predefined multi-agent design patterns.
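
As a rough illustration of those defaults, a minimal AgentChat program might look like the sketch below (assumed API usage based on published examples; the model name and client are placeholders, so check the current documentation for exact details):

```python
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    # A single assistant agent backed by an OpenAI-compatible model client.
    model_client = OpenAIChatCompletionClient(model="gpt-4o")
    agent = AssistantAgent(
        "assistant",
        model_client=model_client,
        # Emit ModelClientStreamingChunkEvent messages as the model streams tokens.
        model_client_stream=True,
    )

    # Stream the agent's response to the console; streamed chunks are
    # printed (and flushed) as they arrive.
    await Console(agent.run_stream(task="Say hello in one sentence."))


asyncio.run(main())
```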