Liu An ad1f89fea0
Fix: chat module update LLM defaults (#8125)
### What problem does this PR solve?

Previously, when `LLM.model_name` was not configured:
- The system incorrectly defaulted to the `deepseek-chat` model
- This caused permission errors for tenants not authorized to use that model

Now:
- The tenant's default `chat_model` configuration is used first (see the sketch below)
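
For context, here is a minimal sketch of the intended fallback order. The names (`TenantConfig`, `resolve_chat_model`, `chat_model`) are illustrative placeholders, not RAGFlow's actual API.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TenantConfig:
    # The tenant's own default chat model, configured per tenant.
    chat_model: Optional[str] = None


def resolve_chat_model(requested_model: Optional[str], tenant: TenantConfig) -> str:
    """Pick the model for a chat request.

    Prefer an explicitly requested LLM.model_name; otherwise fall back to
    the tenant's default chat_model. A hard-coded 'deepseek-chat' fallback
    (the old behavior) is avoided, because the tenant may not be authorized
    to use that model.
    """
    if requested_model:
        return requested_model
    if tenant.chat_model:
        return tenant.chat_model
    raise ValueError("No chat model configured for this tenant")
```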

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)

### ragflow-sdk

Build and publish the Python SDK to pypi.org:

```bash
# Build the wheel and sdist into dist/
uv build

# Install the upload tool
uv pip install twine

# Authenticate with a PyPI API token
export TWINE_USERNAME="__token__"
export TWINE_PASSWORD=$YOUR_PYPI_API_TOKEN

# Upload the built wheel
twine upload dist/*.whl
```
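
Once the upload succeeds, the SDK should be installable from PyPI with `pip install ragflow-sdk` (assuming the published distribution name matches the project name).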